On the last day of Black Hat 2019, I attended an interesting session where Apple provided a peek behind the curtain on macOS and iOS security, and finally announced an expansion of its bug bounty program along with a new iOS Security Research Device program.
Bug bounties and responsible disclosure have become such an integral part of the security world these days that Apple was doing itself and its customers a disservice by previously offering only a very limited version.
So, Apple explained how its bug bounty program will work going forward and what its iOS Security Research Device program is, and offered a deep dive into macOS and iOS security updates. Ivan Krstic, Apple’s head of security engineering and architecture, went over how Apple ensures macOS secure boot remains secure, how iOS 13 Find My data stays private, and how iOS Kernel Integrity Protection continues to evolve.
macOS secure boot
Ivan reviewed the macOS secure boot process initiated by the T2 Security Chip, which makes sure users know their Mac is in a trusted state. He focused on how Apple works to prevent malicious software from executing during startup via PCIe Bus 0, which initializes early in the boot process and is therefore vulnerable.
Apple has continually worked to make VT-d (Intel Virtualization Technology for Directed I/O) run earlier and earlier in the boot-up process to protect against malicious use of Thunderbolt and PCIe direct memory access (DMA). VT-d helps restrict DMA on the Mac to prevent malicious code from gaining kernel access. The T2 Security Chip verifies the UEFI firmware, which has two states: pre-RAM and post-RAM. Pre-RAM is when PCIe Bus 0 initializes, leaving the system vulnerable, so Apple now has VT-d use page tables that “deny all” connections during this period. (Ivan noted that the page tables are literally just a bunch of zeroes.)
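To see why an all-zero page table amounts to “deny all,” consider a toy model of an IOMMU lookup. This is a conceptual sketch, not Apple’s or Intel’s actual implementation: the `PRESENT` bit position and the 4 KiB page granularity are assumptions borrowed from typical x86 paging, and `translate` is a hypothetical helper.

```python
PRESENT = 0x1  # assumed: low bit marks a valid/present entry, as in x86-style paging


def translate(page_table, io_addr):
    """Toy IOMMU lookup: map a device's DMA address through a page table."""
    entry = page_table[(io_addr >> 12) % len(page_table)]  # 4 KiB pages
    if not (entry & PRESENT):
        # No valid mapping: the device's memory access is blocked.
        raise PermissionError("DMA denied: no valid mapping for this address")
    return (entry & ~0xFFF) | (io_addr & 0xFFF)


# A "deny all" table is literally just zeroes: no entry has its present bit
# set, so every DMA translation attempt from a device fails.
deny_all_table = [0] * 512
```

Since a zeroed entry can never have a valid bit set, a device on PCIe Bus 0 cannot read or write any system memory until the OS later installs real mappings.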
Another macOS security change Ivan discussed was the Option ROM (OROM) Sandbox, which Apple designed to help defend against EFI exploits. Apple moved OROMs into separate virtual memory spaces, limiting each to communicating only with the devices inside its own sandbox. To prevent privilege escalation via malicious code, all OROMs now operate in Ring 3 instead of Ring 0.
iOS security updates
Ivan provided a look at some iOS security features, such as the upcoming Find My in iOS 13 and the iOS Kernel Integrity Protection.
iOS Find My
This new feature was announced as part of the upcoming iOS 13 release and combines Find My iPhone and Find My Friends into one app. The app helps users track down their lost devices using Bluetooth and location data collected from other Apple devices (the slide said “participating strangers,” so presumably users have to opt in).
It seemed to me a little questionable from a privacy standpoint when originally announced at WWDC, so thankfully, Ivan outlined how Apple keeps user data private and secure. First and foremost, location reports and finder identities are not accessible to Apple servers and cannot be read or modified. “What’s amazing is that this whole interaction is end-to-end encrypted and anonymous,” Craig Federighi told WWDC attendees.
The reported location is encrypted via iCloud Keychain, and only the owner can track a lost device, which requires a second Apple device. The two devices generate a keypair whose private half stays on them, so only the second device can decrypt location reports. The lost device broadcasts over Bluetooth to nearby Apple devices, which report its location along with a timestamp that is valid for 15 minutes. The second device queries the database for location reports, and the database returns an encrypted response that only that device can decrypt.
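The data flow can be sketched as follows. This is a heavily simplified model, not Apple’s protocol: the real scheme uses an asymmetric keypair (so a finder who sees the Bluetooth broadcast can encrypt a report but cannot decrypt anyone else’s), whereas this toy substitutes a shared secret and an XOR stream cipher purely to show who holds what. All names and the cipher are illustrative inventions.

```python
import hashlib
import json
import secrets
import time


def keystream(key: bytes, n: int) -> bytes:
    """Expand a key into n bytes via SHA-256 in counter mode (toy cipher only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]


def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))


# 1. The owner's devices share a secret; a key derived from it stands in for
#    the public key the lost device would broadcast over Bluetooth.
owner_secret = secrets.token_bytes(32)
beacon_key = hashlib.sha256(owner_secret).digest()

# 2. A nearby finder device sees the broadcast and encrypts its own location
#    plus a timestamp (valid for 15 minutes) under the broadcast key.
report = json.dumps({"lat": 37.33, "lon": -122.01, "ts": time.time()}).encode()
ciphertext = xor_crypt(beacon_key, report)

# 3. The server stores only the ciphertext, indexed by a hash of the beacon
#    key; it never sees the key itself, so it cannot read the location.
server_db = {hashlib.sha256(beacon_key).hexdigest(): ciphertext}

# 4. The owner's second device re-derives the beacon key from the shared
#    secret, queries the server, and decrypts the location report.
key = hashlib.sha256(owner_secret).digest()
recovered = json.loads(xor_crypt(key, server_db[hashlib.sha256(key).hexdigest()]))
```

The point of the sketch is the trust boundary: the server only ever handles an opaque index and ciphertext, so “end-to-end encrypted and anonymous” means decryption is possible only on the owner’s own hardware.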
Kernel Integrity Protection
Apple wants to maintain the integrity of kernel code and read-only data after secure boot. The initial version of Kernel Integrity Protection shipped in iOS 9, and from it engineers learned two lessons: critical data (e.g., page tables and sandbox configuration) must be protected too, and integrity verification was vulnerable to race conditions. Apple found it easier to adjust the hardware architecture to improve security, adding features like read-only data integrity and prevention of kernel memory modification.
Bug bounty and iOS Security Research Devices
Apple made a couple of announcements at the very end of the talk, officially confirming that it is expanding its previously limited bug bounty program and unveiling an iOS Security Research Device program. While the announcements were met with applause, the reaction was likely muted by Forbes having reported this exact news days ahead of the presentation.
Bug bounty program
While many tech companies have offered bug bounties for years, Apple dragged its feet. It wasn’t until 2016 that Apple even started a bug bounty program, and even then, it was only for iOS and iCloud.
Well, all of that has changed: Apple will now extend its bug bounty program to cover all of its operating systems, including macOS (kind of a shock to learn there was only an iOS and iCloud bounty before). Previously open only to invited researchers, the program now accepts vulnerability submissions from anyone. Payouts depend on the category of vulnerability, with maximums ranging from $100,000 for unauthorized access to iCloud account data on Apple servers up to $1 million for a full-chain kernel code execution attack. Additionally, bugs found on “designated pre-release builds” are eligible for a 50% bonus.
iOS Security Research Device program
Apple will provide unlocked devices to a select group of security researchers. Forbes described these iPhones as “lite” versions, meaning they are less open than the ones Apple’s own security team uses. The devices do come with SSH, a root shell, and advanced debug capabilities.
A new Apple?
Most of the attention from the Black Hat session was on the bug bounty and research device programs, and it’s not hard to see why. Apple has long been notoriously closed off, preferring to rely on its internal developers when it comes to handling security vulnerabilities. But bug bounties have become integral to the security industry, offering researchers a way to legitimately earn money by reporting a vulnerability rather than publishing it elsewhere for someone to take advantage of. Additionally, the device program should make it easier for researchers to completely understand an iOS device. Basically, it’s nice to see some outward gesture toward third-party researchers, even if it took longer than it should have.