Apple is proud enough of the privacy protections built into Apple Intelligence that it is offering substantial financial rewards to anyone who can find privacy flaws or potential attack vectors in its code. The company's first bug bounty program for its AI features pays $50,000 for uncovering any unintended data disclosure, with a top prize of $1 million for a successful remote attack on its cloud processing system.
The company first introduced Private Cloud Compute in June, alongside a demonstration of new AI features coming to iOS and iPadOS, and eventually macOS. A key upgrade is Siri's ability to work across apps: it can, for example, pull a reminder about a cousin's birthday from a text message, gather additional details from email, and then create a calendar event. Requests like these are processed on Apple's own cloud servers, which means sending off-device the kind of data many users would prefer to keep private.
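For a sense of what that cross-app plumbing might look like at the application layer, here is a minimal Swift sketch using Apple's EventKit framework. The birthday details are hard-coded placeholders standing in for data Siri would extract from Messages and Mail, and nothing here reflects Apple's actual Siri implementation:

```swift
import EventKit

// Placeholder values standing in for details Siri might extract
// from a text message and an email (hypothetical example data).
let title = "Cousin's birthday dinner"
let start = DateComponents(calendar: .current, year: 2025, month: 3,
                           day: 14, hour: 19).date!
let end = start.addingTimeInterval(2 * 60 * 60) // two hours

let store = EKEventStore()

// Calendar access sits behind an explicit user permission prompt
// (a real app also needs a calendar-usage string in its Info.plist).
store.requestFullAccessToEvents { granted, error in
    guard granted, error == nil else { return }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = start
    event.endDate = end
    event.calendar = store.defaultCalendarForNewEvents

    do {
        try store.save(event, span: .thisEvent)
        print("Event created: \(event.title ?? "")")
    } catch {
        print("Failed to save event: \(error)")
    }
}
```

That permission gate mirrors the opt-in pattern Apple applies throughout Apple Intelligence: nothing leaves the user's control without explicit consent.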
To uphold its reputation for stringent privacy standards, Apple says Private Cloud Compute layers software and hardware protections on top of that processing, so that user data stays secure and is never stored in a retrievable form.
In a recent blog post, Apple's security team invited all security researchers, along with anyone with technical curiosity, to independently verify those privacy claims. Apple has previously allowed third-party audits, but this is the first public opportunity to scrutinize the system. The company provides a security guide and a Virtual Research Environment for evaluating Private Cloud Compute within the macOS Sequoia 15.1 developer preview; running it requires a Mac with an M-series chip and at least 16 GB of RAM. Apple has also published the Private Cloud Compute source code in a GitHub repository.
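As a quick sanity check before installing the developer preview, a small Swift snippet can test whether a Mac meets the stated hardware bar (Apple silicon and at least 16 GB of RAM). This is an illustration only, not part of Apple's tooling; the actual Virtual Research Environment setup follows the steps in Apple's security guide:

```swift
import Foundation

// Does this Mac run on Apple silicon (an M-series chip)?
// "hw.optional.arm64" reports 1 on Apple silicon and is absent
// (leaving the default 0) on Intel Macs.
var isARM64: Int32 = 0
var size = MemoryLayout<Int32>.size
sysctlbyname("hw.optional.arm64", &isARM64, &size, nil, 0)

// Physical memory in bytes; the research environment wants >= 16 GB.
let ramGB = Double(ProcessInfo.processInfo.physicalMemory) / 1_073_741_824

let meetsRequirements = isARM64 == 1 && ramGB >= 16
print("Apple silicon: \(isARM64 == 1), RAM: \(Int(ramGB)) GB")
print(meetsRequirements
      ? "This Mac meets the stated requirements."
      : "This Mac does not meet the stated requirements.")
```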
Apple's bounties for bugs and security vulnerabilities are tiered by severity. Discovering accidental data disclosure pays $50,000; findings that expose user request data or other sensitive user information pay up to $250,000; and the top award of $1 million is reserved for demonstrating arbitrary code execution with arbitrary entitlements.
The initiative signals Apple's confidence in the system while opening its cloud processes to broader scrutiny. iOS 18.1 is scheduled to roll out on October 28, and a beta of iOS 18.2 is already available with ChatGPT integration. Users must grant permission before any request is shared with ChatGPT or before the chatbot works with Siri; OpenAI's model is a stopgap until Apple's own AI is fully built out.
Apple points to its strong privacy record, even though the company has been known to track users within its own apps. With Private Cloud Compute, Apple insists it has no ability to review logs of requests made to Siri, a claim that researchers digging into the source code can now try to verify. The upgraded Siri itself isn't expected until sometime in 2025.