Apple Dares Anyone to Find a Problem With Its Darling AI, Offers $1 Million Bounty

Apple is very proud of the privacy apparatus surrounding Apple Intelligence, so proud that it’s offering princely sums to anyone who finds a privacy issue or attack vector in its code. Apple’s first bug bounty program for its AI starts at a hefty $50,000 for anybody who finds an accidental data disclosure, but the real prize is $1 million for a remote attack on Apple’s newfangled cloud processing.

Apple first announced its Private Cloud Compute (PCC) back in June, at the same time it detailed all the new AI features coming to iOS, iPadOS, and, eventually, macOS. The most important aspect of Apple’s AI is a reinvigorated Siri that’s capable of working across apps. As presented, Siri could go into your texts to pull up some information about a cousin’s birthday your mom sent you, then pull extra information from your emails to make a calendar event. Handling requests like that also means processing your data through Apple’s internal cloud servers, leaving Apple, in turn, managing a treasure trove of user data that most people would want kept private.

To keep up its reputation as a stickler for privacy, Apple says that Private Cloud Compute adds an extra layer of both software and hardware security. Simply put, Apple claims your data will be secure, and that it won’t (and can’t) retain it.

Which brings us to the security bounty program. In a Thursday blog post, Apple’s security team said it’s inviting “all security researchers—or anyone with interest and a technical curiosity… [to] perform their own independent verification of our claims.”

So far, Apple said it has allowed third-party auditors inside to root around, but this is the first time it’s opening things up to the public. The company supplies a security guide and access to a virtual research environment for analyzing PCC inside the macOS Sequoia 15.1 developer preview. You’ll need a Mac with an M-series chip and at least 16 GB of RAM to access it. The Cupertino company is also supplying the cloud compute source code in a GitHub repository.

Beyond calling all hackers and script kiddies to the table, Apple is offering a wide range of payouts for any bugs or security issues. The base $50,000 is only for “accidental or unexpected data disclosure,” but you could get a sweet $250,000 for “access to users’ request data or sensitive information about the users’ request.” The top $1 million bounty is for “arbitrary code execution with arbitrary entitlements.”

The bounty program is indicative of how confident Apple is in this system, but at the very least, the open invite could let more people go under the hood of Apple’s cloud processes. The initial rollout of iOS 18.1 is set to hit iPhones on Oct. 28. There’s already a beta for iOS 18.2, which gives users access to the ChatGPT integration. Apple requires users to grant ChatGPT permission before it can see any of their requests or interact with Siri. OpenAI’s chatbot is merely a stopgap until Apple has a chance to get its own AI fully in place.

Apple touts its strong track record on privacy issues, though it has a penchant for tracking users within its own software ecosystems. In PCC’s case, Apple claims it won’t have any ability to check your logs or requests to Siri. Perhaps anybody digging through the source code can fact-check the tech giant on its privacy claims before Siri finally gets her upgrade, likely sometime in 2025.
