In a recent ZDNET article, my friend and colleague David Gewirtz explains why he considers the upcoming iPhone 16, with its focus on iOS 18 and Apple Intelligence, an essential upgrade.
While I value David’s perspective, I beg to differ.
Also: 6 reasons why iOS 18 makes the iPhone 16 a must-upgrade for me
David argues that the incorporation of artificial intelligence (AI) in iOS 18 makes the iPhone 16 a necessary upgrade for him, emphasizing the potential of Apple Intelligence to revolutionize our interaction with our devices. While I agree with his view in the long term, I’m not convinced that the first version of Apple Intelligence will represent the big leap forward in usability that so many people are anticipating.
Every year, my wife and I eagerly await the release of the new iPhone. Being part of Apple’s Upgrade Program, we return our devices, reset our loan with Citizens Bank, and acquire the latest model. Over the past few years, I have opted for the Pro Max, and my wife has chosen the base model. The year-over-year improvements have been incremental but appreciated.
However, despite the buzz around the iPhone 16’s new features and the integration of Apple Intelligence, several concerns dampen my enthusiasm for upgrading.
What they aren’t telling us about Apple Intelligence
Apple Intelligence represents a significant leap in on-device AI capabilities, directly bringing advanced machine learning and natural language processing to our phones. However, this technology is still in its infancy. On-device LLMs and generative AI are essentially in an alpha or beta phase, and there’s a lot of uncertainty about how well they will perform on current Apple mobile hardware.
David views the integration of AI in iOS 18 as a significant leap forward. But let’s not kid ourselves. These on-device AI features are in their infancy, which means they might not deliver the seamless experience that Apple users have come to expect. When Apple Intelligence is launched to the public in the fall of 2024, it will still be considered beta, not a finished product.
It must be noted that Apple Intelligence is not simply another routine iOS or even MacOS feature upgrade. The device will load a downsized version of Apple’s Foundation Models, a home-grown large language model (LLM) that will be several gigabytes in size and have as many as 3 billion parameters. (Compare that to the hundreds of billions of parameters used by models like GPT-3.5 and GPT-4, or to what Apple will run in its data centers for Apple Intelligence’s “Private Cloud Compute” feature.)
Also: Apple Intelligence will improve Siri in 2024, but don’t expect most updates until 2025
How this will work on iOS, iPadOS, and MacOS has not yet been fully detailed to developers, but it will have to be loaded — at least partially — in memory, potentially occupying between 750MB and 2GB of RAM when running, according to current estimates, depending on how good Apple’s memory compression technology is and other factors.
That’s a substantial amount of memory allocated to a core OS function that won’t always be used. As a result, parts of it will have to be dynamically loaded in and out of memory as needed, adding new system constraints for applications and potentially putting additional stress on the CPU.
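For a sense of scale, here is a rough back-of-envelope sketch of how those figures could fit together. The 3-billion-parameter count comes from the paragraph above; the bit-widths per weight are my own illustrative assumptions, not published Apple quantization numbers, and the math ignores activations, caches, and runtime overhead.

```swift
import Foundation

// Rough memory math for a ~3-billion-parameter on-device model.
// Bit-widths are illustrative assumptions, not confirmed Apple figures.
let parameters = 3.0e9

func weightFootprintGB(bitsPerWeight: Double) -> Double {
    parameters * bitsPerWeight / 8 / 1_073_741_824   // bits -> bytes -> GiB
}

for bits in [16.0, 4.0, 3.5, 2.0] {
    print(String(format: "%4.1f-bit weights ~ %.2f GB",
                 bits, weightFootprintGB(bitsPerWeight: bits)))
}
// 16.0-bit weights ~ 5.59 GB  (full fp16; far too large to keep resident)
//  4.0-bit weights ~ 1.40 GB  (inside the 750MB-2GB estimate above)
//  3.5-bit weights ~ 1.22 GB
//  2.0-bit weights ~ 0.70 GB
```

Only at aggressive quantization does the resident footprint land anywhere near that 750MB-to-2GB window, which is why memory compression and partial loading matter so much here.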
The current iPhone hardware doesn’t cut it
Earlier this month, I discussed how older iOS devices, as well as current-generation ones, aren’t powerful enough to handle on-device generative AI tasks. The base iPhone 15, which has only 6GB of RAM, may struggle to meet the demands of Apple Intelligence as it evolves and becomes more deeply integrated into iOS, core Apple applications, and developer applications. Older iPhones have 6GB of RAM or less.
The iPhone 15 Pro, with 8GB of RAM, may be better suited for these tasks. It is the only iPhone developers can use to test Apple Intelligence (besides their Macs and iPad Pros) before the iPhone 16 ships, presumably in October. However, many end users may still experience suboptimal performance on an 8GB device once Apple Intelligence is fully implemented.
Also: The best phones you can buy: Expert tested
Early adopters may find the AI features more useful to developers than to everyday users, as the system may need fine-tuning and updates to reach its full potential. I also expect Apple Intelligence to be a feature that end users can simply turn off, saving that memory for their applications, just as owners of the base iPhone 15 and earlier iPhones won’t have access to it at all when they upgrade to iOS 18.
The upcoming iPhone 16, despite possibly having more advanced hardware, may also struggle with the new AI capabilities due to design cycles that did not account for these features. It may take another product cycle or two before the hardware fully aligns with the new AI capabilities to be rolled out in iOS 18 and beyond. As a result, users may experience suboptimal performance and a less seamless user experience.
Why you shouldn’t buy the iPhone 16 for Apple Intelligence
For these reasons, I see the iPhone 16 (and potentially even the iPhone 17) as transitional products in Apple’s journey with on-device AI.
In addition to other silicon optimizations, future iPhones will likely require more RAM to fully support these AI features, which could drive up costs. If the base iPhone 16 needs 8GB of RAM to run Apple Intelligence effectively, the starting price could be pushed to $899 or higher, and the Pro models might require 12GB or even 16GB of RAM, raising their prices further. This would also mean a new A18 chip for the Pro models, while the base iPhone 16 might only get the current A17, although Apple could perhaps build an “A17X” paired with 10GB of RAM to give the phone more memory breathing room.
Also: Every iPhone model that will support Apple’s upcoming AI features (for now)
Besides memory concerns, AI processing demands a lot of power and additional computing resources. Without significant advancements in battery and power management technology, heavy on-device AI use could mean increased battery drain, reduced battery lifespan, and more frequent charging. The extra processing power needed to run on-device LLMs could also strain the CPU, causing the device to heat up and affecting its overall performance and reliability.
How Apple Intelligence will likely evolve
Apple’s AI capabilities are expected to improve significantly in the coming years. By 2025, we may see more advanced and dependable integration of Apple Intelligence not only on mobile devices and Macs, but also on products like the Apple Watch, HomePod, Apple TV, and a consumer-oriented version of the Vision headset.
To extend Apple Intelligence to these less powerful devices, Apple might lean on cloud-based resources, as it is already doing with its “Private Cloud Compute” feature, which runs secure Darwin-based servers in Apple’s data centers for more advanced LLM processing. Fully built-out data center capabilities and partnerships with companies like OpenAI or Google could serve the same role for less capable hardware.
Also: To rescue the Vision Pro, Apple must do these 3 things
Alternatively, Apple could consider a distributed or “mesh” AI processing system, in which idle devices within a household or enterprise assist less powerful ones with LLM queries.
Apple could achieve this by equipping MacOS 15 Sequoia, iOS 18, and iPadOS 18 with Apple Intelligence and the on-device LLM as planned. Subsequent changes to iCloud, iOS, iPadOS, and MacOS could enable all devices to communicate their generative AI capabilities and idle processing state. This would allow them to act as proxies for each other’s Apple Intelligence requests.
Enterprises may also employ a mobile device management solution to facilitate access to on-device LLMs running on company Macs. Additionally, iPhones or Macs could serve as proxies for Apple Watch or HomePod requests from mobile users. We may also see a more powerful Apple TV, with more onboard memory and processing, acting as an Apple Intelligence “hub” for every Apple device in a household.
Imagine your iPhone utilizing the unused processing power of your Mac or iPad, all equipped with on-device LLMs, to tackle complex AI tasks. This would increase the accessibility of AI features across Apple’s product range.
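To make the idea concrete, here is a purely hypothetical Swift sketch of how such a mesh might pick a proxy for a request. None of these types or APIs exist in any Apple SDK, and the devices and memory figures are invented for illustration only.

```swift
// Hypothetical sketch of the "mesh" idea: devices on the same account
// advertise their generative-AI capability and idle state, and a
// low-memory device hands its request to the most capable idle peer.
// Nothing here is a real Apple API; it only illustrates the concept.

struct PeerDevice {
    let name: String
    let ramGB: Int
    let hasOnDeviceLLM: Bool
    let isIdle: Bool
}

struct IntelligenceRequest {
    let prompt: String
    let estimatedRAMNeededGB: Int
}

/// Pick the idle peer with the most memory headroom that can serve the request.
func chooseProxy(for request: IntelligenceRequest, among peers: [PeerDevice]) -> PeerDevice? {
    peers
        .filter { $0.hasOnDeviceLLM && $0.isIdle && $0.ramGB >= request.estimatedRAMNeededGB }
        .max { $0.ramGB < $1.ramGB }
}

// Example household: a base iPhone offloads to an idle Mac.
let household = [
    PeerDevice(name: "iPhone 15",        ramGB: 6,  hasOnDeviceLLM: false, isIdle: false),
    PeerDevice(name: "iPad Pro (M4)",    ramGB: 16, hasOnDeviceLLM: true,  isIdle: false),
    PeerDevice(name: "MacBook Pro (M3)", ramGB: 36, hasOnDeviceLLM: true,  isIdle: true),
]

let request = IntelligenceRequest(prompt: "Summarize today's notifications",
                                  estimatedRAMNeededGB: 4)

if let proxy = chooseProxy(for: request, among: household) {
    print("Routing request to \(proxy.name)")   // -> MacBook Pro (M3)
} else {
    print("No idle peer available; fall back to Private Cloud Compute")
}
```

The hard parts, of course, would be everything this sketch leaves out: authenticating peers, keeping requests private on the local network, and hiding the extra latency from the user.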
Key points to consider before upgrading to the iPhone 16
Apple’s AI features are practically beta: Apple Intelligence is still in its infancy and might not provide the seamless experience Apple users expect. Apple Intelligence’s potential will be realized in future iterations with more mature hardware and software optimizations.
Hardware limitations: The iPhone 16 may struggle with Apple Intelligence’s demands due to design cycles that didn’t initially account for these features. The iPhone 16 is a transitional product, and it will likely take another product cycle or two before the hardware fully aligns with the new AI capabilities.
Battery and performance concerns: AI processing is power-hungry and could lead to increased battery drain and performance issues.
Broader enhancements: Consider the improvements in camera quality, display, and overall performance, not just the AI capabilities.
But I’m still optimistic
Despite the hype around Apple Intelligence, many other reasons exist to consider upgrading to the iPhone 16. Improvements in camera quality, display, and overall performance are still worth noting. The iPhone 16 will likely feature better sensors, enhanced computational photography, and superior video capabilities. The display might also see improvements in brightness, color accuracy, and refresh rate, making it a better device for media consumption and gaming.
Also: The 3 Apple products you shouldn’t buy this month (including this iPad)
However, if you’re considering the iPhone 16 solely for its AI capabilities — which are still evolving and unlikely to deliver the expected performance touted in the WWDC 2024 keynote — you might want to manage your expectations.