Why you shouldn’t buy the iPhone 16 for Apple Intelligence



[Image: iPhone 15 Pro surrounded by other phones. Credit: Prakhar Khanna/ZDNET]

In a recent ZDNET article, my friend and colleague David Gewirtz explained why he considers the upcoming iPhone 16, with its focus on iOS 18 and Apple Intelligence, an essential upgrade. While I value David’s perspective, I have a different take.

Also: The iOS 18 public beta is available for iPhone right now. How to download it

David argues that the incorporation of artificial intelligence (AI) in iOS 18 makes the iPhone 16 a necessary upgrade, emphasizing the potential of Apple Intelligence to revolutionize our interaction with devices. While I agree that Apple Intelligence has long-term potential, I’m not convinced that its first iteration will deliver the game-changing usability that many anticipate.

If anything, the latest iOS 18.1 betas with Apple Intelligence features have been underwhelming at best.

The annual upgrade ritual

Every year, my wife and I eagerly await the release of the new iPhones. Being part of Apple’s Upgrade Program, we return our devices, reset our loan with Citizens Bank, and acquire the latest model. Over the past few years, I have opted for the Pro Max, and my wife has chosen the base model. The expected annual improvements have been incremental but appreciated. 

Despite the buzz around the iPhone 16’s new features and the integration of Apple Intelligence, however, several concerns dampen my enthusiasm for upgrading this year.

Apple Intelligence: A significant, yet incomplete, leap forward

Apple Intelligence represents a significant leap in on-device AI capabilities, bringing advanced machine learning and natural language processing directly to our phones. Unlike typical iOS or MacOS feature upgrades, Apple Intelligence loads a downsized version of Apple's foundation model, a home-grown large language model (LLM) with approximately 3 billion parameters. While impressive, this is tiny compared to models like GPT-3.5 and GPT-4, which boast hundreds of billions of parameters. Even the smallest variant of Meta's open-source Llama 3, which you can run on a desktop computer, has 8 billion parameters.
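To put these parameter counts in perspective, a back-of-the-envelope calculation (parameters × bits per weight ÷ 8) shows why roughly 3 billion parameters is about the largest model that fits comfortably in a phone's RAM. This is a minimal sketch; the quantization levels shown are illustrative assumptions, not Apple's actual compression scheme.

```python
def model_memory_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight-storage footprint of an LLM in gigabytes."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A ~3B-parameter model at various (hypothetical) quantization levels:
print(model_memory_gb(3, 16))  # 16-bit weights: 6.0 GB -- far too large for a phone
print(model_memory_gb(3, 4))   # 4-bit weights: 1.5 GB -- a plausible on-device size
print(model_memory_gb(8, 4))   # Llama 3 8B at 4 bits: 4.0 GB -- half a phone's RAM
```

The 4-bit figure lands inside the 750MB–2GB range discussed below, which is why aggressive quantization and memory compression are central to making on-device AI viable at all.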

Also: I broke Meta’s Llama 3.1 405B with one question (which GPT-4o gets right)

The iOS 18.1 Developer Beta, released at the end of July, introduced the first Apple Intelligence features, but they've been modest so far. These include Writing Tools, which allow users to rewrite emails, texts, or letters in different tones, proofread content, and summarize or format it with tables or bullet points. Another feature provides webpage, email, or text summaries — a time-saver that cuts straight to the content. Users can also ask the Photos app to search for specific images, such as those showing someone holding a phone.

However, the integration of Apple Intelligence is not without challenges. The model, when running, may occupy between 750MB and 2GB of RAM, depending on how effective Apple’s memory compression technology is. This substantial allocation of memory to a core OS function that won’t always be used means that parts must be dynamically loaded in and out of memory as required, introducing new system constraints and potentially putting additional stress on the CPU.

Also: How to run dozens of AI models on your Mac or PC – no third-party cloud needed

More advanced AI features, such as Genmoji for creating custom emoji, the Image Playground for on-device image creation, and ChatGPT integration, have yet to be included in the beta. Siri has received a minor UI update and a more conversational tone, but significant improvements are expected later this year. According to Bloomberg’s Mark Gurman, additional Apple Intelligence features will be introduced in upcoming iOS 18.1 Beta releases, with the stable version possibly arriving in October.

New hardware leaks: what to expect from the iPhone 16

Earlier, I discussed how older — as well as current-generation — iOS devices aren't powerful enough to handle on-device generative AI tasks. The base iPhone 15, with only 6GB of RAM, would struggle to meet the demands of Apple Intelligence as it evolves and becomes more deeply integrated into iOS, core Apple applications, and third-party apps. Indeed, iPhones with 6GB of RAM or less are not eligible to run Apple Intelligence in current iOS 18.1 builds.

Thanks to their 8GB of onboard RAM, only the iPhone 15 Pro and iPhone 15 Pro Max can run Apple Intelligence — and even there, it is strictly a beta feature that can be enabled or disabled in Settings.

Also: iOS 18.1 update: Every iPhone model that will support Apple’s new AI features (for now)

Recent leaks have spilled the specifications of the iPhone 16 lineup, providing a clearer picture of what to expect. The base iPhone 16 is expected to feature the A18 processor with 8GB of RAM, while the iPhone 16 Pro models might sport the A18 Pro with up to 12GB of RAM. This increase in memory is crucial, considering the demanding nature of Apple Intelligence features. However, whether it will fully address performance concerns remains to be seen.

Interestingly, despite these hardware upgrades, Apple appears to be keeping prices similar to those of the iPhone 15 series, with the base iPhone 16 starting at $799. The iPhone 16 Pro, however, is rumored to start at $1,099, a $100 increase from its predecessor, likely due to the additional storage and upgraded components. The Pro models are also expected to introduce Wi-Fi 7 connectivity, a new telephoto lens, and larger screens — 6.3 inches for the Pro and 6.9 inches for the Pro Max.

Also: Apple Intelligence will improve Siri in 2024, but don’t expect most updates until 2025

Despite these upgrades, the iPhone 16 may still face challenges due to design cycles that didn’t fully account for the scope of Apple Intelligence’s capabilities. As a result, users may experience suboptimal performance and a less seamless user experience, especially as more AI features roll out in subsequent updates.

Why you shouldn’t buy the iPhone 16 for Apple Intelligence

Besides memory concerns, AI processing demands significant power and additional computing resources. Without meaningful advancements in battery and power management technology, users can expect faster battery drain, more frequent charging, and reduced battery lifespan over time. The extra processing power needed to run on-device LLMs could also strain the CPU, causing the device to heat up and affecting its overall performance and reliability.

Also: How iOS 18 changes the way you charge your iPhone

For these reasons, I see the iPhone 16 (and potentially even the iPhone 17) as a transitional product in Apple’s journey toward on-device AI. 

How Apple Intelligence will likely evolve 

Apple’s AI capabilities are expected to improve significantly in the coming years. By 2025, we may see more advanced and dependable integration of Apple Intelligence not only in mobile devices and Macs, but also in products like the Apple Watch, HomePod, Apple TV, and a consumer-oriented version of the Vision headset.

To extend Apple Intelligence to these less powerful devices, Apple might lean on cloud-based resources, as it is already doing with "Private Cloud Compute" — secure, Darwin-based servers in its own data centers that handle more demanding LLM processing — supplemented by fully developed data center capabilities and partnerships with companies like OpenAI or Google.

Also: To rescue the Vision Pro, Apple must do these 3 things

Alternatively, Apple could consider a distributed or “mesh” AI processing system, where idle devices within a household or enterprise can assist less powerful ones with LLM queries.

Apple could achieve this by equipping MacOS, iOS, and iPadOS with Apple Intelligence and the on-device LLM as planned. Subsequent changes could enable all devices to communicate their generative AI capabilities and idle processing state. This would allow them to act as proxies for each other’s Apple Intelligence requests. 

Enterprises may also employ a mobile device management solution to facilitate access to on-device LLMs running on business Macs. Additionally, iPhones or Macs could serve as proxies for Apple Watch or HomePod requests from mobile users. We may even see a more powerful Apple TV, with more onboard memory and processing, act as an Apple Intelligence "hub" for every Apple device in a household.

Imagine your iPhone using the unused processing power of your Mac or iPad, all equipped with on-device LLMs, to tackle complex AI tasks. This would increase the accessibility of AI features across Apple’s product range.
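The core of such a mesh scheme would be a routing decision: which nearby device, if any, is idle and has enough free memory to host the model for a given request? No such Apple API exists today; the `Device` class and `pick_proxy` function below are purely hypothetical, a sketch of what that selection logic might look like.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Device:
    name: str
    idle: bool          # is the device currently unused?
    ram_free_gb: float  # memory available for LLM inference

def pick_proxy(devices: list, required_gb: float) -> Optional[Device]:
    """Pick the idle device with the most free RAM that can host the model.

    Returns None if no device qualifies, in which case the request would
    run locally or fall back to a cloud service.
    """
    candidates = [d for d in devices if d.idle and d.ram_free_gb >= required_gb]
    return max(candidates, key=lambda d: d.ram_free_gb, default=None)

# A hypothetical household: the busy iPhone offloads to the idle Mac.
household = [
    Device("iPhone", idle=False, ram_free_gb=1.0),
    Device("iPad",   idle=True,  ram_free_gb=3.0),
    Device("Mac",    idle=True,  ram_free_gb=12.0),
]
proxy = pick_proxy(household, required_gb=2.0)
print(proxy.name if proxy else "no proxy available")  # the Mac wins
```

In practice, a real implementation would also weigh network latency, battery state, and privacy boundaries — details this toy example deliberately ignores.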

I’m still optimistic

Despite the hype around Apple Intelligence, there are many other reasons to consider upgrading to the iPhone 16, like improvements in camera quality, display, and overall performance. The iPhone 16 will likely feature better sensors, enhanced computational photography, and superior video capabilities. The display might also see improvements in brightness, color accuracy, and refresh rate, making it a better device for media consumption and gaming. 

Also: Apple may be cooking something big with its new Game Mode. Here are 3 things we know

If, however, you’re considering the iPhone 16 solely for its AI capabilities — which are still evolving and unlikely to deliver the expected performance touted in Apple’s WWDC 2024 keynote — you might want to manage your expectations.

This article was originally published on June 28, 2024, and updated on August 27, 2024. 




