A Billion Pixels a Second: I Got a Rare Look Inside Apple’s Secret iPhone 16 Camera Labs



I’m standing on a wire mesh floor in a secluded room at Apple’s headquarters in Cupertino, California. The “floor” I’m on is suspended over a pit filled with 4-foot foam wedges. The walls and ceiling are also covered with large triangular cuts of foam that remove virtually any echo. When I clap, there’s just the muffled sound from the impact of my hands. It’s oh so quiet. If you’re into the whole sensory deprivation thing, you’d be right at home in this strange room.

It’s called a long wave anechoic chamber, and it’s where Apple tests and calibrates the microphones on the iPhone 16. Earlier in December, I got a rare inside look at the testing chamber along with several other secret labs that Apple uses to test and calibrate the iPhone 16’s audio and video features.

This GIF shows us entering Apple’s anechoic chamber, where it tests the iPhone 16.

Celso Bulgatti/CNET

For years, the iPhone’s ability to record video with excellent image quality has been the high-water mark among phones. Notable directors like Steven Spielberg, Zack Snyder, Steven Soderbergh and Rian Johnson have shot feature-length movies, music videos or short films entirely on an iPhone.

But the iPhone’s dominance in terms of its video prowess has been threatened recently as competitors like Samsung catch up. 4K video recorded with the Galaxy S24 Ultra looks incredible in terms of image quality and versatility. In 2023, Google launched its Video Boost feature on the Pixel 8 Pro that processes video in the cloud after you record it to improve how it looks, even in low light.

That was all before the iPhone 16 Pro dropped. In my review I appreciated its new slow-motion video capture, superb speakers and camera upgrades, and I’m not the only one.

Watch this: I Got an Inside Look at Apple’s iPhone 16 Audio and Video Labs

In a CNET survey from August, 38% of people said better cameras are a main motivation for buying a new phone. At a time when more people than ever are recording videos with their phones, not just Hollywood directors, Apple took the iPhone’s video capabilities to another level by bringing slow motion to parity with regular video in terms of image quality and dynamic range. It also complemented the iPhone’s support for spatial audio by introducing a new Audio Mix feature that lets iPhone owners adjust the audio on the videos they record. You can make it seem like your subject is wearing a lavalier mic or sitting in a professional recording studio, all by moving a few sliders in the Photos app.

It’s fascinating to see the amount of time, testing and calibration Apple poured into the iPhone 16 series’ cameras and microphones, and into developing unique new features like Audio Mix.

Apple’s anechoic chamber transforms the iPhone’s mics 

Pairs of foam wedges alternating at 90-degree angles stick out of a wall

A portion of one of the walls in the anechoic chamber.

Celso Bulgatti/CNET

The first stop on my behind-the-scenes tour was that anechoic chamber with all the foam. By comparison, this chamber makes a library sound as loud as a subway station. It’s where the iPhone 16’s microphones get tested and characterized, which drives further audio development down the chain. Apple, like all phone makers, faces the challenge of making something that fits in your pocket yet can also capture pristine sound and play it back the way you heard it.

“The iPhone is such a ubiquitous recording device and gets used in so many different environments that we want to make sure that we’re able to capture the memory that our users are trying to capture in the truest form,” Ruchir Dave, senior director, acoustics engineering at Apple, told me on the tour.

The iPhone 16 has four microphones. But compared with a regular mic, like the lavalier a newscaster might wear, the ones on the iPhone are tiny. So to get them to pick up sound with the quality of a much larger microphone, or one pinned to someone’s shirt, Apple had to do some clever engineering, which started with testing what the iPhone 16’s mics actually pick up.

An iPhone 16 Pro with its back removed, showing internals including the four mics

The iPhone 16 Pro has four microphones (the gold rectangles).

Apple

“The approach we took was to go after both quality as well as utility. And as part of that, we developed a novel microphone component that allows us to deliver some of the best acoustic performance in a phone product,” said Dave. “At the same time, [we] developed a feature like Audio Mix that gives users the flexibility to be able to capture different sounds and gives you that creative freedom in the edit to adjust it how you like.”

And that brings me back to the anechoic chamber for a chime test. An array of roughly two dozen speakers is mounted on an arc-shaped pipe that runs from under the wire mesh floor to the ceiling of the chamber. The speakers play a series of chimes, and engineers measure what the iPhone 16 Pro’s mics pick up. The phone, mounted on a stand atop a turntable base, rotates a few degrees clockwise and the chimes play again. This continues until the iPhone has rotated through a complete circle.

A room with large triangular cuts of foam on the walls, ceiling and floor

The anechoic chamber I visited had foam wedges on the walls, ceiling and underneath a suspended wire mesh floor. The chamber is used to test the iPhone 16’s mics.

Patrick Holland/CNET

The result is a spherical sound profile for each mic, built from the data recorded in that anechoic chamber. Apple uses these profiles as the foundation for spatial audio and for software that can reduce wind noise or make iPhone-recorded audio act and sound like different kinds of microphones, such as a lavalier mic or an in-studio mic for voiceover.
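To make that concrete, here’s a minimal sketch of how a rotating chime sweep could be assembled into a directional profile. Everything in it is an assumption for illustration: the chime frequencies, the rotation step and the stand-in measurement function are hypothetical, not Apple’s actual test rig or data.

```python
import math

# Hypothetical sketch of the rotating chime test described above.
# The frequencies, step size and measurement model are all invented.
CHIME_FREQS_HZ = [250, 1000, 4000, 8000]  # test tones (assumed)
STEP_DEG = 5                              # rotation per measurement (assumed)

def measure_mic_response(azimuth_deg: float, freq_hz: float) -> float:
    """Stand-in for a real measurement: the level (in dB) one mic picks
    up for a chime arriving from this angle. Here we fake a mic whose
    rear pickup rolls off more at higher frequencies, since the phone's
    body shadows short wavelengths more than long ones."""
    shadow = (1 - math.cos(math.radians(azimuth_deg))) / 2  # 0 front, 1 rear
    return -6.0 * shadow * (freq_hz / 8000)  # up to -6 dB at 8 kHz behind

# Sweep the phone through a full circle, one chime set per step,
# building a directional profile: angle -> per-frequency response.
profile = {}
for azimuth in range(0, 360, STEP_DEG):
    profile[azimuth] = {f: measure_mic_response(azimuth, f) for f in CHIME_FREQS_HZ}

print(f"{len(profile)} measurement angles; at 180 degrees: {profile[180]}")
```

Repeat that sweep at multiple elevations (the speaker arc covers floor to ceiling) and you get the full sphere the profile is named for.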

“We want to enable that [Audio Mix] feature as if you record it on a lapel mic,” Dave explained. “We use machine learning algorithms as well as our tuning chains to come up with that signature sound that you’re able to get even with lapel mics.”

In real life, if I’m using a dedicated camera to record an interview, I can put a wireless lapel mic on a person so it’s close to their mouth. That way the mic picks up their voice as much as possible. But when I film with an iPhone, the mics are where the phone is and can’t be physically closer. And that’s what Dave and his team set out to solve.

“We’ve been doing this development over a number of years,” said Francesca Sweet, director of iPhone product marketing at Apple. “So much of the machine learning capabilities that we have today are built on years of experience and expertise that we’ve been developing.”

And it shows. When I tested the iPhone 16 Pro, I was surprised by just how well Audio Mix worked on videos I recorded of my friends talking. It’s not a magic bullet by any means. But when I think back to my days filming low-budget commercials and short films in Chicago with an iPhone 5, recording separate audio on an external recorder and syncing it later, something like Audio Mix would have saved so much time and stress.

There’s more than just a man with a golden ear

A man with headphones on listening to video clips being played on a monitor

Apple conducts perceptual audio testing with a variety of people and uses their feedback to help tune the audio recording and playback on the iPhone 16.

Celso Bulgatti/CNET

But the testing doesn’t stop there. My next stop on Apple’s lab tour was where comparative playback tests are conducted to help tune the iPhone’s audio. After a maze-like walk from the anechoic chamber, I end up in a hallway with a couple of mini studios. The rooms are soundproof and each has a chair and desk with a Mac Studio, a Studio Display and a pair of AirPods Max headphones on it. 

As opposed to just having one person with a good ear tune the iPhone’s audio, Apple has a number of testers take a perceptual audio test and then uses those results to calibrate what you hear played back on your iPhone. I even got to be one of these testers while I was there and ran through a portion of the experience.

“It is not just Ruchir with his golden ear that’s sitting in there and dictating how it should sound,” emphasized Sweet. “We really want to make sure that anyone who’s taking advantage of this feature [Audio Mix] is going to appreciate it and enjoy it.”

I sat down at the desk and put the AirPods Max on my head. I played a video clip that had two audio tracks and could switch back and forth between them, rating the audio as bad, fair, good, excellent and so on. It was that simple. The first video had a group of people on a sidewalk in a big city. The second was a selfie video that a man shot under an umbrella while walking through the rain. In one of the selfie video’s audio tracks, the wind noise was very present. In the other track I could hardly hear it.

Apple uses comparative testing like this in much the same way an eye doctor has you choose between two lenses during an exam. Without something to compare a recording against, it’s harder to evaluate. The results from the perceptual testing help shape how different aspects of the iPhone 16 Pro’s audio work, including Audio Mix.
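Here’s a tiny sketch of how ratings from sessions like mine might be tallied afterward. The rating scale, clip names and averaging rule are invented for illustration; Apple hasn’t published its actual methodology.

```python
# Hypothetical aggregation of comparative listening-test ratings.
# Scale, clips and data are invented; this is not Apple's pipeline.
RATING_SCALE = {"bad": 1, "poor": 2, "fair": 3, "good": 4, "excellent": 5}

# Each tester rates both audio tracks of the same video clip.
ratings = [
    {"clip": "rain_selfie", "track_a": "excellent", "track_b": "fair"},
    {"clip": "rain_selfie", "track_a": "good", "track_b": "bad"},
    {"clip": "city_sidewalk", "track_a": "fair", "track_b": "good"},
]

def mean_scores(results):
    """Average each track's scores per clip, so engineers can see which
    tuning listeners preferred and by how much."""
    by_clip = {}
    for r in results:
        tracks = by_clip.setdefault(r["clip"], {"track_a": [], "track_b": []})
        tracks["track_a"].append(RATING_SCALE[r["track_a"]])
        tracks["track_b"].append(RATING_SCALE[r["track_b"]])
    return {clip: {t: sum(v) / len(v) for t, v in tr.items()}
            for clip, tr in by_clip.items()}

print(mean_scores(ratings))
# {'rain_selfie': {'track_a': 4.5, 'track_b': 2.0}, 'city_sidewalk': ...}
```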

Audio Mix on the iPhone 16 Pro

The Photos app on the iPhone 16 Pro has a tool called Audio Mix that lets you change the quality of the audio on videos you record.

Celso Bulgatti/CNET

“The idea behind development of Audio Mix is the iPhone gets used in all sorts of different scenarios and all sorts of different soundscapes and all sorts of different environments,” said Dave. “We want to provide that flexibility to our users to be able to capture the sound they would like in those scenarios, rather than saying, ‘This is what we think you should capture the sound,’ and hard coding it.”

Of course, Dave also pointed out that you don’t need to be a video nerd (my words, not his) to have good audio in your iPhone videos.

“We’re not expecting every user to go in and edit the videos and change the sliders,” said Dave. “If you shot it the way we intended it to be shot, it should sound amazing. And if you still want to change it the way you would like, you have the full freedom.”

A private Dolby Atmos theater for iPhone videos

A theater screen with an Apple logo projected on it

When I first entered Apple’s video verification lab, there was only an Apple logo on the theater’s screen.

Celso Bulgatti/CNET

My last stop was the video verification lab. If the anechoic chamber is about minimizing noise and stimuli, this lab, with its movie theater-sized screen, is all about the opposite. I wasn’t able to see every corner of the theater or the control booth, but imagine having your own private Dolby Atmos theater to watch videos that you recorded on the iPhone.

“We use this theater to tune the video playback experiences so that when you play back these videos in a dark room, in an office environment or even under the sun, that you get the same perceptual experience you will get as if you’re watching a video in the theater,” Sean Yang, director of video engineering at Apple, told me.

Apple calibrates every iPhone’s display at the factory to make sure that the color is accurate, the brightness uniformity is good and the peak brightness matches its specs. But Yang and his team are focused on how videos look when you play them back no matter where you are. Unlike a TV or monitor, a phone screen has to contend with outdoor lighting, including direct sunlight, which can overwhelm what you’re watching. The iPhone uses its ambient light sensor to adapt video playback to the lighting environment.
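To illustrate the idea, here’s a toy sketch of ambient-adaptive playback: as the ambient light reading rises, the tone curve flattens so shadow detail survives sunlight. The lux mapping and gamma values are made up, and this is not Apple’s actual (unpublished) pipeline.

```python
import math

# Toy ambient-adaptive tone mapping. Thresholds and curves are invented.
def playback_gamma(ambient_lux: float) -> float:
    """Pick a display gamma from ambient light: a dim room keeps the
    standard 2.2, direct sun lifts shadows with a flatter curve."""
    # Map roughly 1 lux .. 100,000 lux onto 0 .. 1 on a log scale.
    t = min(max(math.log10(max(ambient_lux, 1.0)) / 5.0, 0.0), 1.0)
    return 2.2 - 0.6 * t  # interpolate toward a flatter curve outdoors

def adapt_pixel(value: float, ambient_lux: float) -> float:
    """Re-encode a normalized pixel (0..1) for the current environment."""
    return value ** (playback_gamma(ambient_lux) / 2.2)

for lux in (50, 1_000, 30_000):  # dark room, office, near-direct sun
    print(f"{lux:>6} lux: shadow pixel 0.2 -> {adapt_pixel(0.2, lux):.3f}")
```

The same 20% gray gets displayed progressively brighter as the environment gets brighter, which is the gist of what an ambient-light-aware playback path has to do.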

Like the perceptual testing that tunes the audio, Apple has people share their feedback on videos, so it’s not just one person’s opinion of how videos should look when played back on an iPhone.

“We actually have a group of experts at Apple to look at this video. If we have a difference of opinion, we get them together, we debate,” explained Yang. “Oftentimes you need to have some trade-offs, and so we consulted many experts within Apple to make sure that the video comes through as highest quality, no matter where you played it.”

A movie screen and iPhone 16 Pro screen playing back the same video

I compare a reel of various video clips on an iPhone 16 Pro (center) and on the video verification theater’s screen.

Celso Bulgatti/CNET

I got to see a demo of how the theater’s screen mimics video playback on an iPhone 16 Pro’s screen. There was a mix of clips from films and animated movies, as well as videos recorded on the iPhone 16 Pro. And the reel included clips from one of my favorite new iPhone 16 Pro features: the ability to record and play back 4K 120fps slow-motion video.

“4K 120 is a massive amount of it [data]. If you think about it, it’s 1 billion pixels per second,” said Yang.

I did the math. 4K video has 8,294,400 pixels per frame (3,840 x 2,160 pixels). At 120 frames per second, that’s 995,328,000 pixels every second. I don’t know what’s more impressive, the amount of data the iPhone 16 Pro needs to process to get 4K 120fps to work or how great the image quality and dynamic range are in the resulting slow-motion video. 
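If you want to check that math in code, it fits in a few lines:

```python
# The arithmetic from the paragraph above. No assumptions, just math.
width, height, fps = 3840, 2160, 120
pixels_per_frame = width * height           # 8,294,400
pixels_per_second = pixels_per_frame * fps  # 995,328,000, about a billion
print(f"{pixels_per_frame:,} pixels/frame x {fps} fps = {pixels_per_second:,} pixels/second")
```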

A frame from an iPhone 16 Pro Slow Motion video

Here is a still pulled from an iPhone 16 Pro slow-motion video.

Patrick Holland/CNET

“What’s important is you can’t really fake it, right? So if you’re capturing at 120 frames per second, and you’re slowing it down to half speed or quarter speed, you really are going to see every single frame in its full detail. And so any kind of artifacts or things like that are going to come through,” said Sweet.

Even if you don’t film a single frame of slow motion, you benefit from the work that Yang and his team do every day on video playback, whether you’re watching a Disney Plus series like Star Wars: Skeleton Crew or just a selfie video you recorded with some old friends toasting to the good times.

Watch this: iPhone 16 Pro 4K 120fps Slow Motion Video Test

Apple’s labs are only as impressive as what comes out of them

An iPhone 16 Pro is on the left with the theater's screen in the background.

Here’s my point-of-view during a video verification test. The iPhone 16 Pro is on the left with the theater’s screen in the background.

Patrick Holland/CNET

As I reflect on the labs I saw and the engineers I met, I keep coming back to something Sweet told me at the end of the day.

“We try to make it as seamless as possible for a user to interact with these really powerful tools that allow you to manipulate the audio or video and service them in a way that’s really accessible,” said Sweet. “But there is an inordinate amount of engineering work that goes in to make them that simple.”

It’s one thing for Apple to show off the effort and time its engineers have spent to get features like Audio Mix or 4K slow motion to work, but it’s another to be able to use them in an easy and consistent way.

It’s a fascinating perspective to keep in mind the next time you’re filming a short film with your iPhone or just recording your kids playing in the living room.




