I’m in a dimly lit cafe, seated across from a couple of people who are deciding whether to order coffee, wine or mixed drinks. A Google employee enters holding a strange rig with two phones mounted on it: a Pixel 8 Pro and a Pixel 7 Pro. The employee walks over to the duo’s table, which is lit by a candle and string lights, and starts filming them with both phones at the same time.
Sadly, this cafe doesn’t serve actual cappuccinos or old-fashioneds — it’s part of an elaborate environment where Google tests the cameras in its Pixel phones. It’s called Google’s Real World Testing Lab, and the Pixel camera team invited me and CNET colleague Lexy Savvides to learn about its work to improve video recordings on Google’s flagship devices. We’re the first members of the media who’ve been granted access to the lab.
Instead of large calibration charts, industrial machines, and employees in white lab coats, there’s a living room set, the aforementioned cafe, and employees wearing retro Jordan sneakers. The lab looks more like a cluster of Ikea displays than a testing room. There are other, secret areas, which we weren’t permitted to enter.
Above the cafe is a lighting grid with fixtures, which gives the lab a bit of a TV studio feel. Each fixture has an adjustable color temperature and intensity so engineers can create the desired ambiance for different tests. The lighting and realistic sets (there’s even a toy dog in the living room set, in lieu of an actual pet) allow the team to re-create a number of scenarios, from a living room awash in late evening light to a sunrise pouring in through the windows of a cafe.
“Real people take pictures in places like living rooms and cafes,” said Isaac Reynolds, Google’s group product manager for Pixel Camera.
The employee who was filming in the cafe earlier with the double-Pixel setup is Jesse Barbon, an engineering technician. He was recording two other Google employees, Kenny Sulaimon and Kevin Fu, both product managers for Pixel Camera, who played the role of cafe patrons at the table. They were all demonstrating a low-light video test designed to show off the differences between video recorded on the Pixel 8 Pro and the older 7 Pro. Take a look at the video that accompanies this story to see the clips from both phones, as well as more of the lab. But yes, the Pixel 8 Pro’s video from this dimly lit test is noticeably better.
“We need to be able to test cameras day in and day out,” Reynolds said. “Morning, night, whenever, we need to test a new feature and we can’t always depend on being able to access our own living rooms to test in just the right lighting.”
The lab’s controlled environment allows technicians like Barbon to test the same scenarios repeatedly to ensure that Pixel phones deliver consistent results. The team wouldn’t have the same control if they ran tests in a Google campus cafe, because the lighting might be different depending on the day, or they might not have access to the same spot to repeat a test in the same exact conditions.
The work the camera team does in this lab is all in pursuit of making Pixel video recordings look better. It's a daunting task: for a long time, photos from Pixel phones were best in class, but videos lagged behind.
Smartphone cameras have become vital to our lives. They capture important personal moments, letting us revisit them for decades to come. They also play a significant role in documenting history and current events, as we've seen numerous times over the past few years with videos like the one of George Floyd's arrest and murder, which earned Darnella Frazier, who recorded the killing with her phone, a Pulitzer special citation in 2021.
Google is a hugely influential tech company, so its choices carry weight and have repercussions beyond the products it makes. It's important that the Pixel's cameras are tested in conditions that replicate those in the real world, as in this lab, so people who own Google's phones know they can use them to effectively chronicle their surroundings, whether in photos or video.
Reynolds and his team walk us through how they relentlessly test the Pixel's cameras and how they've improved the quality of video recordings to make them look and sound better. What struck me during my time in the lab is that Google isn't chasing a scientific ideal for how a video should look. To the Pixel camera team, it's as much about feel as it is about precision.
“Just producing the correct image doesn’t always mean it’s the right one. There’s always a difference between how you remember a moment, how you want to remember it, and maybe what the color chart said it was. And there’s also a balance to find in there,” Reynolds said.
Video Boost and Night Sight
Have you ever taken a photo and shot a video in a dimly lit space, like a restaurant? The photo comes out looking great, especially if you used night mode, but the video looks just OK by comparison. This isn’t a problem unique to Google; every phone maker faces it. For years, the same computational photography algorithms used to make your photos look better didn’t work with video.
Night Sight mode on Pixel phones combines data from multiple images into a single photo that's brighter, shows more detail and has little to no image noise. But doing the same thing for video requires an entirely different scale.
Reynolds said it comes down to two numbers: 12 and 200.
“We introduced the original Night Sight feature years ago to help you take ultra-low-light photos, but it was always a struggle to bring it to video because it’s the difference between processing 12-megapixel pictures and over 200 megapixels per second of video,” Reynolds said.
Processing a one-minute low-light video is the equivalent of processing 1,800 photos (60 seconds x 30 frames per second).
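For a rough sense of what those numbers mean, here's a quick back-of-the-envelope sketch in Python. It assumes 4K video at 30 frames per second and an eight-frame merge purely for illustration; Google hasn't published the actual resolutions or frame counts its pipeline uses.

```python
# Two things at once: (1) the scale gap Reynolds describes between photo and
# video processing, and (2) why merging frames helps in the dark at all.
# Resolutions and noise levels here are illustrative assumptions, not Google's figures.

import numpy as np

# (1) Rough throughput math: a 12 MP still vs. 4K video at 30 fps.
photo_mp = 12.0
video_mp_per_second = (3840 * 2160 / 1e6) * 30   # ~249 MP every second, i.e. "over 200"
print(f"One 12 MP photo vs. ~{video_mp_per_second:.0f} MP per second of 4K video")
print(f"A one-minute clip holds {30 * 60} frames to process")

# (2) Averaging several aligned, noisy captures of the same scene cuts random
# noise roughly by the square root of the frame count.
rng = np.random.default_rng(0)
true_scene = np.full((100, 100), 0.2)                          # a dim, flat patch
frames = true_scene + rng.normal(0, 0.1, size=(8, 100, 100))   # 8 noisy captures
merged = frames.mean(axis=0)

print(f"Noise in a single frame:   {np.std(frames[0] - true_scene):.3f}")
print(f"Noise after merging eight: {np.std(merged - true_scene):.3f}")
```

The second half of that sketch is the basic reason multi-frame techniques like Night Sight work at all: random sensor noise averages out as you stack more frames of the same scene, but video hands you 30 new frames to stack every second.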
Last fall, Google announced Video Boost: a new Pixel 8 Pro feature that uploads a copy of a video you shot to Google Photos, where the processing is done in the cloud. Video Boost adjusts the exposure, brightens shadows, and improves the color and detail using the same HDR Plus algorithm used for photos. It works on videos recorded in bright light as well as in low light (which Google calls Night Sight video).
In my experience, Video Boost works well, especially for low-light situations. The catch is that all the processing is done later, off-device, and it can take a while before you can see the results. And that’s clearly not as convenient as a Night Sight photo, which applies its HDR Plus algorithm on the device in a matter of seconds.
In another setup in the cafe, two employees play a game of Monopoly, lit by a candle. On the table next to the game is a color chart, a stuffed-animal cat "sleeping," and a ball made of straw. Dim lighting wreaks havoc on any camera — colors can look inaccurate, and textures (like the cat's fur, the strands of the straw ball and the words on the Monopoly board) can appear muddy and soft.
The color chart on the table is calibrated, so team members know how colors are supposed to appear as they film under different lighting conditions. But that’s only half the process. What’s compelling is how Reynolds and team chase the appropriate feel of a scene. Does it match what you see with your eyes, and the memory you have of that moment later? As more of the tools on our phones lean on algorithms, machine learning and AI, it’s refreshing to see how human and subjective this process is.
Autofocus, exposure and grease
Did you know that your phone's camera has grease inside? A camera's lens is made up of elements that move back and forth to autofocus, and grease lubricates that movement. It turns out the lab has another benefit: It allows the team to use their phones just as they do in real life, setting them down and picking them back up to take a photo or video.
Many of us set our phones flat on a table or counter. When we do, the lens elements in the cameras settle toward the back, and the grease on their rails pools there. So when you pick up your phone to take a photo or video, the lens elements have to move forward, and there's all that grease to contend with.
“As the lens is moving focus to where it wants to be, all the grease on that rail has pooled at the back. So you’re sort of pushing the grease and the lens,” explained Reynolds.
With the Pixel, Google wants the camera experience for a user to be consistent, whether the phone has been in your pocket or lying flat on a table.
But there are other considerations when it comes to autofocus and exposure for videos. Unlike with a photo, a video's subject might move, or the lighting might change, during the course of a recording. The phone has to make a number of decisions for both the exposure and the autofocus. And as with Google's approach to color accuracy, there's a difference between being merely technically correct and having a video that captures the spirit and feeling of a moment.
“You don’t want things like exposure and focus to waver back and forth. You want them to walk on and be very confident, very stable,” said Reynolds. “Because we can also change the lighting conditions [in the lab], we can change the scene in controlled ways and make sure the camera stays locked on to the right focus and the right exposure.”
Let's go back to the cafe scenario with the two Google employees choosing what to drink. The scene has a mix of lighting from the string lights and the candle on the table. The Pixel has to choose an exposure that works for both people, no matter their complexions, and also pick one of them to focus on.
And then there’s the candlelight, which, as it turns out, can be particularly tricky to deal with.
“A candle is a very small point of incredibly bright light. And worse than that, it casts different shadows across the whole room as it moves,” said Reynolds. “You have to make sure that the flickering of the candle doesn’t cause flickering of the exposure. You have to make the camera confident.”
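Google hasn't said how the Pixel's auto-exposure logic is built, but the "confidence" Reynolds describes maps loosely onto a classic pattern: ignore brief swings in brightness and only follow a change once it persists. Here's a minimal, purely illustrative sketch of that idea; every name and threshold in it is an assumption, not Google's code.

```python
# Illustrative auto-exposure damping: ignore brief brightness flicker
# (like a guttering candle) and only track changes that persist.
# The thresholds and smoothing factor are made up for this example.

class StableExposure:
    def __init__(self, flicker_tolerance=0.15, hold_frames=10, smoothing=0.1):
        self.flicker_tolerance = flicker_tolerance  # relative change treated as flicker
        self.hold_frames = hold_frames              # frames a change must persist
        self.smoothing = smoothing                  # how quickly exposure follows a real change
        self.target = None
        self.pending_frames = 0

    def update(self, measured_brightness):
        if self.target is None:
            self.target = measured_brightness
            return self.target

        change = abs(measured_brightness - self.target) / self.target
        if change < self.flicker_tolerance:
            # Small, flickery variation: hold exposure steady.
            self.pending_frames = 0
        else:
            # Large variation: only follow it once it has stuck around.
            self.pending_frames += 1
            if self.pending_frames >= self.hold_frames:
                self.target += self.smoothing * (measured_brightness - self.target)
        return self.target
```

Fed a per-frame brightness reading, something shaped like this would hold exposure steady through a candle's flicker but still adapt when, say, someone switches on the room lights.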
The team provided several demonstrations to show off how the Pixel handles autofocus as well as exposing a video properly and adapting to changes in the lighting. We not only got to see this in the controlled lab environment, but we also went outdoors to the Halo Pavilion outside Google's headquarters.
Each scenario, indoors and out, had a choreographed routine designed to challenge the Pixel 8 Pro and 7 Pro mounted on Barbon's phone rig. The subjects would walk and turn their heads, or move their hands closer to the Pixels that were recording.
Audio in video
A photo doesn’t have sound, but a video is only as good as its audio. We leave the comforts of the lab’s cafe and settle into the living room set, which comes complete with a mannequin relaxing on a comfy chair. Fu walks us through how the team approaches audio in videos.
For years, the standard way to improve audio was frequency tuning. If you're recording a video of a person talking while it's windy outside, it can be hard to hear what the person is saying.
“For example, if we want to get rid of the wind, we would say, ‘OK, let’s tweak the frequency so that we don’t pick up wind as much as possible.’ But speech is also low frequency,” explained Fu.
Frequency tuning is a one-size-fits-all approach, and the results are rarely ideal: while it's reducing the wind noise, it also changes the way a person's voice sounds. So Fu and his team focused on training an AI model to identify speech.
“Once we can identify that speech, we can preserve that speech portion of the audio, and then reduce the nonspeech one,” Fu said.
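Fu didn't detail the model itself, but the pattern he describes, keep the speech and turn down everything else, is commonly implemented as a mask applied to the audio's time-frequency representation. Here's a simplified sketch of that general approach; the crude energy threshold below stands in for whatever Google's trained speech model actually predicts.

```python
# Simplified speech-enhancement pattern: estimate a per-time, per-frequency
# "speech mask," keep the speech energy, and attenuate everything else.
# The mask here is a rough energy-based stand-in for a trained model.

import numpy as np
from scipy.signal import stft, istft

def enhance_speech(audio, sample_rate, noise_reduction_db=12.0):
    # Move to the time-frequency domain.
    freqs, times, spec = stft(audio, fs=sample_rate, nperseg=1024)

    # Stand-in "model": treat bins well above the local noise floor as speech.
    magnitude = np.abs(spec)
    noise_floor = np.median(magnitude, axis=1, keepdims=True)
    speech_mask = (magnitude > 3 * noise_floor).astype(float)

    # Keep speech bins, attenuate the rest by noise_reduction_db.
    attenuation = 10 ** (-noise_reduction_db / 20)
    gain = speech_mask + (1 - speech_mask) * attenuation
    _, enhanced = istft(spec * gain, fs=sample_rate, nperseg=1024)
    return enhanced
```

Swapping that threshold for a neural network trained to recognize speech is what avoids the trade-off Fu describes with frequency tuning: the mask follows the voice itself rather than a fixed band of frequencies.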
He played us a selfie video he recorded while walking. The first version is straight out of the camera, without any audio enhancements. You can hear Fu talking, but the background noise is nearly as loud, making it hard to catch everything he's saying. Then he played the same clip with Google's speech enhancement applied. The background noise is reduced, and Fu's voice is clear and present.
Wrap up
I thought I knew a lot about smartphone cameras. But spending a few hours in Google’s Real World Testing Lab showed me how much really goes into fine-tuning the Pixels’ cameras. The lab wasn’t at all what I’d expected, but it made complete sense when I saw how Google used it.
Features like Video Boost deliver eye-opening results and feel like a preview of where Pixel video could be headed. Right now, Video Boost is only available on the Pixel 8 Pro, and it'll be interesting to see how Google handles the feature on future Pixel phones and whether that processing will ever be done on-device.
Hearing how the Pixel camera team approaches video recording on the Pixel was definitely a highlight for me. It shows how difficult it is to balance clinical precision with human subjectivity. And that’s important, because smartphone cameras have become our windows onto the world around us.
“When you’re building a piece of hardware, you have to make sure that hardware works week after week properly, through all the different prototypes and factory versions that you get,” said Reynolds. “You can’t do this once and then hope it works forever.”
CNET’s Lexy Savvides contributed to this report.