CNET recently featured a piece describing the inner workings of Apple’s iPhone 16 camera labs, although calling them “secret” might be an exaggeration since they have been showcased before.
Nonetheless, the article provides an engaging glimpse into Apple’s meticulous testing procedures during the iPhone’s development.
We first got a look at Apple’s anechoic chamber in 2018 when the company invited The Loop for a tour, and there was another reveal just a few months ago when Apple shared images in a press release about AirPods.
CNET explored how Apple evaluates the microphones that capture audio for video recordings, specifically showcasing the testing with the iPhone 16.
In the chamber, there’s an array of approximately twenty speakers arranged in a curved formation, extending from the floor to the ceiling. These speakers emit a series of sounds, allowing engineers to analyze how well the iPhone 16 Pro’s microphones register the audio. The device, mounted on a rotating base, turns slightly, and the sounds are played again. This process continues until the iPhone has made a complete rotation.
The outcome is a comprehensive sound profile for each microphone, derived from the data collected within the anechoic chamber. Apple utilizes these profiles as the groundwork for spatial audio and software enhancements that minimize wind noise or simulate various microphone types, such as lavalier microphones or studio mics for voiceovers.
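The article doesn’t describe Apple’s tooling, but the measurement loop it outlines — rotate the device a few degrees, play the test sounds, record every microphone, repeat for a full turn — can be sketched roughly as follows. Everything here is hypothetical (function names, the step size, and the placeholder response function); real profiles would come from hardware capture, not a formula.

```python
import math

def measure_sweep(num_mics=3, step_deg=10, response_fn=None):
    """Simulate a rotational sweep: at each angle, 'play' a test
    signal and record each microphone's response level."""
    if response_fn is None:
        # Placeholder cardioid-like directional response, offset per mic.
        # In a real chamber this value would be a measured level.
        response_fn = lambda mic, angle: 1 + math.cos(
            math.radians(angle) + mic * 2 * math.pi / num_mics)
    profiles = {mic: {} for mic in range(num_mics)}
    for angle in range(0, 360, step_deg):   # rotate the mount in steps
        for mic in range(num_mics):         # capture every mic at this angle
            profiles[mic][angle] = response_fn(mic, angle)
    return profiles

profiles = measure_sweep()
print(len(profiles), len(profiles[0]))  # 3 mics, 36 angular samples each
```

The per-microphone dictionaries are stand-ins for the “sound profiles” the article mentions — a full 360-degree picture of how each mic hears the room.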
“Our goal is to provide capabilities as though you recorded with a lapel microphone,” explained Ruchir Dave, Apple’s acoustics lead. “We leverage machine learning algorithms and precise tuning techniques to achieve that signature lapel-mic sound without one.”
Interestingly, while the final evaluation is primarily overseen by Dave, the company also gathers feedback from everyday iPhone users.
Rather than relying solely on one expert to fine-tune the audio, Apple involves multiple testers in a perceptual audio assessment. The findings influence how audio playback is calibrated on iPhones. During the visit, I had the chance to be one of these testers and participated in part of this assessment […].
Apple employs comparative testing similar to that used by optometrists who ask patients to choose between different lenses. Having a reference point allows for more accurate evaluation of audio recordings. The outcomes from this perceptual testing play a crucial role in shaping various audio features in the iPhone 16 Pro, including the Audio Mix function.
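At its simplest, that optometrist-style forced-choice test reduces to aggregating many “A or B?” judgments from listeners. A minimal sketch, assuming nothing about Apple’s actual methodology beyond paired comparison (the function name and sample data are invented):

```python
from collections import Counter

def tally_preferences(trials):
    """Aggregate forced-choice A/B trials: each trial is the label
    of the recording the listener preferred ('A' or 'B')."""
    counts = Counter(trials)
    total = len(trials)
    return {label: counts[label] / total for label in ("A", "B")}

# Ten hypothetical listener judgments comparing two audio tunings.
result = tally_preferences(["A", "A", "B", "A", "A",
                            "B", "A", "A", "A", "B"])
print(result)  # {'A': 0.7, 'B': 0.3}
```

The winning tuning becomes the new reference, and the process repeats — which is what makes having a comparison point so useful.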
For video quality assessments, Apple has established a video verification lab. This facility ensures that videos maintain consistent quality across a range of real-world viewing environments.
“This space allows us to optimize the video playback experience, ensuring that whether you’re viewing a video in a dark room, an office, or outdoors, it delivers a similar perceptual experience as when viewed in a cinema,” shared Sean Yang, Apple’s director of video engineering.
While the article might not disclose any groundbreaking secrets, it definitely highlights the extensive care that Apple invests in perfecting its audio and visual technologies.