With iOS 17 now available, Apple has also released a firmware update for the second-generation AirPods Pro that brings some new features, including Adaptive Audio. The feature blends the Transparency and Noise Cancellation modes and balances between them based on the external environment. Interestingly, Apple once considered using the iPhone’s GPS to control Adaptive Audio levels, an approach that ended up not shipping.
How Adaptive Audio works in AirPods Pro 2
In an interview with TechCrunch, Apple executives Ron Huang and Eric Treski talked about the new features of the AirPods Pro.
In addition to Adaptive Audio, the new firmware also comes with Personalized Volume, which adjusts media volume based on environmental conditions, and Conversation Awareness, which reduces media volume and enhances voices in front of you when you start talking.
When it comes to Adaptive Audio, it may sound similar to Adaptive Transparency, a feature announced last year for AirPods Pro. With Adaptive Transparency, the earbuds constantly monitor external sound to reduce some annoying noises even when Transparency Mode is on. Adaptive Audio, on the other hand, does much more than that.
Treski explained that the new mode responds more slowly than Adaptive Transparency, but that’s because there’s a “much more methodical process to know what you’re listening to” and intelligently adjust the spectrum between Transparency and Noise Cancellation.
The system identifies whether you’re listening to a song or a podcast, while microphones inside the AirPods measure the sound level inside your ears to gauge the volume the user is actually experiencing. During the development of this feature, though, Apple tried a completely different approach.
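To make that blending idea concrete, here is a minimal Swift sketch of how a controller along these lines could behave. Apple has not published its actual algorithm, so every type, threshold, and range below is an illustrative assumption rather than a real API.

```swift
import Foundation

/// Hypothetical sketch of an Adaptive Audio-style controller. This is
/// not Apple's implementation; it only illustrates blending between
/// Transparency and Noise Cancellation from measured sound levels.
enum ContentType {
    case music, podcast, none
}

struct AdaptiveAudioController {
    /// Fraction of Noise Cancellation to apply: 0.0 = full Transparency,
    /// 1.0 = full ANC. (Assumed representation.)
    private(set) var ancBlend: Double = 0.0

    /// Small smoothing factor so the blend moves slowly, matching the
    /// "more methodical" behavior described in the interview.
    let smoothing = 0.05

    mutating func update(ambientDecibels: Double,
                         inEarDecibels: Double,
                         content: ContentType) {
        // Louder surroundings push toward ANC; quiet ones toward
        // Transparency. Mapping 40-80 dB onto 0-1 is an arbitrary
        // illustrative range.
        var target = min(max((ambientDecibels - 40) / 40, 0), 1)

        // Spoken content competes with ambient noise, so lean slightly
        // more toward ANC for podcasts. (Assumed heuristic.)
        if content == .podcast { target = min(target + 0.1, 1) }

        // If the in-ear microphones report quiet listening, favor
        // Transparency so the outside world stays audible.
        if inEarDecibels < 50 { target = max(target - 0.2, 0) }

        // Move gradually toward the target rather than snapping.
        ancBlend += smoothing * (target - ancBlend)
    }
}
```

In a loop fed with microphone readings, `update` would be called periodically and `ancBlend` passed to the audio pipeline; the gradual interpolation is what keeps the mode from flip-flopping on brief noises.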
The feature almost relied on GPS
Instead of relying on microphones and other sensors, Apple considered using the iPhone’s GPS to determine when the user was in a noisy location and adjust the Adaptive Audio levels accordingly. For example, the AirPods might have entered Transparency Mode automatically whenever the user was walking down the street.
“During early exploration for Adaptive Audio, we basically put you in ANC versus transparency, based on where you are,” Huang explained. “You can imagine the phone can give a hint to the AirPods and say, ‘hey, you’re in the house’ and so forth.”
“After all our learnings, we don’t think that is the right way to do it, and that is not what we did. Of course, your house is not always quiet and the streets are not always loud. We decided that, instead of relying on a location hint from the phone, the AirPods monitor your environment in real time and make those decisions intelligently on their own,” he added.
More tidbits on AirPods
For Personalized Volume, Apple says it has analyzed “hours of different data” on how users listen to different content in different environments to understand their preferences. AirPods also remember user preferences based on where the user is and how much noise is there, and all of this happens on the device.
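As a rough illustration of how such on-device preference memory could work, here is a hypothetical Swift sketch that keys a learned volume to a coarse place identifier plus a noise band. The storage scheme and bucketing are assumptions for illustration, not Apple’s design.

```swift
import Foundation

/// Hypothetical on-device store for Personalized Volume-style
/// preferences. Keys a learned volume on a place identifier plus a
/// noise band; the bucketing scheme is an illustrative assumption.
struct VolumePreferenceStore {
    private var preferred: [String: Float] = [:]  // key -> volume (0...1)

    private func key(place: String, ambientDecibels: Double) -> String {
        // Bucket noise into 10 dB bands so similar environments share a key.
        let band = Int(ambientDecibels / 10) * 10
        return "\(place)#\(band)dB"
    }

    /// Record the volume the user settled on in this context, blending
    /// it with earlier observations so preferences adapt gradually.
    mutating func record(volume: Float, place: String, ambientDecibels: Double) {
        let k = key(place: place, ambientDecibels: ambientDecibels)
        if let old = preferred[k] {
            preferred[k] = old * 0.8 + volume * 0.2
        } else {
            preferred[k] = volume
        }
    }

    /// Suggest a starting volume for the current context, if one is known.
    func suggestion(place: String, ambientDecibels: Double) -> Float? {
        preferred[key(place: place, ambientDecibels: ambientDecibels)]
    }
}
```

Because the dictionary lives entirely in the struct, nothing leaves the device, which is consistent with Apple’s statement that the processing happens locally.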
As for Conversation Awareness, the feature does not simply wait until it detects a predominant voice; it also uses accelerometers to detect jaw movement and confirm that it is the wearer who is talking, not someone else nearby.
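That two-signal gating can be sketched in a few lines of Swift. Again, this is a hypothetical illustration; the sensor inputs and the ducking values are assumptions, not Apple’s implementation.

```swift
import Foundation

/// Hypothetical sketch of Conversation Awareness-style gating: duck
/// media only when a predominant voice is heard AND the wearer's own
/// jaw is moving, so a nearby talker alone does not trigger it.
struct ConversationDetector {
    /// True if the microphones report a dominant nearby voice.
    var predominantVoiceDetected = false
    /// True if accelerometer data shows jaw movement consistent with speech.
    var jawMovementDetected = false

    /// Both signals must agree before media is ducked.
    var wearerIsTalking: Bool {
        predominantVoiceDetected && jawMovementDetected
    }

    /// Illustrative response: lower media volume and boost voices ahead.
    func apply(to playback: inout (mediaVolume: Float, voiceBoost: Float)) {
        if wearerIsTalking {
            playback.mediaVolume = min(playback.mediaVolume, 0.2)
            playback.voiceBoost = 1.0
        }
    }
}
```

Requiring both signals is what keeps a loud stranger from ducking your audio: their voice trips the microphone check, but your jaw stays still.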
The full interview contains some other details, such as Apple executives confirming that the revised second-generation AirPods Pro with USB-C use a new 5GHz wireless protocol to transmit Lossless Audio when paired with an Apple Vision Pro.