At its WWDC 2023 keynote, Apple revealed that it is introducing Adaptive Audio to AirPods. The new feature blends transparency mode with Active Noise Cancellation to match the conditions of your natural surroundings.
Using machine learning, Adaptive Audio will create a more customised and dynamic listening experience so that you don’t have to play around with audio settings on your iPhone while you’re on the move.
Now when your AirPods detect that you're speaking to someone while there’s a lot of background noise from other people talking, a new feature called Conversation Awareness will automatically lower your audio's volume and reduce the background noise so that you can focus on the person in front of you.
Another feature called Personalised Volume was also introduced, which aims to make the ambient listening experience – when you’re walking down a street or at work and still want to be aware of your surroundings – a lot more fine-tuned and adaptive (I can see where they got the name).
Conversation Awareness and Adaptive Audio identify background noises, such as faraway conversations or a marching band, and Personalised Volume turns these down – together, all three features work to create a more harmonised listening experience.
It's not adjustable, but adaptive
Active noise cancellation is currently available only on the AirPods Pro 2 and AirPods Max, and until now there have been just three modes to choose from – ANC, transparency mode and off.
By introducing Adaptive Audio, Apple has added a fourth mode that could offer the middle-ground noise cancelling we've been looking for. We previously wrote about six features we hoped Apple would add in iOS 17 at WWDC, and an adjustable transparency mode was a big one for us.
It comes down to the fact that transparency mode is all or nothing: either turned up high or switched off entirely, with no levels in between. Adaptive Audio looks to offer the solution by automatically adjusting the level of transparency to match your environment. It looks like Apple really listened.