AirPods Reportedly Getting Live Translation Gesture
Marcus Mendes reports for 9to5Mac this week that a bit of new UI spotted in iOS 26 Beta 6, released to developers on Monday, suggests Apple is planning to enable real-time translation of live conversations on AirPods. The finding comes after the company announced live translation for FaceTime calls and more at WWDC in June.
“In today’s iOS 26 developer beta 6, we spotted a new system asset that appears to depict a gesture triggered by pressing both AirPods stems at once,” Mendes wrote of the new finding. “The image displays text in English, Portuguese, French, and German, and it is associated with the Translate app. For now, we can confirm it’s associated specifically with the AirPods Pro (2nd generation) and AirPods (4th generation).”
Mendes (rightly) notes translation is “right up the wearable wheelhouse” for products like AirPods and Meta’s Ray-Bans. Indeed, from an accessibility standpoint, using earbuds (or glasses) for translation can be not only more discreet in appearance, but also more accessible in that there’s no need to look at, say, Apple’s built-in Translate app while holding one’s phone. Such a dance may be hard, if not outright impossible, for those with suboptimal hand-eye coordination. Likewise, it’s highly plausible spoken translations are more intelligible for people who are auditory learners or perhaps are neurodivergent. Whatever the case, Mendes is, again, exactly right to posit that translation is a perfect use case for wearables. Moreover, Mendes is also reasonable in his speculation that this feature may have been kept under wraps because Apple plans to make it part of the iPhone 17 software story.
On a related topic, the report that AirPods are purported to gain a new gesture serves as a good reminder to give a brief shoutout to another AirPods gesture: the head gestures for accepting or declining calls. Much to my chagrin, I get a ton of spam calls every day, which I normally ignore and let go to voicemail. When I’m wearing my AirPods, however, the aforementioned head gestures act as a de-facto accessibility feature; instead of reaching for my phone to tap a button, I can merely shake my head to send those spam calls away. To use natural, nigh universally understood methods of nonverbal communication in this manner is genius—and it’s accessible too. Rather than search the abyss of my pocket(s) to hurriedly find my phone and take action on an incoming call, I can easily nod or shake my head as necessary. It’s undoubtedly convenient, as well as technically cool, but it’s also accessibility. Using head gestures to decide on phone calls alleviates a helluva lot of friction associated with reaching for my phone.
Yet one more reason to choose AirPods over something like my Beats Studio Buds+.