Thoughts On AirPods, Cameras, and Accessibility
Ryan Christoffel reports for 9to5Mac this week that a new feature in iOS 26 is a harbinger of camera-equipped AirPods Pro. Among the enhancements coming to AirPods this year is a Camera Remote feature. The functionality is similar to that on Apple Watch, which lets users treat the Watch as a remote shutter button; as Christoffel writes, users can “use AirPods to capture photos and video on your iPhone in situations that’s helpful… either by pressing once on the AirPods stem or pressing and holding—your choice.”
Christoffel goes on to say the Camera Remote feature on AirPods is notable because the next-generation AirPods Pro are, once again, rumored to include cameras. A release timeframe is unknown outside of Apple Park, but Bloomberg’s Mark Gurman has reported the refreshed earbuds will feature “external cameras and artificial intelligence to understand the outside world and provide information to the user.” That iOS 26 includes the Camera Remote feature is evidence Apple is getting its proverbial ducks in a row, ensuring its software can support the forthcoming hardware whenever it arrives.
There’s much to extrapolate from Christoffel’s informed speculation, not the least of which is how intrepid observers (and spelunkers) often notice Apple setting the stage in advance of grand reveals. The size class APIs introduced at WWDC 2014 presaged the iPhone 6 and 6 Plus announcement a few months later. HealthKit, also new in 2014, was destined for Apple Watch. At WWDC 2015, iPad multitasking by way of Slide Over and Split View was announced prior to the iPad Pro’s unveiling. And in more recent times, the advent of ARKit in 2017 provided clues that Apple was working on something involving augmented reality; that hunch proved prescient when the company eventually revealed Vision Pro at WWDC 2023. It could even be persuasively argued that the all-new Liquid Glass design language, ostensibly meant for today’s devices, was created with an eye towards the still-in-development AR-powered glasses of tomorrow.
In an accessibility context, the Camera Remote feature on AirPods strikes me as fascinating. To wit, it’s entirely plausible that someone who wears AirPods also doesn’t have use of their arms or hands, whether wholly or in part. While it’s possible today for that person to control playback and the like with their voice, having Camera Remote on AirPods would give them another avenue through which to take pictures accessibly. Likewise, that Apple, per Gurman, reportedly plans to imbue these future AirPods Pro with AI chops such that people could better “understand the outside world” means, for example, that navigation could become easier, as could people recognition. Imagine Siri telling a Blind or low vision person, Her-style, that the person approaching them to say hello is their brother or sister or someone else in their digital Rolodex. There are other possible applications, of course, but these examples are intriguing because they could make the world literally more accessible to disabled people. It’s cool tech, but also genuinely useful and empowering to boot.
Such a sentiment is a common refrain with AirPods Pro lately.