Recent Meta Ray-Bans Updates Make Smart Glasses Even More Alluring For Accessibility
Meta late last week announced several updates to its Ray-Ban smart glasses that the Menlo Park-based tech titan says give the eyewear “new styles and AI updates.” The spring-focused updates are intended to “supercharge the season,” according to Meta.
While everyone cares about fashion and style, aesthetics aren’t what piqued my interest here. What caught my eye—no pun intended—was the software side of the upgrade story. The headlining feature is that Meta’s Live Translation functionality, previously available only in “early access,” is rolling out to everyone. Live Translation is available in English, French, Italian, and Spanish, with Meta noting “when you’re speaking to someone in one of those languages, you’ll hear what they say in your preferred language through the glasses in real time, and they can view a translated transcript of the conversation on your phone.” Live Translation works even in airplane mode.
Elsewhere, Meta reports Ray-Ban wearers soon will be able to send and receive DMs and more from Instagram, which complements existing capabilities through Messenger and WhatsApp. Regarding music, Meta says the Ray-Ban glasses now support expanded access to services like Apple Music and Spotify, among others. Users can ask Meta AI to play a specific song or playlist right from their glasses, with the caveat it works “as long as your default language is set to English,” according to Meta. What’s more, Meta says users in the United States and Canada will gain the ability to converse with Meta AI through their glasses; the company’s smart assistant has the ability to “see what you see continuously and converse with you more naturally.” Lastly, Meta says beginning this week, users will be able to talk with Meta AI about what they’re seeing, replete with real-time responses. The feature is spiritually similar to Google Lens.
Meta sent me a pair of the original Ray-Bans a couple of years ago, which I wrote about alongside Amazon’s Echo Frames. It’s admittedly been some time since I wore either pair with regularity—especially since, in the case of the Echo Frames, I don’t wade knee-deep in the Alexa ecosystem. The Ray-Bans are more agnostic in their allegiances. Nonetheless, I treated both devices much like the inexpensive (and dumb) drugstore sunglasses I’ve bought for years: I wear them to keep the sun out of my eyes.
As someone who is deep in the Apple ecosystem, I’m heartened by this weekend’s news that Apple’s Ray-Bans competitor reportedly is “getting closer to becoming a reality.” According to Apple scoopster extraordinaire Mark Gurman of Bloomberg, the company’s glasses aren’t “close to being ready yet,” but he writes the idea behind them is to “turn the glasses into an Apple Intelligence device [by analyzing] the surrounding environment and feed information to the wearer, though it will stop well short of true augmented reality.” Apple’s smart glasses would be right up my alley as someone who’s all-in on its ecosystem and, even more crucially, its support for accessibility.
Apple’s aspirations mirror what Meta does with its Ray-Bans, and they’re fascinating to ponder from an accessibility standpoint. Take the Live Translation functionality, for example. It can be way more accessible to hear translations spoken aloud, hands-free; this could be important for people with limited motor skills who might have trouble holding their iPhone to, say, use Apple’s Translate app to make the conversion. Likewise, smart glasses can make navigation much more accessible, largely due to their hands-free nature. You needn’t look down at your phone as you hold it to know where you’re going. There are other applications, but suffice it to say, smart glasses flash a ton of potential as an assistive technology.