Google Translate Gets Live Translation Enhancements in Latest Update

Abner Li reports for 9to5Google today that Google Translate has been updated such that live translation leverages Gemini, including while using headphones. The feature is available in the iOS and Android apps, as well as on the Google Translate website and in Google Search. The Gemini-powered enhancements are launching first in the United States and India, with the ability to translate from English into over 20 languages such as Chinese and German.

“Google Translate is now leveraging ‘advanced Gemini capabilities’ to ‘improve translations on phrases with more nuanced meanings,’” Li wrote on Friday. “This includes idioms, local expressions, and slang. For example, translating ‘stealing my thunder’ from English to another language will no longer result in a ‘literal word-for-word translation.’ Instead, you get a ‘more natural, accurate translation.’”

(File this under “I Learn Something Every Day”: Google Translate has a web presence.)

As for the real-time translation component, Li says the feature is underpinned by Gemini 2.5 Flash Native Audio and works by pointing one’s phone in the direction of the speaker. He also notes that Google says Translate will “preserve the tone, emphasis and cadence of each speaker to create more natural translations and make it easier to follow along with who said what.” Importantly, Li writes that the live translation function is launching in beta on Android for now; it’s available in the United States, India, and Mexico in more than 70 languages, with Google further noting the software works with “any pair of headphones.” iOS support and more localization are planned for next year.

“Use cases include conversing in a different language, listening to a speech or lecture when abroad, or watching a TV show/movie in another language,” Li said in describing live translation’s elevator pitch. “In the Google Translate app, make sure headphones are paired and then tap ‘Live translate’ at the bottom. You can specify a language or set the app to ‘Detect’ and then ‘Start.’ The fullscreen interface offers a transcription.”

It doesn’t take an astrophysicist to surmise that this makes communication more accessible.

At Thanksgiving dinner a couple of weeks ago, one of my family members regaled everyone with stories about his recent trip to Paris. He of course knows I’m a technology reporter, and he excitedly told me he’d bought a pair of AirPods Pro 3 at the Apple Store before his trip so he could try Apple’s own Live Translation feature, powered by Apple Intelligence. He told me it worked “wonderfully,” letting him hear the French around him translated into English and piped into his earbuds. It seems to me Google’s spin on live translation works similarly, with the unique part (aside from Gemini) being that it isn’t limited to Pixel Buds. At any rate, language translation is a genuinely good use case for AI, and, more pointedly, a good example of accessibility truly being for everyone, regardless of ability, because it breaks through communication barriers.

Apple announced Live Translation on AirPods at its fall event in September.
