Putting Apple’s Live Translation to the Test
Published on
November 20, 2025
CEO & Co-Founder, Jaide Health
As the CEO of an AI Language Company, I get a lot of questions about Apple's new Live Translation functionality. It's a huge moment for our industry when a tech giant does something new in this space, so I knew I had to test it out myself.

High level: I really wanted this to work seamlessly, like Star Trek’s Universal Translator or the Babel fish from The Hitchhiker’s Guide to the Galaxy, and it’s clear that Apple wants that too. However, the experience just isn’t there yet – I still very much prefer turn-based translation over “live translation.” This is the same reaction I had when Google launched translation inside of Meet. In both cases, it is just too easy to get into a situation where neither party knows what to do to make the conversation flow.

Here is the deep dive into my experience:

📱 What You Need to Get Started

Live Translation is a new feature (still in Beta) enabled by Apple Intelligence, meaning it requires specific, newer hardware and a bit of setup. It cost me $1000 to try this out as I had to buy a brand new iPhone to use it (fortunately, I needed an upgrade anyway).

Here is the essential checklist for using Live Translation:

| Component | Requirement | Note |
| --- | --- | --- |
| iPhone or iPad | iPhone 15 Pro or later, iPad mini (A17 Pro), or iPad (M1 or later) | Must be running iOS 26 or later with Apple Intelligence enabled. |
| Apple Intelligence | Must be enabled on your device | Can be turned on under Settings > Apple Intelligence & Siri. |
| AirPods | AirPods 4 (with ANC), AirPods Pro 2, or AirPods Pro 3 | Must have the latest firmware installed. AirPods are not required to use Live Translation in Messages, Phone, or FaceTime. |
| Languages | Local download required | You must download the language model for both the language you are speaking and the language you want translated. Apple currently supports: English (US, UK), French, German, Portuguese (Brazil), and Spanish (Spain). |
| Connectivity | Local processing is key | Translation uses a local model once downloaded, so it works even with poor internet; however, setup requires a good network connection to download the language models, and the Phone, Messages, and FaceTime apps need connectivity to function. |

How it Works and Initial Impressions

For those of you who haven’t tried it, Live Translation provides near-simultaneous translation of speech between two languages. The primary focus of this post is the Live Translation with AirPods use case.

Notably, the user experience is different for each of the participants:

  • For the AirPods wearer: When the other person starts speaking, the AirPods use active noise cancellation to muffle the other speaker’s voice and then speak the translation over the muffled audio.
  • For the non-AirPods wearer: They can read the generated translation on the AirPods wearer’s iPhone, or press the play button next to the speech bubbles to have it read aloud.

In all scenarios, a dynamic transcription and translation is displayed on the screen of the iPhone.

The Simultaneous Translation Challenge

I found the simultaneous translation experience to be particularly disconcerting because even though the speaker’s voice is muffled, you can still hear both languages being spoken simultaneously. This is especially distracting if you have any facility with the other language.

The experience is best suited to 1:1 conversations in a quiet environment. With multiple participants (or just loud people in the background), it can quickly become very confusing to follow. There is a definite delay – sometimes as much as 5-10 seconds – between when someone speaks and when you hear the translation, which makes it very easy to lose track of who said what you are hearing translated.

Live Translation is also available via the Phone app; however, in addition to the issues above, it is an especially confusing experience for the person on the other end of the call if they are not expecting a live-translated call. I feel like there should be some sort of announcement, delivered in the other person’s language when translation is enabled, letting them know what’s going on.

Live Translation in FaceTime is a different experience, with translations appearing as subtitles on the display. There was no spoken translation, even with my AirPods in. Of note, the translation experience is one-sided: the other person does not see translations on their end unless they also have a Live Translation-enabled device and have turned the feature on.

Conclusion

I am excited by all of the innovation that is happening with translation AI as well as the accelerating trajectory of its integration into common communication tools. I love the idea of using AirPods to provide “in ear” translation. However, I think it will still be some time before Live Translation works seamlessly enough and with high enough quality for all parties involved for it to supplant turn-based translation.

About Jaide Health

Jaide Health is a technology company developing AI-powered healthcare-specific, foreign language interpretation and translation services. Leveraging recent advances in large language models, Jaide Health enables healthcare organizations to improve the overall experience for patients with Limited English Proficiency. Led by “techy clinicians” and security/privacy experts, Jaide Health is venture backed by Inovia Capital, Flare Capital, and Innovation Global. Follow us on LinkedIn and visit www.jaide-health.com to learn more.

