Apple is working on a feature that would let users trigger live language translation by squeezing both stems of their AirPods at the same time. The feature, currently in development for iOS 26, was spotted in a system asset image published by 9to5Mac.
According to reports, the translation feature will be available on the AirPods Pro 2 and fourth-generation AirPods, though it’s unclear whether it will be tied to new hardware such as the upcoming iPhone 17 or available across existing Apple devices.
If launched, the feature could become a key selling point for iPhones, though it may also invite criticism that Apple is playing catch-up: Samsung phones have offered live translation since last year, and Meta’s smart glasses include similar capabilities. The development underscores the growing competition among tech giants in AI-powered translation.
Source: https://www.cnet.com/tech/mobile/are-gesture-enabled-airpod-live-translations-incoming-ios-26-beta-suggests-yes