Meta is upgrading its Ray-Ban smart glasses, bringing advanced artificial intelligence features to US and Canadian users this summer. The revamped glasses will support a new “Hey Meta, start live AI” command, letting users ask conversational questions about objects they see through their lenses.
When connected to the Meta View app on a smartphone, the glasses can give Meta AI a live view of what the wearer sees, enabling more interactive conversations. The feature is similar to Google’s Gemini demo: users can ask about specific objects, or have the AI suggest alternatives for everyday items, such as a substitute for butter in a pantry.
Other key features include automatic translation via the “Hey Meta, start live translation” command, covering languages including English, French, Italian, and Spanish in real time. The glasses will play the translation through their speakers, while a connected smartphone displays the transcript so users can read the translation as well.
However, the glasses also raise concerns about data collection and privacy. Illumex CEO Inna Tokarev Sela worries that the recording indicator light could make some people uneasy, particularly those concerned about being filmed by strangers or about Meta’s visual data collection practices. To address this, the new models will let users control the notification light, and Meta is expected to keep user data private unless users explicitly consent to sharing it.
Additional updates include the ability to post and share to Instagram and send messages via voice commands, along with compatibility with music streaming services such as Apple Music, Amazon Music, and Spotify. The new features are set to roll out this spring and summer, with object recognition updates arriving for EU users in late April and early May.
Source: https://www.cnet.com/tech/mobile/with-hey-meta-ray-ban-wearers-will-unlock-all-new-ai-abilities-and-privacy-concerns