While Google has begun to revive its Google Glass concept, Meta is already a step ahead, bringing new artificial intelligence features to its glasses this summer. The Ray-Ban smart glasses made in partnership with Meta are receiving several powerful AI updates for users in the US and Canada.
Using the Meta View app on a connected smartphone, Ray-Ban smart glasses users will be able to say "Hey Meta, start Live AI" to give Meta AI a live view of what they are seeing through their glasses.
Similar to Google's Gemini demo, users will be able to ask Meta AI conversational questions about what it sees and how it can help solve a problem. Meta gave the example of the AI suggesting potential substitutes for butter based on what it sees when you look into your pantry.
Even without Live AI, you will be able to ask specific questions about the objects you are looking at.
Alongside the live features, the Ray-Ban smart glasses' "Hey Meta, start Live Translation" command will automatically translate between languages including English, French, Italian, and Spanish. The glasses' speakers will translate what other people say as they speak, and you can hold up your phone so the other party can see a transcript as well.
In addition to these AI upgrades, the smart glasses will be able to post to Instagram or send a message on Messenger with the right voice command. New compatibility with music streaming services will let you play music from Apple Music, Amazon Music, and Spotify through your glasses instead of earbuds.
Meta has said that these new features will roll out this spring and summer, with an update for EU users arriving next week.
Meta and Ray-Ban did not immediately respond to a request for further comment.
