Meta’s Smart Glasses Are Becoming Artificially Intelligent. We Took Them for a Spin.
In a sign that the tech industry keeps getting weirder, Meta soon plans to release a big update that transforms the Ray-Ban Meta, its camera glasses that shoot videos, into a gadget previously seen only in sci-fi movies.
Next month, the glasses will be able to use new artificial intelligence software to see the real world and describe what you’re looking at, similar to the A.I. assistant in the movie “Her.”
The glasses, which come with a choice of frames starting at $300 and lenses starting at $17, have mostly been used for shooting photos and videos and listening to music. But with the new A.I. software, they can be used to scan famous landmarks, translate languages and identify animal breeds and exotic fruits, among other tasks.
To use the A.I. software, wearers just say, “Hey, Meta,” followed by a prompt, such as “Look and tell me what kind of dog this is.” The A.I. then responds in a computer-generated voice that plays through the glasses’ tiny speakers.
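For readers curious about what that interaction involves under the hood, here is a minimal, hypothetical sketch of a wake-word-to-spoken-answer loop. Meta has not published its pipeline; every function below is an illustrative stub standing in for a stage such a system would plausibly need: wake-word detection, speech-to-text, a vision-language model and text-to-speech.

```python
# Hypothetical sketch of a "Hey, Meta"-style assistant loop. These are
# illustrative stubs, not Meta's actual software.

def heard_wake_word(utterance: str) -> bool:
    # Stub: a real system runs a small always-on keyword-spotting model.
    return utterance.lower().startswith("hey, meta")

def transcribe(utterance: str) -> str:
    # Stub: a real system converts the spoken prompt to text.
    return utterance

def capture_frame() -> bytes:
    # Stub: a real system grabs a still image from the glasses' camera.
    return b"<jpeg bytes>"

def vision_language_model(prompt: str, image: bytes) -> str:
    # Stub: a real system sends the prompt (and image, if any) to a
    # multimodal model and returns its text answer.
    return "That looks like a golden retriever."

def speak(text: str) -> None:
    # Stub: a real system plays synthesized speech through the speakers.
    print(f"[glasses speaker] {text}")

def handle_utterance(utterance: str) -> None:
    """Run one wake-word -> prompt -> spoken-answer cycle."""
    if not heard_wake_word(utterance):
        return
    prompt = transcribe(utterance)
    # Prompts like "Look and tell me..." would trigger the camera.
    image = capture_frame() if "look" in prompt.lower() else b""
    speak(vision_language_model(prompt, image))

handle_utterance("Hey, Meta, look and tell me what kind of dog this is.")
```

The point of the sketch is only the shape of the exchange the column describes: nothing happens until the wake phrase is heard, the camera is consulted only when the prompt asks the glasses to look, and the answer comes back as audio rather than on a screen.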
The concept of the A.I. software is so novel and quirky that when we — Brian X. Chen, a tech columnist who reviewed the Ray-Bans last year, and Mike Isaac, who covers Meta and wears the smart glasses to produce a cooking show — heard about it, we were dying to try it. Meta gave us early access to the update, and we took the technology for a spin over the last few weeks.
The artificial intelligence technology in Meta’s new Ray-Ban smart glasses uses cameras and image recognition to give the wearer information about what he or she is looking at. Credit: Aaron Wojack for The New York Times