Ray-Ban Meta smart glasses can now describe landmarks for you

Ray-Ban’s Meta smart glasses are getting smarter.

The camera-equipped spectacles, whose multimodal AI rolled out in beta in December, can now use those AI smarts to retrieve information about popular landmarks.

Meta CTO Andrew Bosworth, who announced the news on Threads, shared some examples of how this works in practice. For example, asking the glasses (through the built-in mics) for “a cool fact” about the Golden Gate Bridge, while looking at said bridge, nets a result in which the glasses tell you (through the built-in speakers) about the bridge’s famous International Orange color.

In another example, the Meta glasses share some info about San Francisco’s Coit Tower.

All of this is currently available to beta testers only; those without access can join the waitlist on Meta’s website.

Bosworth shared a few other tidbits about the Meta glasses’ smart features. The hands-free experience has been updated to let users share their latest Meta AI interaction on WhatsApp or Messenger, or send it as a text message. You can also share the last photo you took with a contact. Finally, podcast listeners will “soon” be able to configure Meta AI readouts to be slower or faster; the option will live under voice settings.

The Ray-Ban Meta smart glasses have gotten a bit brainier this December thanks to Meta’s AI wizardry, elevating them from a gadget mainly used for taking photos and videos to something you might actually reach for when you need an AI voice assistant’s help. The landmark-describing feature, while fairly narrow in scope, is a perfect fit for the glasses, and we hope to see more features like it added in the future.
