Meta’s AI for Ray-Ban Smart Glasses Can Identify Objects and Translate Languages

With the Meta Ray-Ban smart glasses, Meta is now letting users try out its most eye-catching AI features, initially through an early-access test. With today’s announcement, Meta will begin rolling out its multimodal AI features, which let Meta’s AI assistant respond to what it can see and hear through the glasses’ cameras and microphones.

In an Instagram video, Mark Zuckerberg showed off the change by asking the glasses to suggest trousers that would go with a shirt he was holding.

In response, the assistant described the shirt and offered a few outfit suggestions that might pair well with it. He also asked the AI assistant in the glasses to translate text and to caption a few images.

Multimodal AI

Zuckerberg first revealed the multimodal AI features of the Ray-Ban glasses in a September Decoder interview with Alex Heath of The Verge. According to Zuckerberg, users would talk to the Meta AI assistant throughout the day, asking it questions about where they are or what they are currently looking at.

Image Source: isp.page

Additionally, in a video from Chief Technology Officer Andrew Bosworth, the AI assistant correctly identified a lit-up piece of wall art shaped like the state of California. He also ran through some of the other features, including asking the assistant to translate or summarise text you’ve captured, or to help caption photos. These are all fairly standard AI features already found in other devices from Google and Microsoft.

According to Bosworth, the test will only be available to a small number of people in the United States who opt in. Meta has shared instructions for opting in.

In September, Meta CTO Andrew Bosworth told The Verge that although the glasses currently require a voice command to wake up and “see” in order to preserve battery life, that will change: “Eventually, they’ll have sensors that are low power enough that they’re able to detect an event that triggers an awareness that triggers the AI. That’s really the dream we’re working towards.”

Source: scmp.com
