The Ray-Ban Meta smart glasses are getting AI-powered visual search features

Engadget is a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics

The Ray-Ban Meta smart glasses are about to get some powerful upgrades thanks to improvements to the social network’s AI assistant. The company is finally adding support for real-time information to the onboard assistant, and it’s starting to test new “multimodal” capabilities that allow it to answer questions based on your environment.

Up to now, Meta AI had a “knowledge cutoff” of December 2022, so it couldn’t answer questions about current events, or things like game scores, traffic conditions or other queries that would be especially useful while on the go. But that’s now changing, according to Meta CTO Andrew Bosworth, who said that all Meta smart glasses in the United States will now be able to access real-time info. The change is powered “in part” by Bing, he added.

Separately, Meta is starting to test one of the more intriguing capabilities of its assistant, which it’s calling “multimodal AI.” The features, first previewed during Connect, allow Meta AI to answer contextual questions about your surroundings and other queries based on what you’re looking at through the glasses.

The updates could go a long way toward making Meta AI feel less gimmicky and more useful, which was one of my top complaints in my initial review of the otherwise impressive smart glasses. Unfortunately, it will likely still be some time before most people with the smart glasses can access the new multimodal functionality. Bosworth said that the early access beta version will only be available in the US to a “small number of people who opt in” initially, with expanded access presumably coming sometime in 2024.

Mark Zuckerberg shared a few videos of the new capabilities that give an idea of what may be possible. Based on the clips, it appears users will be able to engage the feature with commands that begin with “Hey Meta, look and tell me.” Zuckerberg, for example, asks Meta AI to look at a shirt he’s holding and for suggestions on pants that might match. He also shared screenshots showing Meta AI identifying an image of a piece of fruit and translating the text of a meme.

In a video posted on Threads, Bosworth said that users would also be able to ask Meta AI about their immediate surroundings as well as more creative questions like writing captions for photos they just shot.

By Karissa Bell

This article originally appeared on Engadget at https://www.engadget.com/the-ray-ban-meta-smart-glasses-are-getting-ai-powered-visual-search-features-204556255.html?src=rss
