Ray-Ban Meta smart glasses are being enhanced with AI-driven visual search capabilities

The Ray-Ban Meta smart glasses are set for significant upgrades that expand the capabilities of Meta's AI assistant. The update adds real-time information support to the assistant and begins tests of new "multimodal" features, which let the AI answer questions based on the wearer's surroundings.

Previously, Meta AI had a knowledge cutoff of December 2022, which prevented it from providing current-event updates or live data such as sports scores, traffic conditions, or other on-the-go information. That is now changing: Meta CTO Andrew Bosworth has announced that all Meta smart glasses in the U.S. will gain access to real-time information, powered in part by Bing.
The upcoming "multimodal AI" feature, first showcased at Connect, is particularly notable: it allows the AI to answer contextual questions about what the wearer sees through the glasses. The enhancement aims to make Meta AI more practical and less of a novelty, addressing some of the early criticism of the smart glasses. An early-access beta of the multimodal functionality will be available to a select group of users in the U.S. who opt in, with broader availability expected in 2024.

Mark Zuckerberg and Andrew Bosworth have demonstrated the new capabilities in videos and screenshots. For instance, Zuckerberg used the command "Hey Meta, look and tell me" to get outfit suggestions, identify objects such as fruit, and translate the text of memes. Bosworth added that users can ask about their immediate surroundings and generate creative captions for photos they have just taken.

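To make the interaction pattern concrete, here is a minimal sketch of how a "look and tell me" request could flow through a multimodal assistant. Everything below (MultimodalQuery, answer_query, and the stubbed model call) is a hypothetical illustration, not Meta's actual API; the assistant's real internals have not been published.

```python
# Hypothetical sketch of a "look and tell me" request. None of these
# names are Meta's real API; they only illustrate the general pattern:
# capture a camera frame, pair it with the transcribed voice prompt,
# and hand both to a vision-language model.
from dataclasses import dataclass


@dataclass
class MultimodalQuery:
    image_bytes: bytes  # frame captured by the glasses' camera
    prompt: str         # transcribed voice command


def answer_query(query: MultimodalQuery) -> str:
    # A real assistant would forward the image and prompt to a hosted
    # vision-language model; this stub stands in for that call.
    return (f"(model answer to {query.prompt!r}, "
            f"given a {len(query.image_bytes)}-byte image)")


if __name__ == "__main__":
    frame = b"\x89PNG\r\n..."  # placeholder bytes, not a real photo
    query = MultimodalQuery(
        image_bytes=frame,
        prompt="Look and tell me what pants would go with this shirt.",
    )
    print(answer_query(query))
```

The point of the sketch is that the image and the spoken prompt travel together as a single query, which is what distinguishes the multimodal mode from the text-only assistant.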
