Meta is improving the capabilities of the Ray-Ban smart glasses through multimodal AI features. These new AI tools allow the smart glasses to respond to queries about what the wearer sees and hears.
Mark Zuckerberg demonstrated the multimodal AI's capabilities on the Ray-Ban glasses in an Instagram reel: he held a shirt up in front of the smart glasses and asked the AI to suggest a pair of pants that would go well with the shirt's design.
The feature is in early access, but only “a small number of people” in the US can opt in to test the new system (via The Verge). As yet, there is no official public release date for these AI tools.
In essence, this new multimodal AI update brings the Meta Ray-Ban smart glasses closer to the concept of augmented reality.
Thanks to these AI-driven object recognition capabilities, users can query the virtual assistant about whatever they're looking at by speaking the keyphrase “Hey Meta.” Users can also ask the assistant for summaries and translations.
While Meta is taking a more subtle approach to augmented reality with the Ray-Ban smart glasses, Samsung and Apple are developing full-fledged XR headsets similar to the Meta Quest mixed reality lineup. Both tech giants are expected to release their XR headsets in 2024.
Image credit: Ray-Ban