According to recent reports, Meta is adding new AI features to its Ray-Ban smart glasses. The company is reportedly delivering these features through multimodal AI, which lets the glasses answer questions about what the wearer sees and hears.
Mark Zuckerberg showcased the new multimodal AI capabilities of the Ray-Ban glasses in an Instagram reel. In the video, the CEO asked the assistant to suggest a pair of pants that would match the shirt he was holding in front of the glasses. According to The Verge, the feature is currently available to a small group of users in the US, who can opt in to test it.
There are no details yet on an official launch of these AI tools. In essence, this multimodal update moves the Meta Ray-Ban smart glasses closer to the idea of augmented reality. Using the wake phrase "Hey Meta," users can ask the virtual assistant questions about whatever they are looking at, thanks to its AI-driven object recognition. Users can also request translations and summaries.
While Apple and Samsung are building full-fledged XR headsets akin to Meta's Quest mixed-reality lineup, Meta is taking a subtler approach to augmented reality with its Ray-Ban smart glasses. Both tech giants are expected to unveil their XR headsets in 2024.