Meta's AI Features for Ray-Ban Smart Glasses
Identification and Translation:
Meta's AI for Ray-Ban smart glasses can identify objects in the wearer's view and translate languages.
Virtual Assistant Activation:
Saying "Hey Meta" while wearing the glasses summons a virtual assistant that can perceive the user's surroundings through the glasses' camera and microphones.
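To make the activation flow concrete, here is a minimal sketch of a wake-word loop. Meta has not published how its implementation works, so the function names (transcribe_chunk, activate_assistant) and the loop structure are purely illustrative assumptions.

```python
# Hypothetical sketch of wake-word activation; not Meta's actual API.
WAKE_WORD = "hey meta"

def transcribe_chunk(audio_chunk: bytes) -> str:
    """Stand-in for an on-device speech-to-text step (assumed)."""
    return audio_chunk.decode("utf-8", errors="ignore").lower()

def activate_assistant() -> None:
    """Stand-in for handing control to the multimodal assistant (assumed)."""
    print("Assistant activated: listening for a command...")

def listen_loop(audio_stream) -> None:
    # Continuously transcribe short audio chunks and watch for the wake word.
    for chunk in audio_stream:
        if WAKE_WORD in transcribe_chunk(chunk):
            activate_assistant()

# Simulated audio stream standing in for live microphone input.
listen_loop([b"background noise", b"hey meta, what am I looking at?"])
```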
Multimodal AI Rollout:
Meta is rolling out its most advanced AI features for the Ray-Ban smart glasses, starting with an early-access test. The multimodal AI uses the glasses' cameras and microphones to answer questions about the user's surroundings.
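A multimodal request of this kind typically pairs a captured camera frame with the transcribed spoken question in a single payload. The sketch below shows one plausible shape for such a request; the payload fields are assumptions, since Meta has not documented its internal format.

```python
# Hypothetical sketch of assembling a multimodal query (image + question).
import base64
import json

def build_multimodal_request(image_bytes: bytes, question: str) -> str:
    """Package a camera frame and a spoken question into one JSON payload.

    The field names here are illustrative assumptions, not Meta's schema.
    """
    return json.dumps({
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "text": question,
    })

# Placeholder bytes stand in for a real JPEG frame from the glasses' camera.
payload = build_multimodal_request(b"<jpeg bytes>", "What pants would go with this shirt?")
print(payload[:80], "...")
```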
Mark Zuckerberg's Demonstration:
Mark Zuckerberg showcased the AI update in an Instagram reel in which the glasses suggested pants to match a shirt he was holding. The demonstration also covered the assistant's translation and image-captioning capabilities.
Interview Insights:
In an interview with The Verge's Alex Heath, Zuckerberg said users will be able to interact with the Meta AI assistant throughout the day, asking questions about what they are seeing or doing.
Additional Features:
The AI assistant can describe photos and help caption them, offer translation, and provide summarization, features commonly found in products from other major tech companies.
Limited Testing Period:
The early-access test of these advanced AI features is initially limited to a small number of users who opt in; instructions for joining are available.