Ray-Ban Meta Smart Glasses, introduced last year, are set to receive a series of updates, Meta CEO Mark Zuckerberg announced. The updates arrive alongside new products such as the Meta Quest 3S and Project Orion, a pair of AR glasses not yet ready for market.
Among the new features highlighted by Zuckerberg are more natural conversations with Meta AI, the assistant built into the Ray-Ban Meta Smart Glasses. The glasses are also gaining live language translation for one-on-one conversations and integration with Be My Eyes, which lets volunteers assist low-vision users by describing what the glasses see.
One notable addition tackles a common headache for drivers: the glasses can now remind you where you parked your car. Zuckerberg described the feature during Meta Connect 2024, stating, “We’re adding the ability for the glasses to help you remember things.” In his example, a user says, “Hey Meta, remember where I parked,” and the AI notes the location. Later, when the user asks where they parked, the AI answers with the details, such as “You parked in space 9702.”
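To make that interaction concrete, here is a minimal, purely illustrative Python sketch of how a "remember and recall" flow could be modeled. The MemoryAssistant class, its methods, and the sample strings are assumptions for illustration only; Meta has not described how Meta AI actually implements this feature.

```python
# Hypothetical sketch of a "remember/recall" voice-memory flow.
# All names and structure are illustrative assumptions, not Meta's implementation.
from datetime import datetime

class MemoryAssistant:
    def __init__(self):
        self._memories = {}  # topic -> (detail, timestamp)

    def remember(self, topic: str, detail: str) -> str:
        """Store a detail under a topic, e.g. 'parking' -> 'space 9702'."""
        self._memories[topic] = (detail, datetime.now())
        return f"Okay, I'll remember that: {detail}."

    def recall(self, topic: str) -> str:
        """Return the stored detail for a topic, if any."""
        if topic not in self._memories:
            return "I don't have anything saved for that."
        detail, when = self._memories[topic]
        return f"You told me at {when:%H:%M}: {detail}."

# Example exchange mirroring the demo described above:
assistant = MemoryAssistant()
print(assistant.remember("parking", "You parked in space 9702"))
print(assistant.recall("parking"))
```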
The parking reminder is one of a broader set of conveniences coming to the Ray-Ban Meta Smart Glasses. Zuckerberg mentioned other use cases such as grocery shopping, where the glasses could remind you to pick up an avocado for a smoothie later in the day.
While a precise release date for these updates was not provided, Zuckerberg indicated that they are expected to be available soon.
For further details on what the Ray-Ban Meta Smart Glasses can do, see the original announcement coverage of the AI-powered spectacles.