The demonstrations highlighted several features that are already operational. The glasses can translate a conversation in real time, identify the contents of a book, or recall recently seen objects through a visual memory powered by the Gemini AI. The wearer can also view notes projected into their field of vision, use apps such as Google Maps or YouTube Music, or run a search with Circle to Search. All processing is handled by the connected phone, which keeps the glasses themselves significantly lighter.
Google has not confirmed whether these glasses will eventually be released under its own brand. The project is nonetheless well underway, with several models already in testing, including prescription glasses and sunglasses; support for corrective lenses is also planned. Meanwhile, Samsung is preparing its own Android-based XR glasses, code-named “Haean.” They are expected to reach the market sooner, with a focus on comfort, compatibility with Android, and a similar set of features. The first versions could be unveiled as early as this year, alongside the Project Moohan headset.