Google will release its first AI glasses in 2026, according to a company blog post.
At Google’s I/O event in May, the company announced partnerships with Gentle Monster and Warby Parker to create consumer wearables based on Android XR, the operating system that powers Samsung’s Galaxy XR headset.
But you can’t wear a bulky headset while out in the real world, which makes smart glasses appealing as a less obtrusive wearable.
“For AI and XR to be truly useful, hardware needs to fit seamlessly into your life and match your personal style,” Google writes. “We want to give you the freedom to choose the right balance of weight, style and functionality for your needs.”
Google is working on several types of AI glasses. One model is designed for assistance without a screen, using built-in speakers, microphones and cameras to let the user talk to Gemini and take photos. The other has an in-lens display, visible only to the wearer, that can show turn-by-turn directions or closed captioning.
Google also shared a preview of Xreal’s wired XR glasses, called Project Aura. This model sits between a bulky headset and a discreet pair of glasses: beyond the in-lens display, Project Aura can serve as an extended workspace or entertainment device, letting the user run Google’s suite of products or stream video much as they would on a more advanced headset.
While Meta has led the way in developing smart glasses, Google now joins Apple and Snap among the companies expected to challenge Meta with their own hardware next year.
Meta’s smart glasses have caught on thanks in part to its partnership with Ray-Ban, whose retail stores sell the products. Google’s partnership with Warby Parker looks set to follow a similar strategy: Google has committed $75 million so far to support the eyewear company’s product development and commercialization costs, and if Warby Parker achieves certain milestones, Google will commit an additional $75 million and receive an equity stake in the brand.
