Snap is back with a new pair of AR smart glasses, and for the first time in years, it's ready to sell them to consumers.
The company plans to sell these new glasses, called Specs, to consumers beginning in 2026, CEO Evan Spiegel announced on Tuesday at the Augmented World Expo in Long Beach, California. A Snap spokesperson tells TechCrunch that the glasses will ship in 2026 as well.
Snap's Specs will include many of the same augmented reality and artificial intelligence capabilities available in the company's developer-facing smart glasses, Spectacles 5.
Specs will have see-through lenses that display graphics to users as if projected onto the world in front of them. The glasses will also feature an AI assistant powered by Snap technology, capable of processing both audio and video.
The Specs announcement comes almost a decade after Snap's first attempt to sell consumers smart glasses with its original Spectacles launch in 2016 — a product that ended up selling poorly. While Snap was ahead of its time then, the company now faces intense competition in the AR glasses market from giants such as Meta and Google, which recently previewed their own AR products.
Meta is reportedly planning to reveal glasses with a built-in display, codenamed "Hypernova," later in 2025. Meanwhile, Google recently announced partnerships with Warby Parker, Samsung, and other companies to develop Android XR glasses.
Snap hopes that the SnapOS developer ecosystem — which the company has built up in recent years — will give it an edge in the AR race. Many of the millions of AR experiences created for Snapchat and Spectacles, called Lenses, will also work on the new Specs, the company said.
On stage, Spiegel demonstrated some of these Lenses. One of them, "Super Travel," translates signs and menus for users in foreign countries. Another app Spiegel presented, "Cookmate," finds recipes based on the ingredients available in a user's kitchen and then provides step-by-step cooking guidance.
Companies have touted these AR use cases for years but have struggled to deliver a pair of smart glasses capable, affordable, and comfortable enough to give everyday consumers a taste of AR. Snap seems to believe it has done exactly that with Specs, but several details remain unclear.
Snap did not reveal on Tuesday how much Specs will cost, how it plans to sell the glasses, or exactly what they will look like.
Snap also announced several developer updates to bolster the SnapOS platform. Developers can now build applications powered by multimodal AI models from OpenAI and Google DeepMind. To enable more AI applications, the company announced a "Depth Module API" that anchors AR graphics generated by large language models in three-dimensional space.
Down the road, Snap says it will work with Niantic Spatial, the company started by the creator of Pokémon Go, to build maps of the world.
Whether all these efforts will translate into a pair of smart glasses that consumers actually want to buy remains to be seen. While Meta has found early success with its Ray-Ban Meta smart glasses, Snap's Specs will likely be significantly more expensive. To get consumers on board, Snap may need to turn AR glasses from a novelty into a practical device.
