For years, weekend bike rides have been a sacred escape for me. Every pedal stroke helps melt away the stress that accumulates during the week, and I've collected some gadgets that make these rides better. However, I've learned the hard way that bringing too many gadgets along removes you from the ride itself, forcing you to manage a network of pings and battery levels instead of just riding the darn bike.
Enter Ray-Ban Meta: smart glasses that have made my weekend rides simpler and a little more fun.
Instead of wearing sunglasses, wearing a pair of headphones, and fumbling with my phone to take photos throughout the ride, I now have one device that helps with all of it.
The Ray-Ban Meta smart glasses have been a surprise hit with more people than just me: Meta says it has sold millions of these devices, and CEO Mark Zuckerberg recently said sales tripled over the last year.
Several Reddit threads and YouTube videos suggest that many people wear Ray-Ban Meta glasses while cycling. Meta has caught on, too, reportedly building a next generation of AI smart glasses with Oakley, designed specifically for athletes.
I never expected to use my Ray-Ban Metas on the bike. But a few months ago, I decided to try them.
Now, I wear these glasses on bike rides more than anywhere else. Meta got enough things right with these smart glasses to convince me there's something here. They're almost a pleasure to use, and with a few upgrades, they could get there.
A key selling point of the Ray-Ban Meta is that it's just a solid pair of Ray-Ban sunglasses; mine is the Wayfarer style with transition lenses and a glossy plastic frame.
I found they work well for bike rides, protecting my eyes from sun, dirt, and pollen. They sit comfortably under a bicycle helmet, though not perfectly. (More on this later.)
The killer feature of Meta's smart glasses is the camera that sits above your eyes. My glasses let me take photos and videos of things I see on my rides by simply pressing a button on the upper-right corner of the frames, rather than fumbling with my phone, something that feels both cumbersome and dangerous on a bike.
While riding through Golden Gate Park in San Francisco last weekend, I used the Ray-Ban Meta glasses to take photos of the beautiful Blue Heron Lake, the shrub-covered dunes where the park meets the Pacific Ocean, and the park's trees.
Is the camera amazing? No. But it's quite good, and I end up capturing moments I simply never would have if I weren't wearing the glasses. For that reason, I don't see the camera as a replacement for my phone's camera, but as a way to take more photos and videos overall.
The feature I use most: the open-ear speakers in the arms of the glasses, which let me hear podcasts and music without blocking out the noise of the people, bikes, and cars around me. Meta is far from the first company to put speakers in glasses; Bose sold a solid pair for years. But Meta's take on open-ear speakers is surprisingly good. I'm impressed by the sound quality and by how little I miss traditional headphones on these rides.
I also found myself chatting with Meta's AI assistant a fair bit on weekend rides. Recently, I asked it questions about the nature I was seeing around the park, such as "Hey Meta, look and tell me what kind of tree this is," as well as about the origins of historic buildings I passed.
I usually use bike rides as a way to disconnect from the world, so talking to an AI chatbot mid-ride felt counterintuitive. However, I found that these short questions fed my curiosity about the world around me without sucking me into a rabbit hole of apps and alerts, which is usually what happens when I pull out my phone.
And, again, the biggest draw of all these features is that everything comes in one device.
That means fewer things to charge, less clutter in my bike bag, and fewer devices to manage along my route.
Potholes
While the Ray-Ban Metas look great on a ride, they were not designed with cycling in mind.
Many times, the Ray-Ban Metas slipped down my nose during an uphill climb. And when I bend over the handlebars and look up to see what's ahead, the thick frames block my view. (Most cycling sunglasses have thin frames and nose pads to solve these problems.)
There are also limits on how the Ray-Ban Metas work with other apps, which is a problem. While I like taking photos and playing music with the glasses, for anything else, my phone has to come out of my pocket.
For example, the Ray-Ban Meta has a Spotify integration, but I had a hard time getting the AI assistant to play specific playlists. Sometimes the glasses played nothing at all when I asked for a playlist, or played a completely different one.
I'd love to see these integrations improved, and expanded to include more cycling-specific integrations with apps such as Strava or Garmin.
The Ray-Ban Meta also doesn't work very well with the rest of my iPhone, which is likely due to Apple's restrictive policies.
I'd love to be able to fire off texts or easily navigate Apple Maps with my Ray-Ban Metas, but features like these may not be available until Apple releases its own smart glasses.
That leaves Meta's AI assistant. The AI feature is often pitched as the main selling point of these glasses, but I frequently found it lacking.
Meta AI's voice is not as impressive as competing AI voice products from OpenAI, Perplexity, and Google. Its voices feel more robotic, and I find its answers less reliable.
I also tried the Ray-Ban Meta's recently launched live AI video sessions, which were first shown off at last year's Meta Connect conference. The feature streams live video and audio from the Ray-Ban Meta to an AI model in the cloud, aiming to create a more seamless way to interact with your AI assistant and let it "see" what you see. In practice, it was a hot mess.
I asked the Ray-Ban Meta to identify some of the interesting cars I was cycling past near my apartment. The glasses described a modern Ford Bronco as a vintage Volkswagen Beetle, even though the two look nothing alike. Later, my glasses confidently told me that a 1980s BMW was a Honda Civic. Closer, but still very different cars.
During the live AI session, I asked the AI to help identify some plants and trees. It told me a eucalyptus tree was an oak. When I said, "No, I think it's a eucalyptus tree," the AI replied, "Oh yes, you're right." Experiences like this make me wonder why I'm talking to the AI at all.
Google DeepMind and OpenAI are also working on multimodal AI sessions like Meta's for their smart glasses efforts. But for now, these experiences feel far from finished.
I'd really like to see an improved version of AI smart glasses that I can take on a bike ride. The Ray-Ban Metas are one of the most convincing AI devices I've seen, and I could see them becoming great companions on a ride after some basic upgrades.
