Last night, I fell asleep under the stars, the chirping of crickets mingling with the whistling of the old radiator in the distance. I had just finished an episode of Justified: City Primeval on the big screen. It was a steady 68 degrees, but I plopped down on the duvet nonetheless. For tonight, I’m thinking about the surface of the moon, or maybe the summit of a Hawaiian volcano.
According to most estimates, the average American spends about seven hours a day in front of screens. The Centers for Disease Control and Prevention recommends something closer to two. But for all the increased focus on sleep hygiene and the harmful effects of staring at screens all day, society seems to be moving quickly in the opposite direction.
When we refer to “screen time,” we’re largely talking about phones, computers, TVs — that sort of thing. Meanwhile, a completely different paradigm has been looming on the horizon for some years now. In the case of the Vision Pro, we’re talking about two screens — one per eye — with a combined 23 million pixels.
These screens are, of course, significantly smaller than the other examples, but they’re right there in front of your eyes, like a $3,500 pair of glasses. This is something I thought about quite a bit in my first 48 hours with the Vision Pro.
In 2018, Apple introduced Screen Time as part of iOS 12. The feature is designed to alert users to their — and their children’s — device usage. The thinking is that when such stark numbers are presented at the end of each week, people will begin to rethink the way they relate to the world around them. Tomorrow, Apple is finally releasing the Vision Pro. The device is another attempt to make people rethink the way they interact with the world, albeit in the opposite direction.
Image Credits: Cory Green/Yahoo
I’ve spent much of the last two years trying to break some of my worst pandemic habits. At the top of the list are all those nights I fell asleep watching some bad horror movie on my iPad. I had gotten better about it. I was reading more and embracing the silence. That is, until this week. The moment the Vision Pro arrived, all of that went out the window.
Now, there is a certain degree to which much of this can be chalked up to my testing process. To review a product, you need to live with it as much as possible. In the case of the Vision Pro, that means living my life through the product as much as possible. I take business calls on it and use it to send emails and Slack messages. I listen to music on it and — as mentioned above — use it to catch up on my stories.
Even my morning meditation practice has migrated into the headset. It’s the classic irony of using technology to address some of the problems it introduced into our lives in the first place.
Although my job requires me to use the Vision Pro as much as possible while I have it, I have to assume that my experience won’t be entirely different from that of most users. After all, you’ll want to get the most out of your $3,500 device while you can, which invariably translates to using it as much as you can.
When I wrote Day One of this diary yesterday, I warned users to ease into the world of the Vision Pro. In a very real way, I should have taken my own advice to heart. At the end of my first 24 hours, the nausea started to hit me hard. Your results will, of course, vary. I am prone to car and sea sickness myself. That patch you see behind my right ear in some of the Vision Pro photos is for the former. (It’s probably a placebo, but sometimes kidding yourself is the best medicine.)
VR sickness and car sickness actually work in similar ways. Both are caused by a mismatch between what your eyes perceive and what your inner ear senses. Essentially, your brain is getting mixed signals that it has trouble reconciling.
In a way, this phenomenon goes to the heart of a fundamental element of mixed reality. Even in the world of passthrough AR, there’s a disconnect between what you see and what your body feels. The Vision Pro’s passthrough is the best I’ve experienced on a consumer device. Cameras record your surroundings and transmit them to your eyes as quickly as possible. Using this technology, the headset can overlay computer graphics onto the real world — a phenomenon Apple refers to as “spatial computing.”


Image Credits: Cory Green/Yahoo
This leads to something important about this brave new world. Extended reality is not reality. It’s the world filtered through a computer screen. Granted, we’re getting into an existential argument pretty quickly here.
This week I remembered what a Samsung executive said when confronted with the fact that the company was “faking” the moon with its premium smartphones: “[T]here is no real image here. Once you have sensors to record something, you reproduce [what you’re seeing], and it means nothing. There is no real picture. You can try to define a real image by saying, ‘I take this photo,’ but if you used artificial intelligence to optimize the zoom, the autofocus, the scene – is it real? Or are they all filters? There is no real picture, period.”
Sorry, but I’d need a lot more space to have that particular discussion. For now, though, the Vision Pro has me questioning how comfortable I am with a future where “screen time” largely involves strapping screens to my face. The result is undeniably exciting, pointing to some incredibly innovative apps in the near future (I’m sure we’ll see a number of them among the initial 600 apps).
Perhaps preparing yourself for the future is a matter of adopting cutting-edge technologies while knowing when it’s time to touch grass. That 2.5-hour battery pack might not be the worst thing after all.