Displays Could Trip Up the Race for the Face

Source: Midjourney, generated by AI (illustrative only; does not depict the products discussed in this post)
  • AI glasses are ready today, giving consumers voice-first assistants, cameras, and spatial intelligence that work seamlessly with their phones.
  • Rushing display-equipped AR glasses to market too soon could cause consumer pushback and stall the entire smartglasses category.

Smartglasses are widely expected to be the next major consumer category in our post-smartphone future. The race for the face is already underway, with nearly every major tech giant either in the market or rumored to join soon.

While this category has a number of different form factors, the holy grail is AR glasses with a display that can replace the phone in your hand. But as major tech companies rush toward that end point, I'd argue that there’s no need to get there too quickly. Our near future should look more like Her than Iron Man.

The AI glasses category is coming into its own. Ray-Ban Meta, Oakley Meta HSTN, Alibaba's Quark AI Glasses, Google's expected Android XR glasses in partnership with Warby Parker and Gentle Monster, and most recently, HTC's VIVE Eagle are all building momentum. These devices are resonating with consumers: EssilorLuxottica's latest earnings show that Ray-Ban Meta sales more than tripled in the first half of 2025.

Some tech giants are already moving toward displays. Google confirmed an optional in-lens display for its upcoming Android XR glasses, and Meta is rumored to be adding one to its next Ray-Ban model. In my opinion, it’s too early for this shift.

I’ve worn Ray-Ban Meta glasses for over a year now and am no stranger to face-worn tech, having been an early Google Glass Explorer. The Ray-Bans have become a regular part of my life, but if I’m honest, that’s first because they’re fashionable sunglasses and second because they offer a convenient open-ear audio experience for music and calls. For me, the draw is the audio. The same seems to be true for many others I’ve spoken to, along with one other top use: the hands-free, heads-up camera. Right now, people are excited about glasses they already wear getting an upgrade that helps them do things they already do on the go: take a call, listen to a podcast, snap a quick photo.

With AI agents becoming part of our daily lives, consumers are also eager for immediate access to an assistant that’s hands-free and voice-first. The added benefit of AI glasses is that they give these assistants spatial intelligence through the combination of a camera and multimodal AI. That makes them more contextual, more situationally aware, and capable of delivering more meaningful interactions. Wearable tech is critical to the evolution of AI agents: audio and photography may be why consumers buy connected eyewear today, but tomorrow it will be because, without the glasses, their AI assistants just aren’t smart enough.

What’s not needed right now is a display.

Consumers haven't had enough of a taste of a smarter assistant in their ear to justify the upgrade to glasses with a screen. More importantly, we already have a great screen in our pocket. AI glasses should focus on handing off visual information seamlessly to the phone rather than trying to replace it. This was exactly the setup in the movie Her: an earpiece, a camera, an AI, and a companion device for when you needed a visual. With AI glasses and our phones, we have all those pieces. That’s the bridge we need now.

Rushing to replace the phone could backfire. Consumers are hesitant to give it up, and not everything the phone does can be replicated on glasses yet. Most people will likely carry both devices for the foreseeable future. If glasses meant to replace the smartphone arrive too early, consumers will choose the phone and leave their smartglasses collecting dust in the closet.

There are also hard technical reasons to wait. In-glasses displays still offer a narrow field of view, varifocal optics that match the display’s focal distance to where your eyes are focused remain unsolved, and extended use can cause eye strain and fatigue. Managing constant visual notifications requires highly reliable contextual computing, and we’re not there yet. Battery life also takes a hit when you add a display, and nothing is worse than a wearable that’s out of juice. Any display-equipped glasses launching today will either have shorter use times or require bulkier designs. These aren’t minor trade-offs; they can make or break mainstream adoption.

Eventually, we’ll have a device we can wear all day that lets us leave our phone behind. But the first step is to nail the AI glasses category. They need to be fashionable, comfortable, and affordable devices that serve as ever-present assistants, seeing the world on our behalf. The path to adoption starts with enhancing the smartphone, not replacing it. When AI glasses become an indispensable staple and people start asking more from them, that’s when it’s time for screens.

Right now, AI glasses have to be the main event, not a stepping stone. They need to prove their value as the next must-have personal device or risk undermining the entire smartglasses category. And they can do this without a display. In the race for the face, timing is everything, and pushing too fast toward AR could stall consumer adoption for smartglasses before it really begins.

Disclosure: Tom Emrich has previously worked with or holds interests in companies mentioned. His commentary is based solely on public information and reflects his personal views.