These AR Experiments on Spectacles are a Glimpse at What’s Coming

Source: Snap
  • Snap’s fifth-generation Spectacles ’24 give developers a powerful AR platform with waveguide displays, dual Snapdragon chips, spatial audio, and advanced sensors, enabling immersive, hands-free experiences built around embodied interaction, AI-driven assistance, and real-world play.
  • Early experiments range from palm-based navigation and multiplayer AR games to AI cooking assistants, plant care companions, and remote collaboration tools, illustrating how AR glasses can merge the digital and physical for more intuitive, contextual, and connected experiences.

Snap’s journey into spatial computing began in November 2016 with the launch of its first Spectacles, camera glasses built to capture circular videos for Snapchat. What started as a playful storytelling experiment, sold out of vending machines, has evolved over five generations into a serious developer platform for augmented reality. In 2021, Spectacles 4 introduced built-in waveguide displays and AR Lenses. Then, on September 17, 2024, Snap revealed Spectacles ’24, its most advanced model yet. The fifth-generation Spectacles are fully standalone AR glasses powered by Snap OS and built specifically for developers and creators.

Spectacles ’24 combine dual Snapdragon chips, four cameras (two RGB, two infrared), six microphones, spatial audio, and a 46-degree waveguide display in a 226-gram wearable frame. With hand tracking, voice input, Wi-Fi 6, GPS, and a vivid outdoor-capable screen, the glasses support immersive, hands-free AR experiences for up to 45 minutes per charge. Access is offered through a $99/month developer subscription, giving early builders tools to prototype and experiment with the device and its growing capabilities.

At AWE USA 2025, CEO Evan Spiegel took the stage to announce Specs, the first Spectacles designed for consumers. Lighter, more refined, and aimed at all-day wear, Specs are slated for release in 2026 and will run on Snap OS with deeper AI integration, a more compact waveguide system, and a focus on everyday utility ranging from messaging and navigation to on-the-go creation and ambient computing.

This post explores a wave of early experiments built on Spectacles ’24, from multiplayer games and robot control interfaces to fitness tools, cooking assistants, and educational overlays. Each one stands as a functional app, but looked at together, they offer a preview of our post-smartphone future, one we may begin to realize as early as next year with the introduction of Snap's consumer-targeted AR glasses, Specs.

Embodied Interfaces

Spatial computing is dissolving the boundary between user and device. With hand gestures, gaze, and movement as primary inputs, AR glasses like Spectacles ’24 are turning our bodies into the interface. This shift demands new design patterns that are more intuitive and less interruptive, and challenges creators to rethink interaction as something felt, not tapped.

Navigation is a natural fit for augmented reality, and this experiment from XR product engineer Tejas Shroff shows why. Using Spectacles, he reimagined walking navigation in AR by centering the experience on hand gestures and the space around you. Users lift their palm upward to access a floating map or face it downward to project a navigation path directly onto the sidewalk. These spatial commands turn the body itself into the controller, and the world responds in kind. The design turns digital guidance from a tool we access into an extension of how we move through the world.
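
To make the interaction concrete, here is a minimal TypeScript sketch of the kind of palm-orientation check an experience like this could use. The Vec3 type, the tracking input, and the thresholds are stand-ins for illustration, not the Lens Studio API or Shroff's actual code.

```typescript
// A minimal sketch, assuming a unit-length palm normal from hand tracking.
// Vec3 and the thresholds are illustrative, not the Lens Studio API.

type Vec3 = { x: number; y: number; z: number };
type NavMode = "floating-map" | "ground-path" | "idle";

const WORLD_UP: Vec3 = { x: 0, y: 1, z: 0 };

function dot(a: Vec3, b: Vec3): number {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Palm facing up summons the map; palm facing down projects the path;
// anything in between leaves the navigation UI hidden.
function selectNavMode(palmNormal: Vec3, threshold = 0.7): NavMode {
  const alignment = dot(palmNormal, WORLD_UP);
  if (alignment > threshold) return "floating-map";
  if (alignment < -threshold) return "ground-path";
  return "idle";
}

// A palm tilted mostly upward selects the floating map.
console.log(selectNavMode({ x: 0.1, y: 0.95, z: 0.1 })); // "floating-map"
console.log(selectNavMode({ x: 0.0, y: -0.9, z: 0.3 })); // "ground-path"
```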

Benjamin Paruzynski, Senior Software Engineer at Trigger, built Throwing Darts, a fun, pass-and-play AR game that lets you place a virtual dartboard anywhere and play with friends. Using Spectacles’ hand tracking and world anchoring, the app brings lightweight, social play into everyday spaces. The use of your hands makes it feel like any other game of darts, and that’s really where the magic lies. Your body drives the experience just as it would in real life. You’re not tapping a screen or pressing buttons on a controller. You’re lining up your shot and making the throw, aiming for that bullseye. It’s simple, physical, and familiar, demonstrating how AR can make digital play feel like something you perform rather than something you operate.
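
The physics of a throw like this can stay surprisingly simple. Below is a hedged TypeScript sketch, not the app's code: sample the hand's velocity at release, then integrate the dart under gravity until it reaches the board plane.

```typescript
// A ballistic sketch of what a dart throw could reduce to. Types and
// numbers are illustrative, not from the Spectacles SDK.

type Vec3 = { x: number; y: number; z: number };

const GRAVITY = -9.81; // m/s^2, applied on the y axis

// Step the dart forward with simple Euler integration.
function stepDart(pos: Vec3, vel: Vec3, dt: number): { pos: Vec3; vel: Vec3 } {
  return {
    pos: { x: pos.x + vel.x * dt, y: pos.y + vel.y * dt, z: pos.z + vel.z * dt },
    vel: { x: vel.x, y: vel.y + GRAVITY * dt, z: vel.z },
  };
}

// Simulate from release until the dart crosses the board's z plane.
function simulateThrow(release: Vec3, releaseVel: Vec3, boardZ: number): Vec3 {
  let pos = release;
  let vel = releaseVel;
  const dt = 1 / 60; // one frame at 60 Hz
  while (pos.z < boardZ) {
    ({ pos, vel } = stepDart(pos, vel, dt));
  }
  return pos; // where the dart lands on the board plane
}

// Example: a dart released at eye height, thrown at a board 2.4 m away.
const impact = simulateThrow(
  { x: 0, y: 1.7, z: 0 },
  { x: 0.1, y: 1.2, z: 5.0 },
  2.4
);
console.log(impact);
```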

Built in just two weeks by Hongming Li, Christian Enriquez, and Maxim Safioulline, SnapSEEK envisions a world where your hands become the camera. The Spectacles app, developed as part of Reality Hacks, lets users frame an intuitive picture-taking gesture with their hands, capturing things in the world around them as part of a scavenger hunt. These images are then sent to ChatGPT Vision, which processes what was captured and turns it into data for a learning experience or a game.

The pinch-and-expand gesture is novel and addictive, and it lends itself to unlocking many forms of play. The SnapSEEK team demonstrated how capturing and identifying real-world objects could power anything from a story-driven quest to a multiplayer puzzle. The team turns photo-taking on its head, shifting it from a mechanism for storing memories to an active part of experience creation.
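
For a sense of how the capture-to-AI round trip might look, here is a TypeScript sketch using OpenAI's chat completions API with image input. The capture step, the model choice, and the scavenger-hunt target check are assumptions for illustration, not the team's published code.

```typescript
// A rough sketch of a SnapSEEK-style vision round-trip: send a captured
// frame to a vision model and ask whether the hunt target was found.
// The base64 frame and hunt-target flow are assumed, not from the app.

async function identifyCapture(
  base64Jpeg: string,
  huntTarget: string,
  apiKey: string
): Promise<boolean> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o",
      messages: [
        {
          role: "user",
          content: [
            {
              type: "text",
              text: `Answer yes or no: does this photo contain ${huntTarget}?`,
            },
            {
              type: "image_url",
              image_url: { url: `data:image/jpeg;base64,${base64Jpeg}` },
            },
          ],
        },
      ],
    }),
  });
  const data = await response.json();
  const answer: string = data.choices[0].message.content;
  return answer.toLowerCase().includes("yes"); // did the player find it?
}
```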

Everyday Magic

One of the most striking powers of augmented reality is its ability to make the ordinary extraordinary, essentially bringing magic into our everyday lives. As a technology that plays with perception, AR lets us see our lives through a variety of different lenses, opening up countless new ways to re-experience the things we do every day.

Stephanie Enriquez, a design engineer exploring the intersection of code and immersive tech, reimagined restaurant dining using Snap Spectacles in her latest prototype. When the glasses detect a printed menu, they produce vivid 3D renderings of the menu items that can be viewed from different angles right in front of the diner. Enriquez used Luma AI to scan actual dishes from her favorite restaurant, Oven and Shaker, as part of this prototype. The combination of lifelike models and AR glasses lets guests browse menus and meals in 3D rather than having to rely on flat 2D menus. This is not only a whimsical way to experience dining, but it also has practical applications such as improving accessibility and reducing decision fatigue.

In a playful and practical prototype shared by SunfloVR, Snap Spectacles becomes a window into the secret lives of your houseplants. The Plant-a-Pal AR Lens brings plants to life, turning them into characters you can talk to and even dress up. The Spectacles app uses AI to identify plant species, assess their health, and generate personalized voices, giving each plant a unique character that tells you when it needs light, water, or attention. The concept is a great example of how inanimate objects can come to life through spatial computing, and how AR can soften the edges of utility, moving us away from dashboards and data toward experiences that inform us with delight. Chores like plant care can become conversational, emotional, even fun.
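
As a toy illustration of the idea, the sketch below maps an identified species and a couple of invented health readings to an in-character line of dialogue. Everything here, from field names to thresholds, is hypothetical.

```typescript
// A playful sketch of the Plant-a-Pal idea: species plus a health reading
// becomes in-character dialogue. All values are invented for illustration.

type PlantReading = { species: string; soilMoisture: number; lightHours: number };

function plantSpeech(reading: PlantReading): string {
  if (reading.soilMoisture < 0.2) {
    return `I'm parched! A ${reading.species} like me needs a drink.`;
  }
  if (reading.lightHours < 4) {
    return `Could you move me closer to the window? I've only had ${reading.lightHours} hours of light today.`;
  }
  return `All good here. Just a happy ${reading.species}.`;
}

// Example: a thirsty monstera speaks up.
console.log(plantSpeech({ species: "monstera", soilMoisture: 0.1, lightHours: 6 }));
```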

In a moment captured by Andrew Seleznov, Head of AR Production at Snap’s Arcadia Creative Studio, Spectacles made an appearance at the Jersey City Marathon. Wearing Spectacles and running the Path Pioneer app, Seleznov joined his friend for a short stretch of his race, both to show his support and to give us a look at the future of sports with AR glasses. The app overlays holographic arrows on the streets in front of you to show which direction to run, while on-screen stats, including lap time, lap count, and pace, are also displayed. "Great time to be alive. The future has arrived - and it's quietly integrating into our daily routines," said Seleznov in his LinkedIn post. The Path Pioneer app offers a glimpse into how AR can give sport and fitness routines a fresh perspective.

Conversational & AI-Driven AR

Generative AI and voice interfaces are reshaping how we access information. On Spectacles, that means asking questions and getting contextual answers that live in space, not on screens. These experiences suggest a future where the assistant isn’t in your pocket, it’s in your field of view, ready to help in real time.

Watch this short experiment in which people speaking different languages communicate using the latest AI-powered translation experience on Spectacles.

Ralph Barbagallo, an award-winning spatial computing developer, recently combined Snap Spectacles and ChatGPT to prototype a voice-driven AR cooking assistant. Created as part of the Snap Spectacles Community Challenge, the app lets users ask for recipes and see instructions pop up on transparent windows in their environment in real time. The experiment combines voice and visuals in a heads-up, hands-free format, which is ideal for cooking. But this setup can go beyond making meals. As Barbagallo notes, it opens the door for developers to create AI agents tailored for Spectacles and other emerging wearable platforms. His open-source approach makes this even easier, offering a valuable foundation for anyone wanting to explore voice interfaces on the device.
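
A minimal sketch of that ask-for-a-recipe loop might look like the TypeScript below, assuming a speech-to-text transcript is already available. The panel-rendering stub and model choice are assumptions, though the HTTP shape follows OpenAI's chat completions API.

```typescript
// A hedged sketch of the voice-to-recipe loop, not Barbagallo's code.
// On device, each returned step would be laid out on a floating window.

async function fetchRecipeSteps(
  transcript: string,
  apiKey: string
): Promise<string[]> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "system",
          content:
            "You are a hands-free cooking assistant. Reply with short, numbered steps only.",
        },
        { role: "user", content: transcript },
      ],
    }),
  });
  const data = await response.json();
  const text: string = data.choices[0].message.content;
  // Split the numbered reply into one string per step/panel.
  return text.split(/\n+/).filter((line) => /^\d+\./.test(line.trim()));
}

// Stub for whatever spatial UI shows each step in the wearer's view.
function showStepPanel(step: string, index: number): void {
  console.log(`panel ${index}: ${step}`);
}

// Usage: a spoken question becomes a series of floating step panels.
fetchRecipeSteps("How do I make a simple tomato pasta?", "sk-...").then(
  (steps) => steps.forEach(showStepPanel)
);
```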

Just like palm-based navigation reimagines how we move through space, conversational AR interfaces are reshaping how we find what we need. In this demo by EyeJack, Co-Founder Lukasz Karluk shows how you can use AR and AI to talk to books. He asks The Next Dimension a question just by looking at the book, which brings up a chat modal. Using his voice, he talks to the book, asking it questions, and information is provided back both audibly and in written form within the chat-style interface. "It’s a great example of how fuzzy, semantic search can save time compared to manually flipping through pages," he said in his LinkedIn post. Now imagine this approach applied to product discovery: asking about an item on a store shelf, getting specs or reviews in place, or surfacing alternatives based on what you’re looking at. EyeJack’s prototype hints at a future where discovery is intuitive, multimodal, and spatial. It is less about searching and more about finding what you need, right when and where you need it.
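
Under the hood, fuzzy semantic search of this kind usually reduces to embedding similarity. The sketch below ranks book passages against a question by cosine similarity; the embedding vectors are stand-ins that would in practice come from an embedding model rather than being hand-written.

```typescript
// A toy sketch of the fuzzy search that makes "talking to a book" work:
// embed the question and every passage, then rank passages by cosine
// similarity. The Passage shape is illustrative, not EyeJack's code.

type Passage = { page: number; text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the passages most relevant to the question, best first.
function search(query: number[], passages: Passage[], topK = 3): Passage[] {
  return [...passages]
    .sort((p, q) => cosine(query, q.embedding) - cosine(query, p.embedding))
    .slice(0, topK);
}
```

The top-ranked passages can then be read aloud or summarized in the chat modal, which is what makes the experience feel like conversation rather than lookup.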

Technical designer and product lead Conway Anderson created a short film from his perspective showing how AR glasses can gamify everyday activities. The glasses display a list of household chores with a progress bar. Each time a chore is completed, the task is checked off, the progress bar advances, and Anderson earns points for his efforts. The app uses AR to provide the visual overlay, while visual AI recognizes completed tasks, automatically checking them off and awarding the points.
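
The mechanics behind such a loop are simple to sketch. The hypothetical ChoreBoard below holds the task list, awards points when the vision model flags a task complete, and exposes the fraction that would drive the overlay's progress bar.

```typescript
// A small sketch of the chore loop the film implies. The detection hook
// is a stub; names and point values are illustrative.

type Chore = { name: string; done: boolean; points: number };

class ChoreBoard {
  private score = 0;
  constructor(private chores: Chore[]) {}

  // Called when the vision model recognizes a finished task.
  markDone(name: string): void {
    const chore = this.chores.find((c) => c.name === name && !c.done);
    if (!chore) return;
    chore.done = true;
    this.score += chore.points;
  }

  get progress(): number {
    const done = this.chores.filter((c) => c.done).length;
    return done / this.chores.length; // drives the progress bar
  }

  get points(): number {
    return this.score;
  }
}

// Example: the model spots a made bed, and the board updates.
const board = new ChoreBoard([
  { name: "make bed", done: false, points: 10 },
  { name: "wash dishes", done: false, points: 20 },
]);
board.markDone("make bed");
console.log(board.progress, board.points); // 0.5, 10
```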

Orlando Mathias, a design leader exploring the intersection of AI and spatial computing, has developed a prototype that asks a powerful question: do we need more devices, or can any object do anything we desire? Built with Snap Lens Studio for Spectacles, his project shows how a simple 3D-printed block can become the surface for a variety of different experiences, all prompted through an AI agent. In the demo, Mathias asks the agent for an update on what his children are doing, and the block transforms into a baby monitor or a 3D live map to give him the information he needs. In the same demo, the block becomes a canvas for watching a movie or showcasing a 3D model of a sneaker to buy.

By combining real-time AR with AI-driven reasoning, the prototype explores how everyday items could respond to voice, touch, and spatial context to support dynamic, adaptive interactions. It’s an early study in what Mathias calls Blended Products, tools that bring AI into the physical world in tangible, intuitive ways. Rather than separating users from their environments through screens, this approach invites them to engage with objects that think and respond in space.
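
One way to picture the routing at the heart of such a demo: an agent classifies the request, and the block's anchored surface swaps to match. In the sketch below the intent labels are assumptions, and keyword matching stands in for the LLM that would do the classification in practice.

```typescript
// A bare-bones sketch of "blended product" routing: a request maps to the
// experience the block should display. Labels and rules are invented.

type Surface = "baby-monitor" | "live-map" | "movie" | "product-viewer";

function routeIntent(request: string): Surface {
  const text = request.toLowerCase();
  if (text.includes("kids") || text.includes("children")) return "baby-monitor";
  if (text.includes("where")) return "live-map";
  if (text.includes("watch")) return "movie";
  return "product-viewer";
}

// "Give me an update on the kids" -> the block becomes a baby monitor.
console.log(routeIntent("Give me an update on what my children are doing"));
```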

Real-World Play

AR moves casual and multiplayer games off the screen and into the world around us. These experiments show how the spaces around us become arenas for play, and how physical toys and devices can be reimagined when paired with AR technology.

Tech-enabled storyteller Kavin Kumar created an AR fishing game that turns the world in front of you into a frozen lake and your phone into a fishing rod. Built in collaboration with XR developer Nithin Shankar, the game has players use swipe gestures on the phone to cast and reel in fish from the lake: swipe down to drop the rope, swipe up to bring in the fish. This casual and familiar game unfolds in the space in front of you, challenging you to catch as many virtual fish as possible in 60 seconds. Kumar used physics, dynamic meshes, and custom shaders to make the fishing experience as realistic as possible. His prototype reimagines the smartphone's role in an AR world, making it part of the input rather than the focus of play.
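
The core loop of a game like this can be captured as a small state machine: swipe down casts, swipe up reels, and a catch only counts while a fish is hooked. The sketch below is illustrative, not Kumar's implementation.

```typescript
// A cast-and-reel state machine sketch. Input names are invented.

type RodState = "idle" | "line-down" | "hooked";
type Swipe = "up" | "down";

class FishingRod {
  private state: RodState = "idle";
  private caught = 0;

  // Called every frame: a fish bites at random while the line is down.
  tick(): void {
    if (this.state === "line-down" && Math.random() < 0.02) {
      this.state = "hooked";
    }
  }

  onSwipe(direction: Swipe): void {
    if (direction === "down" && this.state === "idle") {
      this.state = "line-down"; // cast: drop the rope into the lake
    } else if (direction === "up") {
      if (this.state === "hooked") this.caught += 1; // reel in a fish
      this.state = "idle";
    }
  }

  get score(): number {
    return this.caught; // tallied against the 60-second clock
  }
}
```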

XR builder Krunal MB Gediya is also merging the physical and digital in bold new ways with ARcade Racer, which uses Snap Spectacles to turn a real RC car into an immersive gaming experience. Players place AR coins and virtual obstacles in their space to construct a race course, then pick up the RC car controller and race the car around the room, avoiding the obstacles and claiming the coins to win. The game also uses a custom 3D-printed phone mount on the RC car to overlay a virtual driver onto the physical car, kept in sync with the experience. Gediya’s lens redefines what real-world play can look like with AR glasses and a mechanical toy.
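
The coin-claim logic in a course like this reduces to a per-frame proximity check between the tracked car and the anchored coins, as in this hypothetical sketch.

```typescript
// A sketch of the coin-claim check: each frame, compare the tracked car's
// position against anchored coins. Distances and types are illustrative.

type Vec3 = { x: number; y: number; z: number };
type Coin = { position: Vec3; claimed: boolean };

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Returns how many coins the car picked up this frame.
function claimCoins(car: Vec3, coins: Coin[], reach = 0.15): number {
  let claimed = 0;
  for (const coin of coins) {
    if (!coin.claimed && distance(car, coin.position) < reach) {
      coin.claimed = true;
      claimed += 1;
    }
  }
  return claimed;
}
```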

Design engineer Matthew Hallberg used the Spectacles BLE API to prototype an experience with an AR toy that was built for mobile augmented reality but can now be fully realized with AR smartglasses. The game lets you shoot life-size targets such as plant pots and statues placed in your physical space, or zap a swarm of bees flying around your room. It's a fun example of how bridging physical accessories to AR eyewear can create a visceral experience that moves beyond using your imagination.

AR as a Control Layer

AR glasses are becoming control surfaces for the world around us. Whether guiding a robot arm or enabling remote support with spatial annotations, Spectacles ’24 experiments hint at a new interface where what you see and how you move can drive complex systems.

In one of his latest Spectacles prototypes, Krunal MB Gediya shows how the glasses can function as a real-time collaboration tool for remote assistance. The app streams a live camera feed from the glasses to a custom web portal via WebSockets, giving remote experts the ability to see exactly what the wearer sees and annotate directly in their view. Using Snap’s Instant World Hit Test, those annotations lock to 3D space for more precise, spatially relevant guidance. Speech-to-text and text-to-speech playback create a low-latency, two-way feedback loop between the wearer and the helper. While still a prototype, the result hints at Spectacles’ potential for hands-free support in fields from maintenance to training. It also surfaces a broader opportunity: native WebRTC support could open the door to richer, peer-to-peer AR workflows. Experiments like this illustrate how wearables can make remote communication more collaborative.
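
The relay at the center of such a setup can be sketched in a few lines of Node using the ws package. The message shapes ("frame", "annotation") and client paths are assumptions about the prototype, not its published protocol.

```typescript
// A minimal relay sketch for the glasses-to-portal link, assuming the
// "ws" package. Message types and URL paths are invented for illustration.

import { WebSocketServer, WebSocket } from "ws";

const server = new WebSocketServer({ port: 8080 });
let glasses: WebSocket | null = null;
let portal: WebSocket | null = null;

server.on("connection", (socket, request) => {
  // Clients identify themselves by path: /glasses or /portal.
  if (request.url === "/glasses") glasses = socket;
  else if (request.url === "/portal") portal = socket;

  socket.on("message", (raw) => {
    const msg = JSON.parse(raw.toString());
    if (msg.type === "frame" && portal) {
      portal.send(raw.toString()); // live camera feed -> remote expert
    } else if (msg.type === "annotation" && glasses) {
      glasses.send(raw.toString()); // expert's markup -> wearer's view
    }
  });
});
```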

At the Mistral AI x Hugging Face Robotics Hackathon, Vincent Trastour and team explored a bold idea: using Snap Spectacles to control robots. By turning the AR glasses into a real-time communication bridge, they enabled a robot to receive data directly from the wearer’s perspective, including what they saw, how they moved, and even what they meant to do. The setup included a custom web server linking Spectacles to a robotic arm, with LLMs analyzing video footage to guide mimicked behaviors. In just 48 hours, they developed control policies and generated datasets using tools such as LeRobot and AgileX. The experiment points toward using spatial vision and natural human movement to control physical AI. With Spectacles as the eyes and an intention-capture device, robots can become more intuitive collaborators, guided not by code, but by context.
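
As a hedged sketch of the bridge, the server below receives the wearer's hand pose over a WebSocket and maps it to an arm command. The endpoint, pose format, and mapping are invented for illustration; the team's actual pipeline layered LLM video analysis on top.

```typescript
// A sketch of a Spectacles-to-robot bridge, assuming the "ws" package.
// The HandPose shape and workspace scaling are hypothetical.

import { WebSocketServer } from "ws";

type HandPose = { x: number; y: number; z: number; pinch: boolean };

// Naive mapping: hand position scales into the arm's workspace,
// and a pinch gesture closes the gripper.
function poseToCommand(pose: HandPose) {
  return {
    target: { x: pose.x * 0.5, y: pose.y * 0.5, z: pose.z * 0.5 },
    gripperClosed: pose.pinch,
  };
}

const bridge = new WebSocketServer({ port: 9000 });
bridge.on("connection", (socket) => {
  socket.on("message", (raw) => {
    const pose: HandPose = JSON.parse(raw.toString());
    const command = poseToCommand(pose);
    // In the real setup this would be forwarded to the arm's controller.
    console.log("arm command:", command);
  });
});
```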

Each of these experiments, whether playful or practical, offers a glimpse into what becomes possible when the world itself becomes the interface. As we move toward a world beyond the smartphone, these glasses show how computing can integrate into our everyday lives, rather than interrupt them.