🔓 Remix Reality Insider: Fashion Becomes the Front Door to Wearables
Your premium drop on the systems, machines, and forces reshaping reality.
🛰️ The Signal
This week’s defining shift.
Fashion brands are the on-ramp for everyday wearables.
To be adopted, wearables must lead with identity, aesthetic, and form rather than with technology. Fashion brands play a critical role in this next wave of computing. Smart features need to be woven into products people already want to wear, whether that’s eyewear for indoor training, rings for health tracking, or glasses designed to house your AI assistant. To succeed, wearables must feel less like gadgets and more like extensions of our personal style and routine.
This week’s news surfaced signals like these:
- Innovative Eyewear added new Reebok smartglasses designed for indoor training and hybrid sports, with features aimed at louder gym environments rather than outdoor use.
- Diesel released a limited-edition smart ring with Ultrahuman that combines health tracking with the fashion brand’s signature industrial look.
- Google confirmed it is working on AI glasses both with and without a screen, alongside eyewear partners like Gentle Monster and Warby Parker, pointing to a stronger focus on fit, style, and everyday wear.
Why this matters: Wearables move our tech closer to the body and deeper into daily life. This raises the bar for how they must look, feel, and fit into social settings compared to the mobile and PC wave. This is where fashion brands come in: they understand these constraints better than most technology companies, which is why they play a critical role in shifting wearables from niche tech products into something people desire.
🧠 Reality Decoded
Your premium deep dive.
Food manufacturing is becoming a key entry point for physical AI. High labor demand, long operating hours, and repeatable tasks create the conditions robots need to learn and deliver ROI. In a recent Remix Reality interview, Chef Robotics founder and CEO Rajat Bhageria explains why food is where robotics starts to make economic sense.
Three ideas from the conversation stand out.
- Scale is what makes physical AI work: Large manufacturers can run lines up to sixteen hours a day, exposing robots to a huge volume of real ingredients, real failures, and constant variation. The live production environment is a necessary part of training the robot, which makes each deployment an engine of improvement.
- Assembly is the real bottleneck: Cooking gets the spotlight, but most food plants need people on assembly lines, repeating the same motions all day as output increases. That’s where shortages show up first, and why Chef Robotics starts with narrow assembly tasks and builds from there.
- A service model fits food production: Food operations change constantly. Ingredients come and go, menus change, and production lines get adjusted, sometimes several times a year. Chef Robotics operates on a robots-as-a-service subscription model, which makes it easier for customers to adopt without committing to the upfront cost of buying a robot while still matching automation spend to labor needs.
Key Takeaway:
Physical AI needs scale, repetition, and feedback from the real world. Food manufacturing provides all three. Chef Robotics is showing how robotics and today's AI are transforming the assembly line.
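To make the idea of deployments as an engine of improvement concrete, here is a minimal toy sketch of a fleet-learning feedback loop. It is not Chef Robotics’ actual system; the loop, rates, and names below are hypothetical.

```python
import random

# Hypothetical fleet-learning loop: each production shift exposes the
# robot to real ingredients and real failures, and those failures become
# the training signal for the next model update.

def run_shift(success_rate: float, attempts: int = 1000) -> int:
    """Simulate one shift on a live line; returns the number of failed picks."""
    return sum(random.random() >= success_rate for _ in range(attempts))

success_rate = 0.80  # assumed starting grasp success on novel ingredients
for shift in range(1, 6):
    failures = run_shift(success_rate)
    # Every failure is a labeled example the robot could not have seen
    # in a lab, so more deployment hours mean faster improvement.
    success_rate = min(0.99, success_rate + 0.0002 * failures)
    print(f"shift {shift}: {failures} failed picks, success rate -> {success_rate:.3f}")
```

The toy makes the economics visible: the volume of real-world attempts, not lab time, drives the improvement curve, which is why sixteen-hour production lines matter.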
📡 Weekly Radar
Your weekly scan across the spatial computing stack.
✈️ NASA Backs Reliable Robotics to Test Autonomous Flights Near Airports
- Reliable Robotics will conduct autonomous flight tests using its Cessna 208B Caravan to simulate both standard and emergency operations in active airspace.
- Why this matters: Testing uncrewed aircraft near airports, with FAA, NASA, and industry coordination, shows these systems are transitioning from limited use to full integration in controlled airspace.
👂 Meta Acquires Limitless, Creator of AI Wearable Pendant
- Limitless, maker of the Pendant wearable, spent five years developing tools to augment human memory and focus.
- Why this matters: With its focus on memory and cognition, Limitless complements Meta’s existing AI wearables push, adding depth to a strategy already taking shape with devices like Ray-Ban Meta smart glasses.
🤖 Serve Robotics Brings Autonomous Delivery to Fort Lauderdale with Uber Eats
- Serve Robotics has extended its sidewalk robot service to Fort Lauderdale, enabling food deliveries through Uber Eats in key neighborhoods.
- Why this matters: Serve’s city-by-city expansion, anchored by its Uber Eats partnership, shows how robotic delivery is scaling through dense, food-driven markets.
🧤 SenseGlove Launches Exoskeleton Glove for Robotic Control and Force-Based Training
- SenseGlove has released the R1, an exoskeleton glove that enables real-time robotic control and force-based training through tactile feedback.
- Why this matters: The same glove technology that lets users interact with virtual objects in VR is now being used to train robots how to grip, hold, and handle the real world.
🕶️ Google Confirms It Is Working on Two Types of AI Glasses
- Google confirmed it is working on two types of AI glasses with partners, one with a screen and the other without.
- Why this matters: It's clear Google believes success in XR won't be achieved by one device alone. Today's update reiterates its plans across a variety of wearable devices: wired, wireless, glasses, and headsets. Sitting at the center of all of them are its operating system, Android XR, and its AI, Gemini.
🧠 Vinci Raises $46M for AI System That Simulates Chips 1000x Faster
- Vinci’s software runs chip simulations up to 1000 times faster than traditional tools, without meshing or customer data.
- Why this matters: Vinci combines physics simulation, geometry processing, and AI into a single system built for real-world engineering. It reflects a broader move toward AI tools designed not for language or images, but for simulating and accelerating physical design work.
🎬 Journey Acquires Dimension to Strengthen Capabilities in Virtual and Immersive Production
- Dimension joins Journey’s network of studios focused on design, media, and next-generation content production.
- Why this matters: Dimension’s integration into Journey aligns with a shared focus on immersive, design-led experiences, and it could accelerate investment in the tech stack powering its volumetric and AI-driven productions.
⚕️ GE HealthCare and Mayo Clinic Launch GEMINI-RT Centered on Patient “Twinning” in Radiation Therapy
- GEMINI-RT uses imaging, AI, and monitoring to support tailored cancer treatment across planning, delivery, and follow-up.
- Why this matters: Imaging, patient modeling, and sensor-based monitoring all point to the growing role of spatial data in personalizing radiation therapy.
📷 Stereolabs Unveils ZED X One S and Core Cameras for Compact, Rugged Robot Vision
- Stereolabs has launched the ZED X One S and ZED X One Core, new vision systems tailored for constrained and harsh robotics environments.
- Why this matters: Perception hardware is evolving to match the physical realities of modern robotics. As machines shrink, toughen, and embed deeper into physical environments, sensors like these are being purpose-built to meet those exact demands.
📊 Momentum Unifies Wearable Data With Open API for Health Context and Intelligence
- Momentum's open-source API transforms fragmented wearable inputs into structured, context-aware health intelligence.
- Why this matters: Open Wearables pulls together sensor data from across devices and turns it into a structured format. It’s an essential step toward wearable solutions that can understand context, apply intelligence, and respond in real time.
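To sketch what “structured, context-aware” could mean in practice, here is a minimal Python example. It is not Momentum’s actual API; the unified schema, device payloads, and normalize helpers are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical unified schema; the real Open Wearables data model may differ.
@dataclass
class HealthSample:
    source: str        # device or app that produced the reading
    metric: str        # e.g. "heart_rate_bpm"
    value: float
    timestamp: datetime

def normalize_ring(raw: dict) -> HealthSample:
    """Map a hypothetical smart-ring payload onto the unified schema."""
    return HealthSample(
        source="ring",
        metric="heart_rate_bpm",
        value=float(raw["hr"]),
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
    )

def normalize_watch(raw: dict) -> HealthSample:
    """Map a hypothetical watch payload onto the same schema."""
    return HealthSample(
        source="watch",
        metric="heart_rate_bpm",
        value=float(raw["heartRate"]),
        timestamp=datetime.fromisoformat(raw["recordedAt"]),
    )

# Two devices, two formats, one structured stream an app can reason over.
samples = [
    normalize_ring({"hr": 62, "ts": 1718000000}),
    normalize_watch({"heartRate": 64.0, "recordedAt": "2024-06-10T07:35:00+00:00"}),
]
resting = sum(s.value for s in samples) / len(samples)
print(f"avg resting HR across devices: {resting:.1f} bpm")
```

The design point is simple: once every device maps onto one schema, downstream logic can reason about health context without caring which wearable produced the reading.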
🌀 Tom's Take
Unfiltered POV from the editor-in-chief.
Lately, I’ve seen a wave of reports declaring the metaverse is dead (again) and, by extension, questioning the future of VR. I think that framing misses what is actually happening. What’s fading is not the underlying technology, but a narrow idea of what the metaverse represents. The real shift is that computing is moving into the physical world, where the same stack now shows up as "spatial computing."
We are seeing momentum around physical AI, including robotics, autonomous vehicles, and AI wearables, and renewed interest in AR, especially as glasses feel ever closer to reality. These systems operate in the real world, augmenting it rather than replacing it. But the common thread is that they all run on the same technologies that power VR; VR simply points those ingredients further into the virtual world. Computer vision, real-time 3D, sensors, and spatial mapping are a few of the foundational elements of this next wave, from humanoid robots to smartglasses.
The confusion, I believe, comes from treating the new set of emerging technologies, including VR, AR, and AI, as separate categories competing for relevance. In reality, they are converging. Together, they form the next wave of computing. Spatial computing. This shift is about technology understanding space, context, and presence, and responding accordingly. This applies to VR as much as it does to AR, robotics, and the wave of autonomy across industries.
For me, the metaverse was never a single game, a virtual world, or an NFT economy. As I wrote in my 2023 trends, the metaverse was more of an aha moment. A realization that a new stack of technologies, including AI, AR, VR, IoT, and others, is working together to change our relationship with machines. That change anchors around presence. Where we become more present while using our technology, and more importantly, technology becomes more present in our physical reality: seeing, hearing, moving, and acting in the world alongside us.
The metaverse isn’t dead. It just stopped being the headline. And this gives it the opportunity to move beyond its original limited (and often misunderstood) definition to one that more properly illustrates where computing is moving. Spatial, embodied, and present in the physical world.
🔮 What’s Next
3 signals pointing to what’s coming next.
- Humanoid robots reach production and deployment
1X’s plan to deploy up to 10,000 robots through EQT Ventures portfolio companies; Mercado Libre’s live use of Agility Robotics' humanoid robot, Digit, in a fulfillment center; and AGIBOT surpassing 5,000 units produced all point to the same shift. Humanoid robots are moving beyond demos into real operations, where they are judged on reliability, utilization, and day-to-day performance in working environments.
- Automakers are turning vehicles into intelligent spatial systems
Hyundai is preparing its compact autonomous robot, MobED, for production, and Kia debuted an AR-focused concept car for its 80th anniversary. Both are strong examples of how automakers are applying autonomy, sensors, and software to move beyond traditional driving. Vehicles are being designed to sense their surroundings, adjust in real time, and support new ways for people to interact with them.
- Global capital accelerates humanoid robot industrialization
Generative Bionics raised €70M in Europe, and EngineAI secured ¥1B in China, demonstrating that capital is flowing into humanoid robotics worldwide. In both cases, the funding is going into factories, production lines, and the work needed to get humanoid robots into wider use.
🔓 You’ve unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what’s next and why it matters.
📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!
🛠️ This newsletter uses a human-led, AI-assisted workflow, with all final decisions made by editors.