🔓 Remix Reality Insider: From Prototype to Platform

Source: Midjourney - generated by AI

Your premium drop on the systems, machines, and forces reshaping reality.

🛰️ The Signal

This week’s defining shift.

Physical AI is moving from prototype to platform.

For years, humanoids and service robots felt like science projects limited by clunky hardware and brittle software. That bottleneck is breaking as NVIDIA’s Jetson Thor goes from launch to rapid adoption. The compute platform is already anchoring a growing ecosystem of partners, developers, and use cases, making robotics less about individual breakthroughs and more about a shared stack that scales.

This week’s spatial computing news surfaced signals like these:

  • NVIDIA made Jetson Thor generally available, delivering a 7.5x AI boost and support for multimodal, generative models at the edge.
  • Galbot integrated Jetson Thor into its humanoid G1 Premium, now rolling out across pharmacies in China.
  • Infineon joined the Jetson Thor ecosystem, combining its sensors and controllers to streamline motion systems for humanoids.
  • RealSense launched its new depth camera with native integration across NVIDIA’s Thor, Isaac, and Holoscan platforms, giving robots vision tuned for real-world autonomy.
  • Telexistence announced its Motion Data Factory, treating robotics training data as infrastructure, another piece of the puzzle to accelerate robot intelligence.

Why this matters: Robotics is crossing into a new phase. We now have the brain in advanced compute, the eyes in new sensing systems, and the body in humanoids and autonomous machines. The full stack is starting to come together, giving us the pieces needed to build and scale robots that can operate in the real world.


🧠 Reality Decoded

Your premium deep dive.

San Francisco is one of the first cities where robotaxis already feel commonplace. Waymo cars are out in full force, and riders are starting to see them as a real option, not a demo. What makes them stand out is not only the tech but the ride experience itself.

In our latest editorial, we break down the four reasons robotaxis will replace your rideshare.

  • Privacy is immediate when there is no driver. The space feels yours, free from small talk or distractions.
  • Safety comes from how predictably these cars behave and from the peace of mind many women, parents, and teens already value in not having to drive with a stranger.
  • Personalization is growing, from greetings and music playlists today to opportunities like mood-based environments tomorrow.
  • And with no one behind the wheel, or in some cases, no wheel at all, the cabin becomes usable space where you can work, connect, or relax, making better use of your time.

AVs are not just changing how we get around. They are changing what a ride can be. As these services expand into more cities, the value will be measured not only in miles driven, but also in the quality of time spent inside.

Key Takeaway:
The shift to robotaxis won’t just be about proving the tech. It will hinge on the ride itself. Privacy, safety, personalization, and the ability to use your time better are what will make people choose autonomy again and again.

📡 Weekly Radar

Your weekly scan across the spatial computing stack.

PHYSICAL AI

🌊 Aquanaut Mark 2 Hits 2,300-Meter Milestone in Untethered Deepwater Test

  • Aquanaut Mark 2 reached a depth of 2,300 meters during a test in the Gulf of Mexico.
  • Why this matters: This is a major milestone for Nauticus as it works toward its goal of cutting costs for oil and gas customers by up to 40% using its autonomous, untethered systems.

💊 OPTEL Acquires Vanguard Robotics to Expand Pharma Cobot Capabilities

  • OPTEL has acquired Vanguard Robotics to integrate collaborative robotic automation into its pharmaceutical manufacturing solutions.
  • Why this matters: Pharma is buying robotics companies as physical AI becomes necessary infrastructure to compete and stay ahead. This shift shows automation is no longer optional but a core capability for manufacturing efficiency and resilience.

IMMERSIVE INTERFACES

✈️ Loft Dynamics Raises $24M to Expand VR Pilot Training Across Airlines

  • New funding will support the rollout of VR airline simulators and the development of a spatial computing–powered home training kit.
  • Why this matters: VR flight training is proving its value in real-world aviation. Backing from airlines and major investors suggests broader adoption across the industry is underway.

👓 Rokid Launches AI & AR Glasses With Strong Kickstarter Momentum

  • Rokid has introduced lightweight smart glasses combining AR visuals and AI-driven tools for real-time interaction.
  • Why this matters: Rokid is the latest to enter the AI glasses race, but it's setting itself apart with a monochromatic waveguide display that delivers visual information alongside voice-based assistance. By choosing a single-color output, Rokid is signaling a focus on clarity, context, and battery efficiency: less about entertainment, more about utility.

SIMULATED WORLDS

🌞 IBM and NASA Create a Digital Twin of the Sun to Predict Solar Disruptions

  • Surya is the first foundation model trained to simulate the Sun’s behavior using satellite imagery and magnetic data.
  • Why this matters: IBM and NASA’s earlier Prithvi models turned satellite data into a digital twin of Earth to support weather and climate prediction. Now Surya does the same for the Sun. These models help us plan for what’s next and open up entirely new ways to see our world and universe through data.

👨‍💻 Meta Expands GenAI Toolkit with Environment Generation and Embodied NPCs

  • Environment Generation is now available in the Worlds Desktop Editor, building on Meta’s existing GenAI tools.
  • Why this matters: GenAI is reshaping how worlds are built, and Meta is doubling down to speed up development for Horizon Worlds creators. Embodied NPCs and environment generation extend the powerful tools already rolling out in the Worlds Editor.

PERCEPTION SYSTEMS

⚡ VoxelSensors Partners with Qualcomm to Deliver Ultra-Low-Power 3D Sensing for XR

  • VoxelSensors will optimize its SPAES 3D sensing technology with Qualcomm’s Snapdragon AR2 Gen 1 platform.
  • Why this matters: The problem with most 3D sensors is that they suck up power and need bulky hardware to work. If AR glasses are ever going to be something people wear all day, the tech inside has to get way smaller and more efficient. This is what SPAES is solving.

🚗 Nuro Raises $203M to Expand AI-Driven Autonomy and Global Robotaxi Partnerships

  • Nuro closed a $203 million Series E round at a $6 billion valuation, adding new investors including Uber, NVIDIA, and Kindred Ventures.
  • Why this matters: Investment from Uber and NVIDIA suggests a deepening relationship with Nuro, one that could strengthen the technology and help scale it globally.

🌀 Tom's Take

Unfiltered POV from the editor-in-chief.

I somehow got on the side of TikTok where delivery robots are in danger. These videos usually show one of them trying to cross a busy street in LA, with the person recording rooting for it to make it to the other side. Sometimes they succeed. Too often they don’t. You watch as human-driven cars cut them off, block their path, or even run them over. The cheers turn into gasps.

It got me thinking about how messy this in-between time really is. We’re still driving while robots are just starting to share the streets. Robots thrive on predictability. Humans bring chaos. For a delivery bot or an AV, the hardest part of the trip isn’t the road or the weather. It’s us. We’re the wild card.

That tension raises an uncomfortable but important question. Could we start to see areas where only robots are allowed to take the wheel? Dedicated AV lanes have been talked about and, in some areas, are being planned, but you can imagine how much more effective they would be if fully rolled out. An environment built for stability rather than surprises. In the same way bike lanes gave cyclists a chance to safely move through cities, AV-only lanes could give robots a fighting chance to do the same.

The shift wouldn’t happen overnight. Cities don’t reconfigure themselves easily. But the logic is there. If autonomy is here to stay, our streets may need to adapt. And maybe the real test won’t just be whether robots can handle us. It will be whether we can make space for them.


🔮 What’s Next

3 signals pointing to what’s coming next.

  1. Cities Opening to Autonomy
    Autonomous vehicles are moving deeper into the urban core. New York City has cleared Waymo to begin its first-ever AV testing on the streets of Manhattan and Downtown Brooklyn, a milestone in one of the world’s most complex traffic environments. At the same time, Shenzhen has launched its first fully driverless Robobus line, with WeRide vehicles running Level 4 autonomy across a central district route. These pilots show how cities are starting to reshape policy and infrastructure to integrate AVs directly into daily transit, from limited testing in New York to full-scale service in Shenzhen.
  2. Delivery Robots Take Various Forms
    Delivery is diversifying beyond simple sidewalk bots. In Zurich, Just Eat Takeaway.com is piloting hybrid robots from RIVR that combine wheels and legs, letting them climb curbs and stairs while handling the complexity of European city streets. In Dallas, Chipotle has launched Zipotle, a drone delivery program with Zipline that flies meals directly to homes, parks, and backyards. These experiments point to a future where delivery is no longer limited to cars or scooters. Instead, fleets of specialized machines (wheeled, legged, and airborne) will take on the last mile, tailored to the local environment.
  3. The Consumer XR Wave Builds
    More consumer devices are bringing spatial computing into daily use. Rokid’s AI and AR glasses just launched on Kickstarter and have picked up strong early backing, offering real-time translation, navigation, and assistance in a 49g frame. At the same time, smartphone maker vivo has entered the market with the Vision Discovery Edition, a mixed reality headset featuring dual 8K displays, eye tracking, and gesture recognition. Both devices point to a growing push to position XR hardware not just as entertainment but as a daily companion for productivity, travel, and communication.

🔓 You’ve unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what’s next and why it matters.

📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!