🔓 Remix Reality Insider: Simulation Goes Scalable

Image source: Midjourney, generated by AI

Your premium drop on the systems, machines, and forces reshaping reality.

🛰️ The Signal

This week’s defining shift.

Immersive training is real-world ready.

Simulation has always been one of XR’s core capabilities. AR and VR were practically made for training, and immersive technologies have a long history in this use case. What’s changing now is scale. Pilots are turning into full deployments, with major organizations across industries embracing XR as a core part of their training programs. The shift comes as early adopters begin sharing real data on effectiveness, and the results show XR outperforming traditional methods, making it a clear contender for the future of corporate education.

This week’s spatial computing news surfaced signals like these:

  • Nightingale College found that VR training led to a 5.9% increase in test scores and a 22% reduction in assignment completion time, while also cutting costs by 40%.
  • Miami University launched an AI-powered XR training program for manufacturing workers, funded by a $1.5M state grant. The system uses generative AI and AR/VR to deliver personalized safety simulations and real-time coaching.
  • NASA tested mixed reality flight sims with 12 pilots inside the world’s largest vertical motion simulator. The study aims to help regulators certify MR tools for widespread adoption.
  • Brussels Airlines became the first in the Lufthansa Group to use VR as a standalone tool in pilot training. It replaces screen-based cockpit tools and is approved for use in official training.
  • West Midlands Police, in partnership with Calico, launched a VR experience that places new recruits in a woman’s shoes to build empathy around gender-based violence. The initiative is expected to reach about 700 recruits by spring 2026.

Why this matters: XR is proving to be a serious upgrade over traditional screen-based training. It delivers better outcomes, reduces costs, and scales in ways older methods simply can’t. Major organizations such as NASA and Lufthansa aren’t just adopting XR; they’re measuring its impact and making it a core part of their operations. That signals XR training is ready to scale across industries.


🧠 Reality Decoded

Your premium deep dive.

In the era of spatial computing, you’re no longer on the outside looking in. The screen fades away and you step inside, not just as a user, but as an integral part of the system itself.

This shift redefines your role in computing. Here's how:

  • You, the Input: Spatial computing needs you to function. Your presence provides the context: where you are, what you're doing, who you're with. It collects data from your body and surroundings to make sense of the moment and to place content meaningfully in your space. Without you, the system has nothing to see, understand, or respond to.
  • You, the Controller: Wearables transform your body into the interface. You control systems with your hands, navigate with your eyes, and activate actions with your voice. There’s no need to swipe or tap; just move naturally and speak as you normally would. You’re no longer operating devices with peripherals; you’ve become the control system itself.
  • You, the Agent: Even in a world powered by AI, you're still in the loop. You assign tasks, set direction, and shape outcomes. Whether it's guiding robots, charting your own course in a virtual world, or telling autonomous vehicles where to go, the system is taking cues from you as the one driving it.

As this new layer of computing settles into our lives, the most powerful interface won't be the hardware we build or the software that powers it; it will be us, augmented, aware, and in control.
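
To make those three roles concrete, here’s a minimal illustrative sketch in Python. It is not any real platform’s API; the types, fields, and respond function are hypothetical stand-ins for how presence, body signals, and intent could feed a spatial system.

    # Hypothetical sketch only: the three roles a person plays in spatial
    # computing, expressed as data and a function. Not a real vendor API.
    from dataclasses import dataclass

    @dataclass
    class UserContext:       # You, the Input: your presence supplies context
        position: tuple      # where you are (x, y, z)
        activity: str        # what you're doing
        nearby_people: int   # who you're with

    @dataclass
    class BodySignal:        # You, the Controller: your body is the interface
        hand_gesture: str    # e.g., "pinch"
        gaze_target: str     # what your eyes select
        voice_command: str   # what you said

    def respond(ctx: UserContext, sig: BodySignal, goal: str) -> str:
        """You, the Agent: the system acts only on your context and intent."""
        return (f"Placing '{goal}' on the {sig.gaze_target} at {ctx.position}, "
                f"triggered by a {sig.hand_gesture}")

    print(respond(
        UserContext(position=(1.0, 0.0, 2.5), activity="cooking", nearby_people=1),
        BodySignal(hand_gesture="pinch", gaze_target="countertop",
                   voice_command="show the recipe"),
        goal="recipe overlay",
    ))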

Key Takeaway:
Spatial computing depends on you. It uses your presence for context, your body for control, and your intent to guide actions. You are not a bystander; you are the system’s foundation.

📡 Weekly Radar

Your weekly scan across the spatial computing stack.

PHYSICAL AI

🤖 UBTECH Debuts Humanoid Robot That Powers Itself Around the Clock

  • UBTECH’s industrial humanoid robot now performs self-directed, hot-swappable battery changes.
  • Why this matters: No matter how fast spatial computing evolves, energy remains the constraint. UBTECH’s approach flips the problem: rather than extending battery life, it teaches the robot to handle recharging itself.

🐝 Amazon Acquires AI Wearable Startup Bee

  • Amazon has acquired Bee, a startup focused on personal, ambient intelligence, the company announced in a LinkedIn post.
  • Why this matters: Wearable tech is proving to be the eyes and ears of AI. Bee gives AI the ability to hear what’s happening in the physical world, making it smarter and more helpful.

IMMERSIVE INTERFACES

🧠 Meta Publishes Groundbreaking Research on Neural Wrist Input

  • Meta’s Reality Labs detailed a wrist-worn surface electromyography (sEMG) system that translates muscle signals into device commands, enabling seamless human-computer interaction.
  • Why this matters: As computers move into the space around us, our interactions are expected to feel more natural and use more of our bodies. Meta is betting on EMG to help make that shift; a simplified sketch of the idea follows.
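
Here’s that simplified sketch: a hypothetical sEMG decoder that turns windows of muscle signals into discrete commands. It is not Meta’s published model; the channel count, features, gesture set, synthetic data, and nearest-centroid classifier are all illustrative assumptions.

    # Hypothetical sEMG decoding sketch, not Meta's model: muscle-signal
    # windows -> simple per-channel features -> nearest-centroid gestures.
    import numpy as np

    rng = np.random.default_rng(0)
    N_CH, WIN = 8, 200                     # 8 electrodes, 200-sample windows (assumed)
    GESTURES = ["rest", "pinch", "swipe"]  # hypothetical command set

    def features(window):
        """Classic EMG features per channel: mean absolute value + zero crossings."""
        mav = np.abs(window).mean(axis=1)
        zc = (np.diff(np.sign(window), axis=1) != 0).sum(axis=1)
        return np.concatenate([mav, zc])

    def synth(gesture_id, n):
        """Fake training data: each gesture energizes a different channel pair."""
        x = rng.normal(0.0, 0.1, (n, N_CH, WIN))
        x[:, 2 * gesture_id : 2 * gesture_id + 2, :] += rng.normal(0.0, 1.0, (n, 2, WIN))
        return x

    # "Train" by averaging each gesture's feature vectors into a centroid.
    centroids = {
        g: np.mean([features(w) for w in synth(i, 50)], axis=0)
        for i, g in enumerate(GESTURES)
    }

    def decode(window):
        """Map one signal window to the command with the nearest centroid."""
        f = features(window)
        return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))

    print(decode(synth(1, 1)[0]))  # expected: "pinch"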

🧤 Sharp Opens Pre-Registration for Prototype VR Haptic Controller

  • Sharp is accepting early sign-ups for a dual-hand VR haptic controller that enables users to feel objects in virtual environments.
  • Why this matters: Sharp is building the device in the open and involving customers from the earliest stages. The goal is to use that feedback to shape the product roadmap and deliver a device that fits the market from day one.

SIMULATED WORLDS

⚡ Meta Updates Haptics Studio With New Design and Creative Tools

  • Meta has redesigned its Haptics Studio to improve usability and speed up tactile content creation.
  • Why this matters: As MR and VR aim to immerse users by making digital environments feel real, they need more than just visuals. Haptics add a tactile layer that enhances presence and brings the sense of touch into the experience.

⚛️ BQP Raises $5M to Scale Quantum-Enhanced Digital Twin Platform

  • BQP has secured $5 million in seed financing to expand BQPhy, its digital twin platform designed to run across CPUs, GPUs, and quantum systems.
  • Why this matters: Quantum computing won’t just let digital twins run simulations faster; it could help these systems spot patterns conventional computing misses and flag problems before they occur.

PERCEPTION SYSTEMS

🔦 Amazon Investment Fuels Lumotive’s Push to Replace Bulky Optics With Smart Chips

  • Lumotive's Light Control Metasurface (LCM) chip uses software to steer light, eliminating mechanical parts in next-generation sensing systems.
  • Why this matters: Lumotive is transforming how machines see by making optical systems smaller and easier to scale. Its chip steers light in software, so performance can be tuned without changing the hardware; a simplified sketch of the principle follows.
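
For intuition, here’s a tiny sketch of the general physics behind software-defined beam steering in optical phased arrays and metasurfaces. It is not Lumotive’s actual design; the wavelength and element pitch below are assumed values.

    # General beam-steering physics, not Lumotive's design: writing a linear
    # phase gradient across the elements re-aims the beam with no moving parts.
    import numpy as np

    WAVELENGTH = 905e-9  # m, a common lidar wavelength (assumption)
    PITCH = 400e-9       # m, spacing between elements (hypothetical)

    def steering_angle_deg(phase_step_rad):
        """Grating relation: sin(theta) = wavelength * dphi / (2 * pi * pitch)."""
        return np.degrees(np.arcsin(WAVELENGTH * phase_step_rad / (2 * np.pi * PITCH)))

    # Re-aiming the sensor is a software write, not a motor movement:
    for dphi in (0.0, 0.5, 1.0):
        print(f"phase step {dphi:.1f} rad -> beam at {steering_angle_deg(dphi):.1f} deg")
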
SOCIETY & CULTURE

⛪ Notre Dame to Be Preserved as a Digital Twin in Microsoft-Led Project

  • Microsoft and Iconem will create a detailed digital replica of Notre Dame in partnership with the French Ministry of Culture.
  • Why this matters: By preserving the real world in digital form, spatial technology opens new possibilities, including supporting virtual tourism, aiding planning, advancing historical research, and creating lasting records of today’s physical reality.

🌀 Tom's Take

Unfiltered POV from the editor-in-chief.

I never thought I would say this, but I feel more comfortable getting in a car with a robot at the wheel than I do with a human. After months of riding a Waymo here in San Francisco, it has become my preferred way to travel.

I've had too many ride-sharing trips with Black Ice car fresheners, drivers talking incessantly on calls for the entire ride, and "comfort" rides that end up being hot and chatty rather than cool and quiet as requested.

Robotaxis are consistent, quiet, and offer an environment that is entirely under my control. These benefits quickly overrode the scary truth that there is no one at the wheel. And believe me, watching a car drive itself down a San Francisco hill is one hell of a ride.

While it is comfort and convenience for me, for many, it is also safety. I’ve talked to a number of women who say they feel more secure riding in self-driving cars. They’re not alone. Uber recently launched a feature in select US cities that lets women riders match with women drivers, a move that highlights ongoing safety concerns among female users.

Parents are also starting to choose Waymo to take their kids to school. In Phoenix, Waymo now allows teens aged 14–17 to take solo rides using linked family accounts.

There’s a lot to unpack when we already trust robots over humans. It says something about the state of society and the human-run services we rely on. But it also raises a bigger question: are we embracing automation too quickly simply because the human alternative has let us down?


🔮 What’s Next

3 signals pointing to what’s coming next.

  1. Robotaxis are rolling off the line
    Self-driving taxis are becoming real products with production pipelines, deployment targets, and nationwide ambitions. Pony.ai’s Gen-7 robotaxis just hit the streets after entering mass production with GAC and BAIC, cutting hardware costs by 70% and using fully automotive-grade components. Waymo crossed 100 million autonomous miles and is now operating in five US cities with two more on the horizon. Uber, in partnership with Lucid and Nuro, plans to bring 20,000 AI-powered robotaxis to global markets over the next six years. Add it all up, and more autonomous vehicles will be ready to hail in more places for your next ride.
  2. Privacy is the pause button on spatial computing
    No matter how impressive the tech, spatial computing tools won’t be fully adopted unless people feel safe using them. A new study in the Journal of Retailing and Consumer Services shows that privacy concerns significantly reduce the effectiveness of AR try-on apps, undermining user confidence and purchase intent by increasing cognitive strain. Similarly, a YouGov/Omdia survey found that 66% of Americans feel uneasy riding in autonomous vehicles, with safety and data concerns topping the list. The success of immersive and autonomous tech depends as much on emotional design and transparency as it does on technical breakthroughs.
  3. The physical world is AI’s next frontier
    The next leap in AI won’t come from bigger models alone. It’ll come from giving those models access to the real world. Amazon’s acquisition of Bee, a startup building ambient AI wearables, reflects a growing trend of AI agents that can see, hear, and respond to the physical environment in real time. BrightAI is pushing this further, layering sensors and edge AI into infrastructure systems like HVAC and water to create a real-time OS for the built world. Whether it’s a wrist-worn assistant or an invisible maintenance layer, for AI to evolve, it needs spatial context.

🔓 You’ve unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what’s next and why it matters.

📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!