🔓 Remix Reality Insider: Simulation Goes Scalable
Your premium drop on the systems, machines, and forces reshaping reality.
🛰️ The Signal
This week's defining shift.
Immersive training is real-world ready.
Simulation has always been one of XR's core capabilities: AR and VR were practically made for training, and the use case has a long track record. What's changing now is scale. Pilots are turning into full deployments, with major organizations across industries embracing XR as a core part of their training programs. The shift comes as early adopters begin sharing real data on effectiveness, and the results show XR outperforming traditional methods, making it a clear contender for the future of corporate education.
This week's spatial computing news surfaced signals like these:
- Nightingale College found that VR training led to a 5.9% increase in test scores and a 22% reduction in assignment completion time, while also cutting costs by 40%.
- Miami University launched an AI-powered XR training program for manufacturing workers, funded by a $1.5M state grant. The system uses generative AI and AR/VR to deliver personalized safety simulations and real-time coaching.
- NASA tested mixed reality flight sims with 12 pilots inside the world's largest vertical motion simulator. The study aims to help regulators certify MR tools for widespread adoption.
- Brussels Airlines became the first in the Lufthansa Group to use VR as a standalone tool in pilot training. It replaces screen-based cockpit tools and is approved for use in official training.
- West Midlands Police, in partnership with Calico, launched a VR experience that places new recruits in a woman's shoes to build empathy around gender-based violence. The initiative is expected to reach about 700 recruits by spring 2026.
Why this matters: XR is proving to be a serious upgrade over traditional screen-based training. It delivers better outcomes, reduces costs, and scales in ways that older methods simply can't. Major organizations, such as NASA and Lufthansa, are adopting XR and actively measuring its impact, solidifying it as a core part of their operations. The signal is clear: XR training is ready to scale across industries.
🧠 Reality Decoded
Your premium deep dive.
In the era of spatial computing, you're no longer on the outside looking in. The screen fades away, and you step inside, not just as a user, but as an integral part of the system itself.
This shift redefines your role in computing. Here's how:
- You, the Input: Spatial computing needs you to function. Your presence provides the context: where you are, what you're doing, who you're with. It collects data from your body and surroundings to make sense of the moment and to place content meaningfully in your space. Without you, the system has nothing to see, understand, or respond to.
- You, the Controller: Wearables transform your body into the interface. You control systems with your hands, navigate with your eyes, and activate actions with your voice. There's no need to swipe or tap; just move naturally and speak as you normally would. You're no longer operating devices with peripherals but have become the control system itself.
- You, the Agent: Even in a world powered by AI, you're still in the loop. You assign tasks, set direction, and shape outcomes. Whether it's guiding robots, charting your own course in a virtual world, or telling autonomous vehicles where to go, the system is taking cues from you as the one driving it.
As this new layer of computing settles into our lives, the most powerful interface won't be the hardware we build or the software that powers it; it will be us, augmented, aware, and in control.
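To make that loop concrete, here is a toy sketch of a single frame in the cycle: your presence supplies context, your body signals become commands, and your intent steers the system. It is purely illustrative; every name, field, and gesture below is hypothetical, not any vendor's API.

```python
from dataclasses import dataclass

@dataclass
class UserContext:
    """You, the Input: the presence data the system senses each frame."""
    position: tuple[float, float, float]    # where you are
    gaze_target: str | None                 # what you're looking at
    hand_pose: str                          # e.g. "pinch", "open", "point"
    voice_command: str | None               # what you just said

def interpret(ctx: UserContext) -> list[str]:
    """You, the Controller: body signals map directly to system actions."""
    actions = []
    if ctx.hand_pose == "pinch" and ctx.gaze_target:
        actions.append(f"select:{ctx.gaze_target}")    # eyes aim, hands confirm
    if ctx.voice_command:
        actions.append(f"intent:{ctx.voice_command}")  # you set the direction
    return actions

# You, the Agent: nothing happens until you supply the context.
frame = UserContext(position=(0.0, 1.6, 0.0), gaze_target="door",
                    hand_pose="pinch", voice_command="open it")
print(interpret(frame))    # -> ['select:door', 'intent:open it']
```

Real spatial platforms run this loop dozens of times per second and fuse far richer sensor data, but the division of labor is the same: no user context in, no meaningful output.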
Key Takeaway:
Spatial computing depends on you. It uses your presence for context, your body for control, and your intent to guide actions. You are not a bystander; you are the system's foundation.
📡 Weekly Radar
Your weekly scan across the spatial computing stack.
🤖 UBTECH Debuts Humanoid Robot That Powers Itself Around the Clock
- UBTECH's industrial humanoid robot now performs self-directed, hot-swappable battery changes.
- Why this matters: No matter how fast spatial computing evolves, energy remains the constraint. UBTECH's approach flips the problem, not by extending battery life, but by teaching robots to handle recharging themselves.
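UBTECH hasn't published its control logic, but the core idea, a robot that schedules its own battery swap instead of waiting on an operator, can be sketched as a tiny state machine. Everything below (the states, the 15% threshold) is invented for illustration:

```python
from enum import Enum, auto

class State(Enum):
    WORKING = auto()
    GO_TO_DOCK = auto()
    SWAPPING = auto()

SWAP_THRESHOLD = 0.15    # head for the dock below 15% charge (invented value)

def step(state: State, charge: float) -> State:
    """One tick of a self-recharging policy: the robot decides on its own
    when to pause work, swap packs, and resume."""
    if state is State.WORKING and charge < SWAP_THRESHOLD:
        return State.GO_TO_DOCK      # robot self-directs to the swap station
    if state is State.GO_TO_DOCK:
        return State.SWAPPING        # dock reached: hot-swap begins
    if state is State.SWAPPING:
        return State.WORKING         # fresh pack installed, back on task
    return state

state = State.WORKING
for charge in (0.40, 0.12, 0.12, 0.90):   # simulated charge readings
    state = step(state, charge)
    print(f"{charge:.2f} -> {state.name}")
```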
🐝 Amazon Acquires AI Wearable Startup Bee
- Amazon has acquired Bee, a company focused on personal, ambient intelligence, according to a LinkedIn post announcing the deal.
- Why this matters: Wearable tech is proving to be the ears and eyes of AI. In this case, Bee gives AI the ability to hear what is going on in the physical world, making it smarter and more helpful.
🧠 Meta Publishes Groundbreaking Research on Neural Wrist Input
- Meta's Reality Labs detailed a wrist-worn sEMG (surface electromyography) system that translates muscle signals into device commands, enabling seamless human-computer interaction.
- Why this matters: As computers move into the space around us, our interactions are expected to feel more natural and use more of our bodies. Meta is betting on EMG to help make that shift.
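Meta's system relies on learned decoders trained on data from thousands of participants, so the real pipeline is far more sophisticated. But as rough intuition for how muscle activity becomes a command, here is a toy version: window the signal, measure per-channel RMS energy, and threshold it into a gesture. The channel-to-gesture mapping and threshold are invented for illustration, not Meta's method.

```python
import numpy as np

def rms_features(emg: np.ndarray, win: int = 200) -> np.ndarray:
    """Slice a multi-channel sEMG stream into fixed windows and compute
    per-channel RMS, a classic feature for muscle-activity decoding."""
    n_win = emg.shape[0] // win
    windows = emg[: n_win * win].reshape(n_win, win, emg.shape[1])
    return np.sqrt((windows ** 2).mean(axis=1))    # shape: (n_win, n_channels)

def decode(features: np.ndarray, threshold: float = 0.5) -> list[str]:
    """Map each window to a command: the most active channel wins,
    if it clears a noise threshold; otherwise the wrist is at rest."""
    gestures = ["pinch", "swipe", "point"]         # one invented gesture per channel
    out = []
    for f in features:
        ch = int(f.argmax())
        out.append(gestures[ch] if f[ch] > threshold else "rest")
    return out

# Two seconds of fake 3-channel sEMG sampled at 1 kHz.
rng = np.random.default_rng(0)
emg = rng.normal(0.0, 0.1, size=(2000, 3))
emg[400:600, 0] += 1.0                             # simulated "pinch" burst on channel 0
print(decode(rms_features(emg)))                   # mostly "rest", one "pinch" window
```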
🧤 Sharp Opens Pre-Registration for Prototype VR Haptic Controller
- Sharp is accepting early sign-ups for a dual-hand VR haptic controller that enables users to feel objects in virtual environments.
- Why this matters: Sharp is building the device in the open and involving customers from the earliest stages. The goal is to use their feedback to shape the product roadmap and ship a device that fits the market from day one.
⥠Meta Updates Haptics Studio With New Design and Creative Tools
- Meta has redesigned its Haptics Studio to improve usability and speed up tactile content creation.
- Why this matters: As MR and VR aim to immerse users by making digital environments feel real, they need more than just visuals. Haptics add a tactile layer that enhances presence and brings the sense of touch into the experience.
⚛️ BQP Raises $5M to Scale Quantum-Enhanced Digital Twin Platform
- BQP has secured $5 million in seed financing to expand BQPhy, its digital twin platform designed to run across CPUs, GPUs, and quantum systems.
- Why this matters: Quantum computing will not only let digital twins run simulations faster but also enable them to identify patterns we have never been able to catch and respond before problems occur.
📦 Amazon Investment Fuels Lumotive's Push to Replace Bulky Optics With Smart Chips
- Lumotive's Light Control Metasurface (LCM) chip utilizes software to steer light, eliminating mechanical parts in next-generation sensing systems.
- Why this matters: Lumotive is transforming how machines see by making optical systems smaller and easier to scale. Its chip uses software to control light, so performance can be tuned without changing the hardware.
⪠Notre Dame to Be Preserved as a Digital Twin in Microsoft-Led Project
- Microsoft and Iconem will create a detailed digital replica of Notre Dame in partnership with the French Ministry of Culture.
- Why this matters: By preserving the real world in digital form, spatial technology opens new possibilities, including supporting virtual tourism, aiding planning, advancing historical research, and creating lasting records of today's physical reality.
🎙 Tom's Take
Unfiltered POV from the editor-in-chief.
I never thought I would say this, but I feel more comfortable getting in a car with a robot at the wheel than I do with a human. After months of riding a Waymo here in San Francisco, it has become my preferred way to travel.
I've had too many ride-sharing trips with Black Ice car fresheners, drivers talking incessantly on calls for the entire ride, and "comfort" rides that end up being hot and chatty rather than cool and quiet as requested.
Robotaxis are consistent, quiet, and provide an environment that is all under my control. These benefits quickly overrode the scary reality that there is, in fact, no one at the wheel. And believe me, watching a car drive itself down a San Francisco hill is one hell of a ride.
While it is comfort and convenience for me, for many, it is also safety. I've talked to a number of women who say they feel more secure riding in self-driving cars. They're not alone. Uber recently launched a feature in select US cities that lets women riders match with women drivers, a move that highlights ongoing safety concerns among female users.
Parents are also starting to choose Waymo to take their kids to school. In Phoenix, Waymo now allows teens aged 14–17 to take solo rides using linked family accounts.
There's a lot to unpack when we already trust robots over humans. It says something about the state of society and the services run by it. But it also raises a bigger question: are we embracing automation too quickly simply because the human alternative has let us down?
🔮 What's Next
3 signals pointing to what's coming next.
- Robotaxis are rolling off the line
Self-driving taxis are becoming real products with production pipelines, deployment targets, and nationwide ambitions. Pony.ai's Gen-7 robotaxis just hit the streets after entering mass production with GAC and BAIC, cutting hardware costs by 70% and using fully automotive-grade components. Waymo crossed 100 million autonomous miles and is now operating in five US cities with two more on the horizon. Uber's partnership with Lucid and Nuro plans to bring 20,000 AI-powered robotaxis to global markets over the next six years. This all spells more autonomous vehicles ready to be hailed for your next ride in more places.
- Privacy is the pause button on spatial computing
No matter how impressive the tech, spatial computing tools won't be fully adopted unless people feel safe using them. A new study in the Journal of Retailing and Consumer Services shows that privacy concerns significantly reduce the effectiveness of AR try-on apps, undermining user confidence and purchase intent by increasing cognitive strain. Similarly, a YouGov/Omdia survey found that 66% of Americans feel uneasy riding in autonomous vehicles, with safety and data concerns topping the list. The success of immersive and autonomous tech depends as much on emotional design and transparency as it does on technical breakthroughs.
- The physical world is AI's next frontier
The next leap in AI won't come from bigger models alone. It'll come from giving those models access to the real world. Amazon's acquisition of Bee, a startup building ambient AI wearables, reflects a growing trend of AI agents that can see, hear, and respond to the physical environment in real time. BrightAI is pushing this further, layering sensors and edge AI into infrastructure systems like HVAC and water to create a real-time OS for the built world. Whether it's a wrist-worn assistant or an invisible maintenance layer, for AI to evolve, it needs spatial context.
🔓 You've unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what's next and why it matters.
📬 Make sure you never miss an issue! If you're using Gmail, drag this email into your Primary tab so Remix Reality doesn't get lost in Promotions. On mobile, tap the three dots and hit "Move to > Primary." That's it!