🔓 Remix Reality Insider: This is not a demo
Your premium drop on the systems, machines, and forces reshaping reality.
🛰️ The Signal
This week's defining shift.
Modularity is emerging as the fast lane to autonomy.
Rather than relying on purpose-built systems, a growing number of companies are layering intelligence onto existing machines through retrofit kits, bolt-on software, or upgradable platforms. This modular approach isn't just flexible. It reduces friction, leverages what's already in the field, and speeds adoption by meeting industries and customers where they are.
This week's spatial computing news surfaced signals like these:
- Bedrock Robotics has raised $80M and exited stealth with its retrofit approach to heavy equipment: reversible kits that deliver full autonomy in a single day, transforming existing fleets into intelligent machines.
- Pronto's acquisition of SafeAI combines fast-deploy autonomy with certified safety systems, offering a tiered solution for off-road automation. Its retrofit kits and smartphone-based control app minimize infrastructure changes and speed deployment.
- Lucid Motors is activating hands-free driving through software updates for vehicles ordered with its advanced sensor hardware. This shows how modular upgrades, when designed from the outset, can extend autonomy over time.
Why this matters: Modularity reframes autonomy as something you can layer on, not something you must build from scratch. It works with the machines already in play, the systems already running, and the habits already formed. That's how you move fast: not by replacing everything, but by building on what's already moving.
🧠 Reality Decoded
Your premium deep dive.
What if the next computing cycle isn't sparked by one device, but by a wave?
Over the past few months, reports and speculative roadmaps have suggested a startling number of headworn devices are set to launch between now and 2028. Apple, Meta, Google, Samsung, ByteDance, Snap, and others are all in the mix. And while these products differ in form, function, and capabilities, they share one thing: they're all racing to claim space on your face.
Here are just a few highlights from our most recent deep dive, The Great Face Race: A Rundown of Reported and Rumored Devices:
- Apple is reportedly working on multiple Vision Pro updates, including a lighter "Vision Air," as well as its first AI smartglasses, all rumored to land between now and 2028.
- Meta is reportedly releasing AI glasses with a display and may be working on a lightweight XR headset, which, if the rumors are true, is expected to resemble goggles more closely than a traditional headset.
- Samsung and Google's Android XR headset, Project Moohan, is reportedly set to become available in October of this year, and Google's AI glasses in partnership with Gentle Monster and Warby Parker are also anticipated.
- ByteDance is said to be shifting focus from VR to MR, possibly launching a pair of lightweight goggles, in addition to working on its own pair of AI glasses, according to reports.
Nearly every major tech company is said to have multiple headworn devices on the horizon.
It's starting to look like a platform wave is coming together in slow motion. New categories don't just arrive with a single keynote moment. They arrive the way smartphones did, piece by piece, brand by brand, device by device, until one day, smartglasses are just... normal.
Key Takeaway:
More than a dozen headworn devices are expected to launch by 2028, according to various reports. The race for the face is happening now, and the volume of activity suggests smartglasses aren't just coming. They're inevitable.
📡 Weekly Radar
Your weekly scan across the spatial computing stack.
🤖 Augmentus Secures $11M to Expand No-Code Robotic Systems
- Augmentus builds no-code software that programs robots to automate welding and surface finishing in factories with constantly changing parts.
- Why this matters: Robotics won't scale if it requires specialists. Augmentus lowers the barrier with no-code software that makes automation accessible to the teams already on the floor.
🩺 Robot Completes First Autonomous Gallbladder Surgery on Lifelike Model
- A Johns Hopkins robot successfully carried out a full gallbladder removal without human control.
- Why this matters: Spatial computing is rapidly reshaping the operating room, enabling surgeons to perform procedures remotely and access medical data hands-free.
📚 Pearson Opens Innovation Hub for AI and Immersive Learning R&D
- Pearson has launched "Pearson Lab" to develop scalable learning solutions using generative AI and immersive technologies.
- Why this matters: Pearson's new lab brings together researchers and industry partners to explore how emerging technologies can unlock new tools and methods for learning.
✈️ Lufthansa Cargo Develops Virtual Training for Aircraft Loading Operations
- Lufthansa Cargo is developing a VR training program to help aircraft loading supervisors learn safety and handling procedures.
- Why this matters: Training is one of VR's strongest use cases, and Lufthansa's adoption demonstrates how it can provide a more flexible, cost-effective, and eco-friendly alternative to traditional methods, requiring only a headset and some space.
🧊 TRELLIS Launches in Foundry Labs, Bringing Generative 3D to More Creators
- TRELLIS generates editable 3D models from text or image prompts using Microsoft's new SLAT format. It is now available to more creators in the Azure Foundry Labs portal.
- Why this matters: Generative 3D is evolving quickly, and tools like TRELLIS remove its biggest barrier (asset creation), helping developers go from idea to prototype faster and potentially secure funding sooner.
🔊 Meta Brings XR Audio SDK to Wwise, Enhancing Mixed Reality Workflows
- Meta's spatial audio SDK is now available as a plug-in for Wwise, supporting immersive sound design in mixed reality apps.
- Why this matters: Spatial audio is essential for XR, and Meta's integration with Wwise makes it more accessible while pushing the industry toward richer, more personalized sound experiences.
📷 RealSense Spins Out from Intel with $50M to Scale Vision AI for Robotics and Biometrics
- RealSense has spun out from Intel and raised $50 million in Series A funding from Intel Capital, MediaTek Innovation Fund, and others.
- Why this matters: RealSense helped pioneer computer vision, and now, with fresh capital and spatial computing gaining steam, it's poised to scale just as the market catches up.
🚗 Tobii Earns Approval for Single-Camera Vehicle Sensing Platform
- Tobii's single-camera interior sensing platform will go into production with a premium European automaker starting in late 2025.
- Why this matters: Certification in Europe shows that even traditional cars are embracing spatial computing, using sensors to boost safety, comfort, and in-cabin experience.
🎬 Meta Projects Sweep Emerging Media Emmy Nominations
- All three nominees for Outstanding Emerging Media Program were backed by Meta and created for the Meta Quest platform.
- Why this matters: XR recognition at the Emmys marks a big step toward mainstream acceptance, with this year's VR and MR nods highlighting a clear win for Meta's broader content push.
📝 Tom's Take
Unfiltered POV from the editor-in-chief.
Every time I see a viral robot video now, I pause and ask: Is this thing actually real?
Not in the CGI sense; we've mostly moved past that. I mean, is the robot autonomous? Is it truly acting on its own? Or is someone just off-camera with a joystick?
Unitree's humanoid robots are the latest to trigger this response. You may have seen one in particular, "Jake the Rizzbot", strutting down the streets of the West Village with a pride flag or popping up in Austin wearing a cowboy hat, dancing, waving, and striking up conversations. The clips of Jake are racking up millions of views and flooding my TikTok algorithm. And while the robot appears to be running on its own and responding to questions, what's often missing in these videos is the full context: this bot is being remotely controlled.
It reminds me of when Tesla first "unveiled" its Optimus robot on stage…by having a human in a suit do a dance routine. Or when clips of its Optimus robots "bartending" at the CyberCab event went viral, only for it to be revealed that they were teleoperated.
I get it. This ambiguity is part of the moment we're in. We've spent so long imagining intelligent machines that now, when one actually shows up in the real world, we want to believe it's the real deal. We want to believe we're living in the future. But believing too easily risks eroding public trust, especially when the line between demo and deception gets too thin.
The truth is, real progress is already happening. Fully autonomous robots are out in the world, quietly doing real work, from hospital deliveries to heavy machinery. But these staged performances risk blurring the line. When some robots are just wizards behind the curtain and others are the real thing, it gets harder for people to know what to believe. And that disbelief threatens to overshadow the breakthroughs that actually matter.
That all being said, there's something undeniably thrilling about watching the fiction become fact, even if someone's holding the controller just out of frame.
🔮 What's Next
3 signals pointing to what's coming next.
- Wearables are performance amplifiers
Wearable tech isn't just tracking our biometrics. It is actively improving how people perform. FORM's AR swim goggles coach athletes mid-stroke, helping them adjust in real time. Korean Air is using Hyundai and Kia's new X-ble Shoulder wearable robot to reduce strain during aerospace maintenance, easing muscle effort by up to 30%. These accessories are adaptive systems that help humans perform better and train smarter.
- Spatial tech is scaling in healthcare
Hospitals and care facilities are moving past pilots into real deployment of spatial technologies. Moxi has now made over 300,000 autonomous pharmacy runs across the U.S., streamlining one of the most tedious tasks in clinical care. Augmedics has crossed 10,000 AR-assisted spine surgeries, with six surgeons at a single hospital using the system daily. And MyndXR is live in 150 senior communities, delivering immersive therapy to over 45,000 residents. The use of these technologies goes beyond research. They are being used for real care, and it's only just beginning.
- Digital twins are becoming real-time decision engines
Simulation is stepping out of the lab and into live operations. UK grocer Morrisons is building a dynamic digital twin of its entire supply chain, letting teams test and adapt logistics without ever touching a box. Foxconn hospitals in Taiwan are using digital twins to train robots, model wards, and optimize patient flows. And a new U.S. center led by U-M and ASU aims to standardize digital twin tech across manufacturing. The vision is clear: a world where every system has a virtual counterpart, always watching, always ready to help make the next move.
🔓 You've unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what's next and why it matters.
📬 Make sure you never miss an issue! If you're using Gmail, drag this email into your Primary tab so Remix Reality doesn't get lost in Promotions. On mobile, tap the three dots and hit "Move to > Primary." That's it!
Disclosure: Tom Emrich has previously worked with or holds interests in companies mentioned. His commentary is based solely on public information and reflects his personal views.