🔓 Remix Reality Insider: The Age of Upgradable Reality
Your premium drop on the systems, machines, and forces reshaping reality.
🛰️ The Signal
This week’s defining shift.
Software is turning installed hardware into new machines.
A growing share of progress in spatial computing is no longer coming from new devices, but from new software unlocking capabilities inside hardware people already own. Performance, reach, and behavior are being rewritten after deployment.
This week’s news surfaced signals like these:
- Bigscreen Beyond 2e received Dynamic Foveated Rendering in early access, using built-in eye tracking to concentrate rendering where the user is looking. Without changing the headset, Bigscreen delivered smoother frame rates, higher visual quality, and lower GPU load.
- Rivian expanded hands-free assisted driving from 135,000 to over 3.5 million miles of road across the U.S. and Canada with a single software update, extending assisted driving beyond highways and introducing new AI driving modes that let drivers choose how the system accelerates, follows, and changes lanes.
- Meta’s AI Glasses gained new perception and interaction modes in its latest software update, now available to early access users in Canada and the U.S., including voice amplification in noisy environments and multimodal Spotify playback driven by what the user sees.
Why this matters: Hardware used to set the ceiling. Now the ceiling moves. Through software, devices gain new abilities after they ship. The real product is no longer the device itself, but what it becomes over time.
🧠 Reality Decoded
Your premium deep dive.
This week, we zoom in on quantum dots, one of the most important and least understood technologies shaping the future of XR displays.
Quantum dots are tiny semiconductor crystals that emit bright, highly saturated light when energized. Their color depends on their size: smaller dots emit bluer light, larger dots redder. That lets manufacturers tune them with nanometer-level precision. For display makers, this means stronger color, brighter images, and better efficiency than conventional display materials typically offer.
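The size-to-color relationship can be sketched with the Brus effective-mass approximation. The CdSe material parameters below (bulk band gap, effective masses, permittivity) are typical literature values we've assumed for illustration; they are not from the research covered here.

```python
# Illustrative sketch: why a quantum dot's emission color depends on its size.
# Uses the Brus effective-mass approximation with assumed CdSe-like parameters.
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M0   = 9.1093837015e-31  # electron rest mass, kg
E_CH = 1.602176634e-19   # elementary charge, C (also J per eV)
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
HC_EV_NM = 1239.84       # photon energy * wavelength, eV*nm

def brus_emission_nm(radius_nm,
                     bulk_gap_ev=1.74,    # approx. CdSe bulk band gap, eV
                     m_e=0.13, m_h=0.45,  # effective masses, units of M0
                     eps_r=10.6):         # relative permittivity
    """Approximate emission wavelength (nm) of a dot of the given radius."""
    r = radius_nm * 1e-9
    # Quantum-confinement term: grows as 1/r^2, widening the gap for small dots.
    confinement = (HBAR**2 * math.pi**2) / (2 * r**2 * M0) * (1/m_e + 1/m_h)
    # Electron-hole Coulomb attraction: narrows the gap slightly.
    coulomb = 1.8 * E_CH**2 / (4 * math.pi * EPS0 * eps_r * r)
    energy_ev = bulk_gap_ev + (confinement - coulomb) / E_CH
    return HC_EV_NM / energy_ev

for radius in (1.5, 2.0, 3.0):
    print(f"r = {radius} nm -> ~{brus_emission_nm(radius):.0f} nm emission")
```

Shrinking the radius by a nanometer shifts emission from orange toward blue, which is why nanometer-level manufacturing precision translates directly into color control.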
For phones and TVs, quantum dots already matter. For XR, they are mission-critical.
Putting a display on your face is unforgiving. It needs high brightness, low power use, and pixel density that holds up at close range. Very few display systems can meet all three.
This week, researchers at Pusan National University cleared one of the biggest remaining obstacles. They demonstrated a scalable method for building quantum dot displays above 10,000 pixels per inch without degrading brightness or color performance. Their process avoids the damage that normally occurs during high-resolution patterning, while improving energy efficiency and nearly tripling display lifetime.
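To see why five-digit pixel density is the threshold that matters, here is a back-of-the-envelope sketch. The 60 pixels-per-degree acuity target, 100° field of view, and 0.5-inch panel width are our illustrative assumptions, not numbers from the study.

```python
# Back-of-the-envelope sketch (assumed numbers, not the paper's):
# estimate the pixel density a near-eye microdisplay needs so the image
# still looks sharp after the headset optics magnify it across the FOV.

def required_ppi(fov_deg, panel_width_in, ppd_target=60.0):
    """Pixels per inch needed for ppd_target pixels per degree.
    ~60 ppd is a common approximation of human visual acuity."""
    pixels_across = fov_deg * ppd_target   # horizontal pixels required
    return pixels_across / panel_width_in

# A hypothetical 0.5-inch-wide microdisplay stretched over a 100-degree FOV:
print(f"{required_ppi(100, 0.5):,.0f} PPI")  # -> 12,000 PPI
```

Under these assumptions a half-inch panel needs roughly 12,000 PPI to saturate the eye, which is why a scalable process above 10,000 PPI is a meaningful milestone rather than a spec-sheet flex.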
Three ideas matter here.
- Resolution is the gatekeeper for wearable XR: Smartglasses and VR headsets live inches from the eye. Anything below extreme pixel density breaks immersion and causes eye strain. The Pusan team’s approach finally reaches the pixel density that wearable XR demands.
- Color and brightness determine realism: If the colors are off or the display cannot get bright enough, the illusion breaks. Quantum dots solve both. That is why they matter for daylight AR, convincing VR, and wearables that actually work outside.
- Manufacturability unlocks the market: The tech only matters if it can be built at scale. This process fits into existing manufacturing and supports full-color displays, which is what finally moves quantum dots from the lab into real consumer devices.
Key Takeaway:
Displays are a bottleneck for XR. Quantum dots are one of the few technologies capable of breaking it. When you combine extreme resolution, brightness, efficiency, and manufacturability, you are no longer improving screens. You are enabling an entirely new class of wearable computers.
📡 Weekly Radar
Your weekly scan across the spatial computing stack.
🤖 Boston Dynamics’ Atlas to Make First Public Appearance at CES 2026
- Hyundai Motor Group will unveil its AI Robotics Strategy at CES 2026, and Boston Dynamics will bring the new Atlas robot out of the lab and onto the stage for the first time.
- Why this matters: Atlas stepping onto a public CES stage marks a clear shift from controlled lab demos to real-world scrutiny. All eyes will be on Boston Dynamics’ latest robot as it moves into the spotlight.
🏠 MyCo Robots Extend Comau’s Collaborative Push in Manufacturing
- The six-model MyCo series offers reach from 590 mm to 1,300 mm and payload capacities between 3 kg and 15 kg.
- Why this matters: In manufacturing, speed to reconfigure is just as critical as speed to deploy. MyCo’s modular build, fast programming tools, and task-ready form factor are built for both.
💰 Galbot Raises $300M, Reaches $3B Valuation with Global Investor Backing
- Galbot secured over $300 million in new funding, bringing total funding to $800 million and valuing the company at $3 billion.
- Why this matters: Investors are backing Galbot’s full-stack control and real-world adoption, signaling growing confidence in embodied AI at an industrial scale.
🪡 TARS Robot Crosses 'Impossible to Automate' Threshold with Embroidery Demo
- TARS Robotics publicly demonstrated a humanoid robot completing hand embroidery with both hands in real time.
- Why this matters: Beyond embroidery, this breakthrough is all about robotic control. TARS proves that when you close the loop between sensing, learning, and movement, robots can finally take on the delicate, flexible work that once seemed untouchable.
🎧 Meta AI Glasses Update Adds Voice Boost and Multimodal Spotify Playback
- New software feature amplifies voices to improve conversations in noisy environments.
- Why this matters: The new Spotify feature turns what you see into inspiration for a personalized soundtrack. A strong example of the power of multimodal AI to bring the physical world into a digital experience.
🕶️ Bigscreen Brings Dynamic Foveated Rendering to Beyond 2e in Early Access
- Bigscreen has released Dynamic Foveated Rendering in early access for the Beyond 2e headset through the Bigscreen Beyond Utility.
- Why this matters: This update gives Beyond headset users more performance without the need for new hardware. Games can run smoother, look better, and stay comfortable longer, which is exactly what most VR needs more of.
💄 Banuba Boosts Face AR SDK with Precision Backgrounds and Facial Contour Mapping
- Banuba has enhanced its Face AR SDK with cleaner virtual backgrounds and a new face shape detection module.
- Why this matters: If your segmentation’s messy or your face data’s off, the whole AR experience falls apart. Clean edges and accurate detection are the baseline for anything immersive.
🧑‍🚀 XPANCEO Demonstrates Space-Compatible Smart Contact Lens
- XPANCEO has unveiled a smart contact lens prototype for space, integrated with a working space suit to provide hands-free visual data directly to the eye.
- Why this matters: Smart contact lenses make sense where smart glasses don’t, like racetracks, spacewalks, or anywhere bulky hardware gets in the way. By putting AR in the eye rather than on the face, XPANCEO aims to serve exactly those cases.
🍎 Apple’s SHARP Turns a Single Image into a 3D Scene in Under a Second
- SHARP creates a detailed 3D scene from one photo using a fast neural network that runs in real time on standard hardware.
- Why this matters: SHARP shows how Apple could move beyond stereo capture toward full 3D scene reconstruction, potentially unlocking spatial photos from any image, not just ones taken with depth-enabled cameras.
🎥 mimic-video Uses Pretrained Video Models to Improve Robot Learning Efficiency by 10x
- mimic-video is a video-action model that reduces data needs for robot learning.
- Why this matters: mimic-video uses video models to simplify robot learning, showing that visual prediction can reduce the need for large teleoperation datasets.
🌀 Tom's Take
Unfiltered POV from the editor-in-chief.
One of the first killer applications of XR is emerging around something very human: bringing families together when they cannot be in the same place. As the holidays arrive, we become especially aware of distance. XR offers a way to collapse that distance by making presence feel shared again, even when travel is impossible.
The technology is already coming together to make this a reality. The big headset players are all working on hyper-realistic digital twins of their users, including Apple’s Personas and Meta’s Codec Avatars. We are also seeing headset-free solutions like Google Beam’s 3D communication platform change what it means to talk to someone at a distance. When a person shows up in your space as a lifelike avatar, the moment stops feeling like a video call and starts feeling like you are sharing the same room. They seem to be visiting rather than dialing in, and everyone can be more present with one another.
XR does more than shrink distance. It also changes how we experience time. With spatial video and volumetric capture, moments stop living in the past and start feeling like places you can return to. As AI becomes part of those recordings, the past begins to feel closer to the present. Companies like 2wai offer an early look at how people may one day see and speak with loved ones who are no longer here. This changes how people grieve, remember, and stay connected across generations.
The holidays are the one time of year when people really try to show up for each other. When distance or time gets in the way, XR can offer a new way to stay close. That is when the technology feels less like hardware and more like something meaningful.
🔮 What’s Next
3 signals pointing to what’s coming next.
- Industrial robotics shifts toward flexible, human-centered systems
Galbot’s $300M raise reflects rising confidence in humanoid robots designed for general-purpose work across factories, warehouses, hospitals, and retail. At the same time, Comau’s MyCo cobots show how factory automation is being rebuilt around fast deployment, modular design, and close human collaboration. Both point to an industrial transition away from rigid, single-task machines and toward adaptable systems built for constant change.
- Networks are the hidden engine of XR
SoftBank, Ericsson, and Qualcomm ran a live trial in Tokyo using 5G Advanced to stream XR content to smartglasses and smartphones, cutting wireless latency by about 90%. Using tools like L4S, network slicing, and improved scheduling, they delivered stable, real-time performance on a commercial network. This is the level of network performance XR needs to be usable, reliable, and consistent in everyday conditions.
- Input moves closer to the body
Mudra’s latest update adds ready-made control presets and lets users complete setup and configuration directly on supported smartglasses, without needing a phone or PC. As EMG input integrates into wearable platforms, control shifts off external devices and closer to the body, making interaction with spatial systems more direct and practical.
🔓 You’ve unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what’s next and why it matters.
📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!
🛠️ This newsletter uses a human-led, AI-assisted workflow, with all final decisions made by editors.