📬 Remix Reality Weekly: Deeper Than Display

Source: Midjourney - generated by AI

Your free Friday drop of spatial computing updates—plus what Remix Reality Insiders unlocked this week.

🛰️ The Signal

This week’s defining shift.

3D is no longer just for gaming or immersive experiences. It’s being used to generate content for today’s devices, including phones, laptops, and flat screens, as well as to train the systems that will power tomorrow’s devices, from autonomous robots to AR glasses.

What started as visual content is now becoming infrastructure for media production, simulation, and machine learning.

👉 Get the full insight in this week’s Insider drop.


📡 Weekly Radar

Your weekly scan across the spatial computing stack.

PHYSICAL AI

🚗 Waymo Report Shows 88% Fewer Serious Crashes Than Human Drivers

  • New data show that Waymo’s autonomous vehicles were involved in 88% fewer crashes resulting in serious injury or worse than human drivers in the same locations.

🕶️ Meta Reportedly Invests $3.5 Billion in EssilorLuxottica Amid Smartglasses Push

  • Meta has reportedly taken a minority stake of just under 3% in EssilorLuxottica, valued at around €3 billion ($3.5 billion), according to a Bloomberg report.

IMMERSIVE INTERFACES

👁️ CREAL Secures $8.9M From ZEISS to Scale Light Field Displays for AR and Vision Care

  • CREAL's $8.9 million funding round, led by ZEISS, will support miniaturizing its light field display technology and integrating it into ZEISS diagnostic tools and AR glasses.

🔥 Nokia Shares Progress on Conductivity-Based Thermal Haptics for XR Touch

  • Nokia detailed its latest research into heat-based touch systems that let users feel temperature differences, like hot and cold, in virtual environments.

SIMULATED WORLDS

🧑‍💻 Intangible Launches Open Beta for Spatial AI Creation Platform for Visual Storytelling

  • Intangible’s browser-based 3D tool is now in open beta, giving creative teams a faster way to build visual ideas across film, events, advertising, and games.

👤 Applied Intuition Acquires Reblika to Simulate Human-Vehicle Interaction

  • Applied Intuition has acquired Reblika’s technology to create highly detailed, animated digital humans for simulation.

PERCEPTION SYSTEMS

🧠 Formant Platform Brings Generative AI to Robotics Operations

  • Formant's F3 platform uses natural language and agentic reasoning to control robots, surface insights, and automate decisions.

SOCIETY & CULTURE

🎭 AR App Translates Indigenous Stories Into Location-Based Experiences

  • A research team at the University of Sydney has created an AR experience that presents Indigenous narratives using sound and images tied to real-world sites.

🌀 Tom's Take

Unfiltered POV from the editor-in-chief.

One day, we’ll look back at how we use the word immersive today for XR, and it’ll sound adorably outdated.

Just as we once called CD-ROMs “interactive” or referred to 28.8k modems as “high-speed,” our current definition of immersion in XR feels advanced for its time but is far from fully realized.

Right now, immersion mostly means sight and sound. High-fidelity displays, spatial audio, and six degrees of freedom let us enter digital content in ways we never could before. But we’re still only scratching the surface. We haven’t yet brought all of our senses into the experience, at least not in a way that mirrors how we engage with the physical world.

Nokia’s latest research into thermal haptics, letting you feel heat and cold in XR, is a glimpse of where things are headed. Seeing something isn’t the same as feeling it. That’s where things get real. When your body can tell the difference between virtual wood and virtual metal based on how each absorbs heat, that’s a whole new layer of fidelity. And it doesn't stop at touch. Scent and taste are also areas of research that feel experimental now, but will eventually reach commercialization.

We’ll truly earn the right to use the word immersive when XR engages all of our senses. For now, the label works well enough. But when we can no longer tell the difference between the virtual world and the real one, that’s when we’ll know we’ve arrived.


🔒 What Insiders Got This Week

This week’s Insider drop included:

  • 🧠 Reality Decoded: What OpenAI needs to consider for its rumored spatially aware third core device.
  • 🔮 What’s Next: Robots are the new Raspberry Pi; displays are evolving for the human eye; and XR and AI are reshaping the operating room.

👉 Unlock the full drop → Upgrade to Insider


🚀 Thanks for being a Remix Reality subscriber!

Know someone who should be following the signal? Send them to remixreality.com to sign up for our free weekly newsletter.

📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!