📬 Remix Reality Insider: Turning Movement Into Control

Source: Midjourney - generated by AI

Your weekly briefing on the systems, machines, and forces reshaping reality.

🛰️ The Signal

This week’s defining shift.

Motion data is shifting from something systems measure to something they use to guide action.

Instead of treating movement as a record to analyze after the fact, more platforms are turning perceptual motion data into live input for coaching, control, and interaction. This is not about better measurement. It is about using motion data to drive decisions and feedback in real time.

This week’s news surfaced signals like these:

  • U.S. Ski & Snowboard and Google Cloud are testing smartphone-based, markerless motion analysis to turn ordinary video into near real-time coaching guidance for elite athletes.
  • Meta and the University of Utah are studying how surface EMG can translate muscle signals into usable gesture control, including for people with limited hand mobility.

Why this matters: This is the difference between watching a replay and having something help you in the moment. Instead of looking at motion after it happens, these systems are starting to use it while you’re still moving, to help decide what comes next.


🧠 Reality Decoded

Your premium deep dive.

Khronos publishing a release candidate for a glTF Gaussian splatting extension means Gaussian splats are graduating from a fast-moving technique to a format the broader 3D ecosystem can rely on.

Gaussian splatting has taken off because it captures real-world scenes with a level of speed and visual richness that traditional meshes struggle to match. But until now, splats have lived in fragmented, tool-specific pipelines. That limits where they can be used and how easily they can move between platforms.

By bringing splats into glTF, Khronos is doing what it has done before with other 3D primitives. It is turning a promising technique into a shared delivery layer.

Three things to know about this move:

  • This is about getting splats to work everywhere: The extension lets Gaussian splats sit in glTF next to meshes and images, which means they can move through the same tools and pipelines people already use instead of living in special-case workflows.
  • It keeps the ecosystem from breaking into a mess of formats: Khronos is stepping in before every company ships its own splat format. If splats are going to show up in maps, digital twins, AR, and simulation, they need to work the same way across platforms.
  • It makes it easier to go from scan to shipping: When splats fit into glTF, you do less custom work to get them into an app or engine. Fewer conversions. Less glue code. Less time lost.

Key Takeaway:
The real win here is boring in a good way. Splats start fitting into existing workflows instead of forcing teams to build around them.
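To make the first point concrete, here is a minimal sketch of what a glTF 2.0 document declaring a splat extension could look like, and how a tool might check for it. The extension name `KHR_gaussian_splatting` and the empty extension payload are stand-ins for illustration; the exact identifier and data layout are defined by the Khronos release-candidate spec, not here.

```python
import json

# A minimal glTF 2.0 asset that declares a splat extension.
# "KHR_gaussian_splatting" is an assumed name; consult the Khronos
# release-candidate spec for the actual identifier and payload.
gltf = {
    "asset": {"version": "2.0"},
    "extensionsUsed": ["KHR_gaussian_splatting"],
    "meshes": [
        {
            "primitives": [
                {
                    "attributes": {"POSITION": 0},
                    "extensions": {
                        # Splat-specific data would reference accessors
                        # alongside ordinary mesh attributes.
                        "KHR_gaussian_splatting": {}
                    },
                }
            ]
        }
    ],
}


def uses_splats(doc: dict) -> bool:
    """Return True if a parsed glTF document declares the splat extension."""
    return "KHR_gaussian_splatting" in doc.get("extensionsUsed", [])


# Round-trip through JSON, as a loader would, then check the flag.
loaded = json.loads(json.dumps(gltf))
print(uses_splats(loaded))  # True
```

The point of the design is visible even in this toy: splats ride inside the same JSON container as meshes and images, so existing glTF tooling can pass them through untouched.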

📡 Weekly Radar

Your weekly scan across the spatial computing stack.

PHYSICAL AI

🚕 Waymo Secures $16B to Expand Autonomous Ride-Hailing to 20+ Cities

  • Waymo raised $16 billion in new funding, reaching a post-money valuation of $126 billion.
  • Why this matters: With this funding, Waymo is moving from breakthrough to buildout, with robotaxis heading for the mainstream.

🤖 Humanoid Introduces KinetIQ as Unified AI Framework for Humanoid Robot Fleets

  • Humanoid announced KinetIQ, an internal AI framework designed to orchestrate humanoid robot fleets across industrial, service, and home environments.
  • Why this matters: The four-layer design keeps planning, decision-making, and physical control separate instead of forcing them into one system. That makes the robots easier to manage, easier to update, and less likely to break when the system scales.

🌊 Apeiron Labs Raises $9.5M to Expand Real-Time Ocean Intelligence Network

  • Apeiron Labs secured $9.5 million in Series A funding to scale its ocean data platform powered by proprietary autonomous underwater vehicles.
  • Why this matters: Ocean data has been stuck in the dark ages: slow, sparse, and expensive. Apeiron’s low-cost drones flip the script, making the deep ocean as observable as the sky.

IMMERSIVE INTERFACES

👋 Meta and University of Utah Explore EMG Wristband for Inclusive Gesture Control

  • Collaboration explores how the Meta Neural Band can support gesture input for users with limited hand mobility.
  • Why this matters: Surface EMG could be a game-changer for people with neuromuscular conditions. Making gesture control work through muscle signals instead of hand movement is a huge step toward real-world inclusion.
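For a rough intuition of how muscle signals become discrete input, here is a toy rectify-and-smooth pipeline with a threshold detector, the textbook first step in surface EMG processing. This is illustrative only: the window size, threshold, and simulated signal are made up, and it says nothing about how the Meta Neural Band actually works.

```python
def emg_envelope(samples, window=5):
    """Rectify a raw EMG trace and smooth it with a moving average.

    Toy version of the standard rectify-and-smooth step used before
    thresholding muscle activity; real pipelines are far more involved.
    """
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        env.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return env


def detect_activation(envelope, threshold=0.5):
    """Return sample indices where the smoothed signal crosses the threshold."""
    return [i for i, v in enumerate(envelope) if v >= threshold]


# Simulated burst of muscle activity in an otherwise quiet signal.
signal = [0.05, -0.04, 0.9, -1.1, 1.0, -0.95, 0.06, -0.05]
active = detect_activation(emg_envelope(signal))
```

The appeal for accessibility is that a detector like this fires on muscle activation itself, so it can register intent even when the hand barely moves.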

🏥 ORamaVR Closes $4.5M to Expand AI-Driven Medical XR Platform

  • ORamaVR raised $4.5 million in late-seed funding led by Big Pi Ventures and Evercurious VC.
  • Why this matters: A platform that measurably improves skill and reduces error could raise the baseline for how healthcare professionals are trained. If it scales, it changes not just how people learn, but how safely they practice.

SIMULATED WORLDS

🌏 Khronos Publishes Release Candidate for glTF Gaussian Splatting Extension

  • Khronos Group published a release candidate glTF extension that enables storage of 3D Gaussian splats in glTF 2.0.
  • Why this matters: This brings Gaussian splatting out of one-off pipelines and into the same delivery layer used by the rest of the 3D ecosystem. If it sticks, it lowers friction between capture, tools, and platforms at a moment when splats are moving from experiments into production.

PERCEPTION SYSTEMS

⛷️ U.S. Ski & Snowboard and Google Cloud Test Smartphone-Based AI Motion Analysis for Training

  • U.S. Ski & Snowboard and Google announced an experimental AI video-analysis tool built on Google Cloud for skiing and snowboarding.
  • Why this matters: Real-time motion analysis captured on a standard smartphone changes when and where coaching data can be used. Instead of relying on lab sessions or post-run review, spatial intelligence and AI are being tested directly on the mountain, during training.
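For a sense of what real-time coaching feedback can be built from: once a pose estimator returns 2D keypoints from phone video, joint angles fall out of basic geometry. The hip-knee-ankle framing and the sample keypoints below are illustrative assumptions; nothing about the actual Google Cloud pipeline is described in this item.

```python
import math


def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by 2D keypoints a-b-c
    (e.g. hip-knee-ankle from a pose estimator)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))


# A straight leg reads near 180 degrees; deep knee flexion reads lower.
print(round(joint_angle((0, 0), (0, 1), (0, 2))))  # 180
```

Computed per frame on-device, a metric like this is what turns raw video into in-the-moment guidance rather than post-run review.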

🌀 Tom's Take

Unfiltered POV from the editor-in-chief.

I spent some time with humanoid robots this weekend. It was the first time I had spent a meaningful stretch at nose distance from them. The first thing you notice is how much they feel like heavy machinery.

They are powerful. When they walk, they walk with presence. You feel the mass. You feel the momentum. And when one falls, it is not a quiet moment. It is a loud thud. That physical reality does not come across in most demo videos. On screen, these systems still read like software wrapped in a body. In person, they feel much closer to vehicles than to computers.

Being that close to them changes the way you look at the whole thing. They don’t feel like gadgets anymore. They feel like equipment. Big, moving, heavy equipment. The weight and the force are palpable. When you are standing in front of one, it’s hard to think of robots as a software or interface problem. Instead, it gets you thinking about physics, safety, and what happens if one bumps into you, loses balance, or sends something flying.

My time with humanoids helped me realize why consumer robotics companies are spending so much effort trying to make these systems lighter, more agile, and softer. If these robots are going to move into homes, schools, and everyday workplaces, they cannot feel like industrial machines that wandered indoors. They have to feel safe to be around, even when something goes wrong.

From the outside, things like softer shells, padding, rounded edges, and even robot “clothing” can look like marketing. After being around humanoids, especially after seeing them walk, wobble, fall, and get back up, it doesn’t feel that way. When something has that much weight behind it, those details matter. It’s not about how they look. It’s about whether you feel okay standing next to one.


🔮 What’s Next

3 signals pointing to what’s coming next.

  1. Autonomy is moving from single machines to fleets
    Bedrock Robotics is building systems that run whole construction fleets, not just one machine at a time. Humanoid is doing the same in software with KinetIQ, a single control stack that runs different robots. This is no longer about one smart robot but rather many machines working together.
  2. Autonomy is turning into a service
    Apeiron Labs is selling continuous ocean data, not drones. Waymo is selling autonomous rides, not cars. In both cases, the machines are just the delivery layer. What people are paying for is reliable, always-on autonomous service.
  3. Automakers are turning into robotics companies
    Faraday Future just launched a robotics line with three different machines, two humanoids and one quadruped. This isn’t a concept but rather a real product lineup, with pricing, orders, and deliveries planned. It is yet another example of a car company moving into robotics as a strategic offering.

Know someone who should be following the signal? Send them to remixreality.com to sign up for our free weekly newsletter.

📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!

🛠️ This newsletter uses a human-led, AI-assisted workflow, with all final decisions made by editors.