🔓 Remix Reality Insider: Smarter Spaces, Safer Systems

Source: Midjourney - generated by AI

Your premium drop on the systems, machines, and forces reshaping reality.

🛰️ The Signal

This week’s defining shift.

Perception is becoming infrastructure.

Machine perception is becoming a core part of the technology stack for real-world systems. From vehicles to buildings to logistics hubs, sensing and spatial awareness are now built into the tools that support transportation, safety, and automation. These capabilities are being treated as standard components, not optional layers.

This week’s spatial computing news surfaced signals like these:

  • Subaru is using HPE Cray servers to train AI models for its next-generation EyeSight system. The goal is to improve how cars detect vehicles, pedestrians, and road conditions in real-world driving.
  • Butlr introduced a wired thermal sensor that tracks people’s presence in buildings without using cameras. The sensor helps optimize workplace layouts, staffing, and energy use.
  • AEye launched a lidar system called OPTIS that adds real-time 3D perception to places like airports and logistics centers. The system is designed to work with existing infrastructure and supports third-party AI tools.

Why this matters: Perception technologies are becoming part of the systems we rely on every day. From traffic safety to workplace design to automated logistics, sensing and spatial awareness are becoming foundational. The shift from dumb spaces to smart spaces is already underway.


🧠 Reality Decoded

Your premium deep dive.

One of the promises of spatial computing is to make the world a safer place.

By combining sensors with AI, machines are starting to understand their environment with more accuracy and context. They don’t just see a pedestrian, they detect movement, gauge distance, and predict intent. They don’t just recognize a car, they track its speed, path, and proximity. This deeper understanding makes it possible to avoid collisions, respond faster to danger, and reduce human error before it turns deadly.

We’re seeing this in action, especially in the automotive and transport industries. Waymo’s autonomous vehicles are logging millions of real-world miles with remarkably low collision rates, particularly in crashes involving pedestrians, cyclists, and motorcyclists. These results support Waymo’s Vision Zero mission of eliminating traffic fatalities and severe injuries.

Subaru, for its part, aims to eliminate fatal traffic accidents involving its vehicles by 2030. Its next-generation EyeSight system is being trained on massive video datasets and accelerated by HPE’s AI hardware so it can better recognize complex situations, like intersections, school zones, and sudden stops, and act with split-second precision.

In the commercial trucking world, Aeva and Bendix are integrating 4D LiDAR to improve perception for nighttime driving, emergency braking, and blind-spot detection. These sensors deliver both range and velocity, helping heavy-duty vehicles react to fast-moving threats that cameras and radar can miss.
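To make that concrete, here’s a minimal sketch of why getting range and velocity in the same measurement matters for emergency braking: time-to-collision falls out of a single sensor return instead of being estimated across camera frames. This is purely illustrative, not Aeva’s or Bendix’s actual software; the sign convention and braking threshold are assumptions.

```python
# Illustrative sketch (not Aeva/Bendix code): a 4D lidar return gives both
# range and radial velocity, so time-to-collision is one division away.

def time_to_collision(range_m: float, radial_velocity_mps: float) -> float | None:
    """Seconds until impact, or None if the object is not closing.

    Assumed convention: radial velocity is negative when the object
    is approaching.
    """
    if radial_velocity_mps >= 0:
        return None  # holding distance or moving away
    return range_m / -radial_velocity_mps

# Example: a stopped truck 40 m ahead while we close at 15 m/s (~54 km/h)
ttc = time_to_collision(40.0, -15.0)
if ttc is not None and ttc < 3.0:  # hypothetical braking threshold
    print(f"brake: predicted impact in {ttc:.1f} s")  # ~2.7 s
```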

These systems are already reshaping how we think about safety. They’re being trained to notice the things we miss. They work at night. They don’t get distracted. And when designed right, they act with the speed and precision that humans can’t match.

Key Takeaway:
Perception systems are making safety proactive. With smarter sensing and faster response, we’re building a world where machines can anticipate dangerous scenarios and help prevent harm faster and more reliably than humans can.

📡 Weekly Radar

Your weekly scan across the spatial computing stack.

PHYSICAL AI

🍔 White Castle Debuts Coco Robot Deliveries on Uber Eats in Chicago

  • White Castle has started robotic food delivery in Chicago using Coco Robotics on the Uber Eats app.
  • Why this matters: White Castle joins Subway, Wingstop, and Jack in the Box among fast-food chains turning to robotic delivery. By plugging into existing on-demand systems, Coco’s bots can roll out quickly and start moving meals with minimal disruption.

🚗 Waymo Names Dallas as Next City for Autonomous Ride-Hailing Launch

  • Waymo will launch its fully autonomous ride-hailing service in Dallas in 2026, through a new partnership with Avis Budget Group.
  • Why this matters: Waymo is expanding fast across major U.S. cities. Its partnership with Avis could accelerate availability even further by leveraging the rental group’s decades of fleet-management experience.

🛒 HomeBase Pilots Simbe’s Shelf-Scanning Robot at Two Locations

  • HomeBase USA has deployed Simbe Robotics’ Tally robot in stores in Copperas Cove, TX and Laramie, WY.
  • Why this matters: Robots like Tally collaborate with staff, taking on repetitive manual tasks like shelf scanning so human workers are freed up to focus on customers.

📦 Pudu Debuts T600 Series for Heavy Industrial Robot Delivery

  • Pudu Robotics has introduced the T600 Series, built to handle 600kg loads in fast-moving factory and warehouse environments.
  • Why this matters: Pudu is drawing a major line in the sand with its on-prem autonomy. If cloud-free robots like these prove reliable at scale, it could reshape how factories think about automation, trust, and infrastructure.

IMMERSIVE INTERFACES

👓 Brilliant Labs Launches AI Glasses With Display That Sees, Hears, and Remembers

  • Brilliant Labs’ Halo features a micro OLED display, an AI-powered sensor stack, and 14-hour battery life in a lightweight frame.
  • Why this matters: Brilliant Labs has taken a bold step in open-sourcing not just the software but the hardware. With an OLED display and AI-ready sensors, the ingredients are all there to explore the future of wearables that gain context and understanding of the real world.

📺 LIMINAL Space Joins Disney Accelerator 2025 With Holographic Display Tech

  • LIMINAL’s LED screens show 3D holograms to large groups without requiring XR headsets.
  • Why this matters: The Disney Accelerator has long invested in technologies that are transforming media. Past cohorts have seen Disney support AR contact lenses, location-based VR, programmable robots, and more.

PERCEPTION SYSTEMS

🚧 ThirdEye and Xvisio Unite AR Platform and SLAM Tech for Enterprise Use

  • ThirdEye will combine its no-code AI platform and smart glasses with Xvisio’s SLAM and 3D hand tracking tools.
  • Why this matters: When companies bring core technologies together, the result is stronger, more complete systems. Customers get a setup that’s easier to deploy and simpler to support. That kind of integration makes a real difference in the field.

📷 Swift Raises $50M to Expand Precision Navigation for Vehicles and Robots

  • Swift’s Skylark platform delivers cloud-based centimeter-level accuracy to over 10 million connected systems.
  • Why this matters: By boosting GPS accuracy from meters to centimeters, Swift delivers the roughly 100x leap in accuracy that autonomy depends on. That’s critical in environments where even small errors can cause big failures and precision isn’t optional.

🌀 Tom's Take

Unfiltered POV from the editor-in-chief.

I still can't believe we're living in a time where having your own humanoid robot is actually becoming a reality.

The fact that you can get a Unitree R1 for $5,900 just blows my mind. Sure, it’s more of a developer playground: an entry-level robot that can do some impressive tricks and gives developers a new kind of platform to push to its limits. But what’s even more thrilling are the robots that go beyond demo mode, the ones being prepped to deliver actual service through real autonomy, manipulation, and AI-driven understanding.

I nearly spilled my coffee when I saw the CEO of Figure casually post a video of its Figure 02 robot doing his laundry on social media. I mean, it was putting clothing in the washing machine, all on its own!

It got me thinking. What’s the killer app for humanoid robots?

I’ve been asking nearly everyone I know: if you had a humanoid robot, what’s the one thing you’d want it to do? Most people hesitate at first, probably because we’ve all been burned by robot vacuums that only sort of work. They don’t believe robots like this are around the corner; to them, robots are still something out of the movies. But once they get past all that, the answers come fast.

Overwhelmingly, the #1 thing people want is help around the house. Laundry (especially folding and putting it away) topped the list. Doing the dishes, cleaning the bathroom, and taking out the trash were close seconds. Pretty much every tedious, mundane chore we all hate is a top pick. Other popular choices? Walking the dog and grocery shopping.

And that tracks with what the broader research is showing. A 2025 YouGov survey found that while only 38% of Americans say they’re interested in having a household robot, a huge majority were very clear about what they’d want help with: 93% picked floor cleaning, 87% picked dishwashing, and 86% picked laundry and home organization. So even if people are still warming up to the idea of a robot in the home, the demand for help with boring, repetitive chores is loud and clear. Turns out, we don’t want C-3PO, we want Rosey from the Jetsons.

What’s exciting is that with the AI systems powering these robots, we won’t need to wait for a laundry app or a dog-walking app to be released, like we did with smartphones. Robots aren’t going to work like phones: there are no task-specific apps to download. There’s just the brain. Apps are being replaced by general-purpose intelligence.

Figure 02, for example, runs on Helix, Figure’s AI system that lets robots see, understand, and act on plain-language commands. You say “pick up the red shirt,” and boom, it’s done. No need to develop or download a task-specific app. It figures it out on the fly, even if it’s never seen that exact shirt before, as long as it’s the kind of everyday object it’s been trained to handle. Systems like this give robots common sense, muscle memory, and intuition. And so, there is no killer app for robots. There is just a killer brain.
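
If you like to think in code, here’s a toy sketch of that shift: one general policy that takes any plain-language instruction, instead of one program per chore. This is purely illustrative; the class and method names are hypothetical, not Figure’s Helix or its API.

```python
# Toy illustration of "apps vs. one brain" (hypothetical names, not Helix).

from dataclasses import dataclass

@dataclass
class Observation:
    camera_frame: bytes   # what the robot sees
    instruction: str      # what you asked for, in plain language

class GeneralPolicy:
    """Stand-in for a vision-language-action model: one model, any chore."""

    def act(self, obs: Observation) -> list[str]:
        # A real system would run a learned model that outputs motor
        # commands; here we only show the interface shape.
        return [f"grounded plan for: {obs.instruction!r}"]

# The smartphone model would need an app per task. Here, a new task is
# just a new instruction to the same policy:
policy = GeneralPolicy()
for chore in ("pick up the red shirt", "load the washing machine"):
    print(policy.act(Observation(camera_frame=b"", instruction=chore)))
```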

So, what would you teach your robot first? And what task would you gladly hand over for good?


🔮 What’s Next

3 signals pointing to what’s coming next.

  1. Robots are taking root on the farm
    Agriculture is facing labor shortages, rising costs, and the need to produce more with less. Robots are starting to fill the gap, not in labs, but in fields and greenhouses. Unitree’s Go2 is being adapted to help young farmers monitor crops using AI and edge computing. It’s affordable, rugged, and designed to be used by non-experts. 4AG Robotics just raised $40 million to scale its mushroom harvesting bots, which trim and pack produce without human labor. These systems integrate with existing infrastructure and are moving toward broader deployment. Combined, these moves show that farming is becoming a key proving ground for real-world robotics, with autonomy helping to reshape who can farm and how.
  2. The line has been drawn for spatial computers
    Meta’s 20–40 minute “Goldilocks” guidance for VR sessions quietly sets a boundary for what headsets are capable of today. While positioned by Meta as advice for developers, it also acts as a clear marker of where the industry stands and what still needs to be solved. This ceiling reflects how long most people are comfortable in a headset before fatigue sets in or the battery runs dry. To cross that line, hardware will need better comfort, battery life, and performance; only then will MR headsets graduate from casual, short-form use as a next-gen game console to regular use as true spatial computers.
  3. Physical AI is starting to pay off
    Recent earnings reports show that AI-powered systems in the physical world are turning into viable revenue streams. WeRide’s robotaxi revenue surged nearly ninefold in Q2 2025, now making up over a third of its total sales. EssilorLuxottica reported a 200%+ year-over-year jump in Ray-Ban Meta sales, while also expanding its smart eyewear lineup and scaling Nuance Audio to 10,000 retail locations. These results point to rising demand for devices that can sense, see, and act in real time, a clear sign that physical AI is making its way to the mainstream market.

🔓 You’ve unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what’s next and why it matters.

📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!