🔓 Remix Reality Insider: When Smart Homes Start to See

Source: Midjourney - generated by AI

Your premium drop on the systems, machines, and forces reshaping reality.

🛰️ The Signal

This week’s defining shift.

What’s next for the smart home is perception, with machines that don’t just connect but understand.

Cameras are becoming the eyes of AI in our homes. Paired with multimodal intelligence, they move beyond simple motion alerts to recognize people, pets, and objects, and respond with real context. From fitness equipment to security systems, everyday devices are starting to see us, understand what we’re doing, and act more like partners in our daily lives.

This week’s news surfaced signals like these:

  • Peloton launched Peloton IQ, an AI coaching system that uses built-in cameras and computer vision to deliver real-time form feedback, rep counting, and personalized strength guidance.
  • Ring introduced new 4K cameras with Familiar Faces and pet detection, turning everyday security devices into assistants that can search for missing dogs or filter out unnecessary alerts.
  • Google's Nest lineup now gives its cameras sharper eyes, able to spot the difference between a car, a delivery person, or the family dog, and generate summaries that actually make sense.

Why this matters: The home is one of the first places where perception systems will scale. AI that can see and interpret physical spaces makes smart devices more useful and personal. This shift marks the transition from connected homes to intelligent homes, where machines no longer just record events but actively help us live.


🧠 Reality Decoded

Your premium deep dive.

Every major shift in computing has had its killer app. For PCs, it was productivity. For smartphones, it was connectivity. For wearables, it’s accessibility.

Meta’s latest Ray-Ban lineup highlights this shift with features like Conversation Focus, which amplifies the voice of the person in front of you while lowering background noise. For anyone in a busy café, this is convenient, but for people who are hard of hearing, it can be transformative. Meta is also pushing forward with live subtitles on its new Ray-Ban Display, and through its partnership with Be My Eyes, the glasses can stream what you’re seeing to a volunteer who provides live guidance.

Apple is making similar moves with its line of wearables. AirPods have evolved into next-generation hearing aids, complete with at-home hearing tests, FDA-cleared features, and real-time sound adjustments that millions can access without stigma. The Apple Watch has tools like fall detection, heart monitoring, and gesture-based controls, providing safety and independence for seniors and those with limited mobility.

Accessibility extends beyond mainstream devices. Exoskeletons from Ekso Bionics and SuitX are helping people with spinal cord injuries regain the ability to walk and protecting workers in physically demanding jobs.

Accessibility defines the true value of wearables. These devices don’t just make life easier and more convenient. For many, they expand independence and confidence in daily life.

Key Takeaway:
Accessibility is the killer app for wearables. More than a feature, it’s the reason this category matters.

📡 Weekly Radar

Your weekly scan across the spatial computing stack.

PHYSICAL AI

🚚 Einride Raises New Capital to Accelerate Global Autonomous Freight Push

  • The funding will accelerate the deployment of autonomous freight, expand operations, and advance technology development.
  • Why this matters: This funding reflects how far Einride has come and signals that autonomous freight solutions are here and scaling.

🚕 Zoox Begins Autonomous Vehicle Testing in Washington, D.C.

  • The expansion brings Zoox's active test cities to eight, joining sites like San Francisco, Las Vegas, and Miami.
  • Why this matters: A cross-country push like this signals Zoox’s intent to compete directly with the nation’s AV leaders.

🏙️ Toyota Activates Woven City as Living Lab for Future Mobility

  • The launch kicks off real-world deployment of AVs, logistics platforms, and personal mobility systems.
  • Why this matters: Toyota has built an ecosystem that moves innovation out of the lab and into real life. By pairing inventors with residents, Woven City offers a rare chance to see how emerging technologies actually perform in everyday settings.

IMMERSIVE INTERFACES

🧠 Galea Neon Launches as First Untethered Brain, Body, and Eye Tracking Headset

  • Galea Neon is the first fully standalone headset to combine brain, body, and eye tracking for mobile cognitive and behavioral research and training.
  • Why this matters: OpenBCI has delivered the first standalone system that lets researchers explore the full spectrum of human experience, from physiology to cognition, across both real-world and simulated environments.

👓 SCHOTT Achieves First Serial Production of Reflective Waveguides for AR Glasses

  • The production breakthrough enables lighter AR glasses with clearer displays and longer battery life.
  • Why this matters: Advancements in optics are essential for building AR wearables that people actually want to wear and can wear all day. But it’s not just about making these components, it’s about making them at scale. That’s the milestone SCHOTT is celebrating here.

🇬🇧 YouGov Survey Finds Most UK Adults Uninterested in Buying Smartglasses

  • Just 22% of respondents said they were very or fairly interested in smartglasses.
  • Why this matters: YouGov's survey shows that while the tech may be approaching mainstream readiness, there is still much work to be done to educate and inspire everyday consumers on why they should adopt smartglasses.

SIMULATED WORLDS

🌍 Hawk-Eye Volumetric Data Powers Football Manager’s Next-Gen Animations

  • Football Manager 26 uses Hawk-Eye’s skeletal tracking data to create more realistic player movement based on real match footage.
  • Why this matters: Hawk-Eye is showing how powerful volumetric tracking can be. Its data can be used for improving performance and on-screen analysis as well as elevating simulations and gameplay.

PERCEPTION SYSTEMS

🤖 Google Unveils Vision-to-Action AI Models to Power Next-Gen Robots

  • Google introduced Gemini Robotics 1.5 (VLA) and Gemini Robotics-ER 1.5 (VLM) to combine high-level reasoning with vision-guided physical action in real-world robotic tasks.
  • Why this matters: Google’s latest release focuses on foundational models that aim to let robots of all types act without much human intervention or hard-coding.

SOCIETY & CULTURE

💉 One-Time VR Session Found to Reduce Needle Anxiety in Adults

  • A peer-reviewed study using XRHealth software found that a single virtual reality session led to statistically significant reductions in needle-related anxiety.
  • Why this matters: VR has longstanding roots in mental health, and this study highlights the power of simulation to help people prepare for situations that trigger anxiety.

🌀 Tom's Take

Unfiltered POV from the editor-in-chief.

The Friend ad campaign in New York is one of the boldest out-of-home campaigns I’ve seen in a while.

Friend is a $129 wearable AI device pitched as a companion you can talk to. The round pendant is an always-on assistant that feels a lot like something from the movie "Her", just without the voice of Scarlett Johansson.

Friend’s $1M ad campaign was a billboard, bus shelter, and subway takeover across New York, built on stark white posters packed with text. The ads defined “friend” like a dictionary entry, as "someone who listens, responds, and supports you," alongside dedicated signs listing what Friend promises, like "I'll never leave dirty dishes in the sink" or "I'll never bail on dinner plans." Each billboard left ample white space, as if inviting passersby to contribute.

Covering stations with these minimalist statements and deliberately leaving blank space for people to write on was a genius move. And an intentional one, as CEO Avi Schiffmann told ADWEEK. It wasn't long after the campaign went live that the posters became a canvas for written commentary. Friend knew most of the comments would be negative, but that’s the point. The backlash creates its own energy. When people write “surveillance capitalism” or “get real friends,” they don’t just criticize the product; they create polarization around it. This debate pushes consumers to pick a side, and those who lean the other way, or are on the fence, are more likely to get curious, check it out, and even buy one, perhaps just to prove it's not as bad as others say.

But what really struck me is what these scribbles revealed about the concerns people have today around AI and the growing ubiquity of sensors. The comments showed a deep unease with AI: a worry that machines are stepping into roles we see as belonging to humans, a sense that anything recorded could be used against us, and a fear of being watched all the time.

This campaign is not only good marketing, but it also acts as a mirror. And what it reflects back is a society wrestling with where AI belongs in our lives, our relationships, and our private spaces. The more these devices become part of us, on our faces, in our ears, and all around our homes, the more urgent these questions become.


🔮 What’s Next

3 signals pointing to what’s coming next.

  1. Robots are reinventing food delivery
    Food delivery is being transformed as sidewalk robots move from pilots to citywide rollouts. Serve Robotics has expanded into Chicago through Uber Eats, deploying sidewalk bots across 14 neighborhoods, while DoorDash unveiled Dot, its in-house robot built for local trips on bike lanes, sidewalks, and driveways. What began as app-based convenience is now being reinvented as fleets of robots quietly redraw how food moves through cities.
  2. The robot brain wars are heating up
    NVIDIA and Google are racing to define the intelligence layer for robotics. NVIDIA launched an open stack with its Newton physics engine, GR00T reasoning model, and Cosmos world foundation models to train and test robots at scale, while Google unveiled new Gemini Robotics models that link vision, reasoning, and action to help robots act with less human input. Both moves show the competition shifting from building individual machines to building the shared “brains” that could power them all.
  3. Retail is being reshaped by spatial computing
    From immersive marketing to automated operations, spatial computing is changing how retail works. In the front of the house, BOSS and EPAM are using Apple Vision Pro to bring Formula 1 into stores, turning shopping into an interactive experience that blurs branding and entertainment. Meanwhile, in the back of the house, Seven-Eleven Japan is partnering with Telexistence to roll out humanoid robots powered by vision-language-action models that can restock shelves and handle store tasks across thousands of locations by 2029. These stories show how spatial computing is drawing customers deeper into the experience while making operations smarter and more autonomous.

🔓 You’ve unlocked this drop as a Remix Reality Insider. Thanks for helping us decode what’s next and why it matters.

📬 Make sure you never miss an issue! If you’re using Gmail, drag this email into your Primary tab so Remix Reality doesn’t get lost in Promotions. On mobile, tap the three dots and hit “Move to > Primary.” That’s it!