1X’s Redwood AI Lets Home Robots Move, Grasp, and Act on Spoken Goals

- Redwood is a compact AI model that lets NEO Gamma navigate homes, manipulate unfamiliar objects, and respond to spoken commands.
- It runs entirely onboard and coordinates full-body movement for bracing, bending, and carrying while in motion.
Redwood is a new AI model from 1X that powers NEO Gamma, a home robot built to move, manipulate, and assist in real-world living spaces. Running entirely onboard, Redwood enables NEO to perform tasks like retrieving objects, opening doors, and navigating cluttered rooms, without depending on cloud servers or carefully arranged environments.
Redwood is among the first vision-language architectures to jointly control walking and manipulation. In home settings, manipulation often requires more than lifting items off tables; people bend at the legs, hips, and spine to reach the floor, or press into doors to open them. These whole-body control tasks make it impossible to separate locomotion from manipulation cleanly. Coordinating all parts of the body also enables multi-contact behaviors, such as bracing a hand against a wall when pulling open heavy doors. Redwood allows NEO to perform such tasks fluidly, even with new objects in places it hasn’t seen before.
Redwood converts spoken instructions into executable commands, drawing on thousands of training examples. It learns from both successful and failed attempts, using real-world data gathered in diverse environments. This lets NEO Gamma adapt safely and intelligently in dynamic homes, running on hardware designed for compliance, safety, and resilience.
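To make the voice-to-command idea concrete, here is a toy sketch of the first stage of such a pipeline: a speech transcript is parsed into a structured command a robot controller could consume. Everything here (`RobotCommand`, `parse_transcript`, the verb table) is hypothetical for illustration; Redwood itself is a learned vision-language model trained on demonstrations, not hand-written rules like these.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structured command a downstream controller might consume.
@dataclass
class RobotCommand:
    verb: str    # high-level skill, e.g. "fetch" or "open"
    target: str  # object or location the skill acts on

# Toy synonym table mapping spoken verbs to canonical skills (illustrative only).
KNOWN_VERBS = {
    "fetch": "fetch", "grab": "fetch", "bring": "fetch",
    "open": "open", "close": "close",
}

def parse_transcript(transcript: str) -> Optional[RobotCommand]:
    """Map a spoken transcript to a structured command, or None if no known verb."""
    words = transcript.lower().strip(".!?").split()
    for i, word in enumerate(words):
        if word in KNOWN_VERBS:
            # Treat the rest of the utterance (minus articles) as the target.
            target = " ".join(w for w in words[i + 1:] if w != "the")
            return RobotCommand(verb=KNOWN_VERBS[word], target=target)
    return None

print(parse_transcript("Please grab the water bottle"))
# → RobotCommand(verb='fetch', target='water bottle')
```

A learned model replaces the brittle rule table with behavior generalized from training data, which is why Redwood can handle unfamiliar objects and phrasings that a lookup like this would miss.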
Source: YouTube/1X
🌀 Tom’s Take:
Redwood presents a major opportunity for humanoid robots to act more human. As household chores emerge as a key application, the ability to learn from both success and failure, and to coordinate walking with manipulation, unlocks the full-body actions real home tasks demand.
Source: 1X