Meta's Aria Gen 2 Unpacks Advanced Sensor Array for Machine Perception Research

Source: Meta
  • Meta’s new Aria Gen 2 glasses feature upgraded HDR cameras, expanded stereo vision, and real-time on-device AI for spatial tracking.
  • The wearable adds contact audio sensors, heart rate detection, and precise time-syncing across devices to support robust research data collection.

Aria Gen 2, Meta's latest research glasses, delivers a major hardware leap over its 2020 predecessor. The updated design improves wearability with folding arms, eight size variants, and a lightweight frame suited to extended research use.

The device includes four global shutter computer vision (CV) cameras with 120 dB HDR and 80° stereo overlap, designed to support 3D hand tracking, spatial mapping, and foundation model training. A calibrated ambient light sensor drives camera exposure settings, a contact microphone captures audio reliably in windy conditions, and a nosepad-based PPG sensor tracks the wearer's heart rate. Each component is intended to capture a distinct layer of real-world signal for multimodal AI research.
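To put the 120 dB HDR figure in perspective: sensor dynamic range in decibels maps to a linear contrast ratio via the standard 20·log10 convention. A minimal sketch of that conversion (the function name is ours, not Meta's):

```python
def db_to_contrast_ratio(db: float) -> float:
    """Convert a dynamic range in decibels to a linear contrast ratio.

    Image-sensor dynamic range is conventionally expressed as
    20 * log10(max_signal / noise_floor), so inverting gives
    ratio = 10 ** (db / 20).
    """
    return 10 ** (db / 20)

# 120 dB corresponds to a 1,000,000:1 ratio between the brightest
# and darkest signal the sensor can resolve in a single exposure.
print(f"{db_to_contrast_ratio(120):,.0f}:1")
```

In other words, the cameras can resolve scenes where the brightest region is about a million times more intense than the darkest, which is what makes indoor-to-outdoor capture practical without blown highlights or crushed shadows.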

Aria Gen 2 also performs on-device machine perception tasks using Meta’s custom coprocessor. These include 6DOF tracking via visual-inertial odometry, multi-metric eye tracking, and 3D hand pose detection. SubGHz radio hardware enables sub-millisecond device-to-device synchronization, allowing researchers to align multimodal datasets across space and time.
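Sub-millisecond synchronization matters because it lets researchers match samples across streams by timestamp alone. A minimal sketch of nearest-timestamp alignment in post-processing, using hypothetical capture times from two hardware-synced devices (this illustrates the general technique, not Meta's actual tooling):

```python
import bisect

def nearest_sample(timestamps: list[float], query: float) -> int:
    """Return the index of the timestamp closest to `query`.

    `timestamps` must be sorted ascending, as samples from a
    hardware-synced stream naturally are.
    """
    i = bisect.bisect_left(timestamps, query)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is closer to the query time.
    return i if timestamps[i] - query < query - timestamps[i - 1] else i - 1

# Hypothetical capture times (seconds) sharing one synced clock.
camera_ts = [0.000, 0.033, 0.066, 0.100]          # ~30 fps video frames
imu_ts    = [0.000, 0.001, 0.002, 0.031, 0.067]   # sparse IMU samples

# Pair each camera frame with its nearest IMU sample.
pairs = [(t, imu_ts[nearest_sample(imu_ts, t)]) for t in camera_ts]
```

When cross-device clock error is well below the spacing between samples, as sub-millisecond sync guarantees for these sensor rates, this nearest-neighbor pairing is unambiguous; without it, streams would drift apart and pairings would become unreliable.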

Applications to work with Aria Gen 2 will open later this year. Meta continues supporting Aria Gen 1 through the Aria Research Kit, which now accepts applications on a rolling basis. Researchers can join the Aria Gen 2 interest list to stay informed as the platform evolves.


🌀 Tom’s Take:

Aria showcases an interesting use for wearables, one not aimed at assisting humans directly but at capturing the human experience for machines. Data collected this way is essential for training machine perception models and enabling progress toward more capable systems such as humanoid robots.


Disclosure: Tom Emrich has previously worked with or holds interests in companies mentioned. His commentary is based solely on public information and reflects his personal views.
