U.S. Ski & Snowboard and Google Cloud Test Smartphone-Based AI Motion Analysis for Training
- U.S. Ski & Snowboard and Google announced an experimental AI video-analysis tool built on Google Cloud for skiing and snowboarding.
- The system uses standard smartphone video to generate near-real-time motion data without wearable sensors.
U.S. Ski & Snowboard and Google announced a joint effort to build an AI video-analysis tool on Google Cloud aimed at elite ski and snowboard athletes. The tool is being developed for world-class competitors, including U.S. Olympians, with a focus on improving training precision and safety. The collaboration brings on-mountain video analysis to a sport where training feedback has traditionally relied on visual observation or laboratory-based motion capture.
"Our collaboration with U.S. Ski & Snowboard is the blueprint for a global shift in how humans move, train, and recover, moving beyond historical data to provide athletes with near real-time, prescriptive coaching," said Oliver Parker, vice president, Global Generative AI, Google Cloud, in a press release. "By using our full-stack AI, we're helping democratize elite coaching—proving that if we can solve for the world's best athletes in the most extreme conditions, we can help anyone from a physical therapy patient to an amateur golfer improve their games."
To develop the system, Google Cloud engineers worked with the Stifel U.S. Freeski Team and Hydro Flask U.S. Snowboard Team during on-mountain training in Austria and Colorado, targeting conditions where traditional motion-capture systems often fail. Coaches recorded athlete runs using standard smartphones from the sidelines or the bottom of a run and uploaded the video to a dashboard for processing. The tool uses markerless motion capture powered by spatial intelligence from Google DeepMind research to track body movement through winter clothing and equipment, without wearable sensors, and delivers near real-time motion analysis. Running on Google Cloud and supported by Gemini, the system lets coaches and athletes interact with motion data using natural language and stores each session for analysis over time.
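The announcement doesn't describe the underlying computation, but markerless motion capture typically reduces each video frame to a set of body keypoints, from which metrics such as joint angles are derived. A minimal sketch of that kind of metric, assuming keypoints have already been extracted (the coordinate values and function name below are illustrative, not taken from the tool):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by keypoints a-b-c,
    e.g. hip-knee-ankle for knee flexion."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from knee toward hip
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from knee toward ankle
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_angle = dot / (math.hypot(*v1) * math.hypot(*v2))
    # Clamp to [-1, 1] to guard against floating-point drift
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))

# Hypothetical normalized 2D keypoints from one frame of smartphone video
hip, knee, ankle = (0.50, 0.40), (0.52, 0.60), (0.51, 0.80)
print(f"knee angle: {joint_angle(hip, knee, ankle):.1f} degrees")
```

Tracking a metric like this frame by frame, per run, would give coaches the kind of time-series motion data the article says the system stores for analysis over time.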
Athletes and coaches are continuing to prototype the AI tool in training ahead of the Olympic Winter Games, with sessions stored in a centralized database for ongoing analysis.
🌀 Tom’s Take:
Near-real-time motion analysis from video captured on a standard smartphone changes when and where coaching data can be used. Instead of relying on lab sessions or post-run review, spatial intelligence and AI are being tested directly on the mountain, during training.
Source: PR Newswire / Google Cloud