Apple Brings Live Captions and Visual Assistance to Vision Pro and Apple Watch

- Vision Pro gains camera-based magnification, expanded VoiceOver capabilities, and a new API enabling visual assistance apps.
- Apple Watch introduces Live Captions integrated with iPhone’s Live Listen feature for users who are deaf or hard of hearing.
Apple has announced a set of powerful accessibility features coming to its Vision Pro headset and Apple Watch. Vision Pro will now support systemwide magnification using its forward-facing camera, allowing users to zoom in on their physical surroundings as well as virtual content without removing the headset. VoiceOver is also gaining the ability to describe surroundings, identify objects, and read documents aloud.
Apple is also introducing a new developer API for Vision Pro that gives apps access to the headset's camera for building visual assistance experiences. This opens the door to real-time support tools, such as person-to-person remote assistance for navigation or object-recognition applications.
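Apple hasn't published the new API's surface yet, so any code is speculative. As a rough sketch, visionOS already exposes camera frames through ARKit's CameraFrameProvider (today gated behind an enterprise entitlement); an assistance app built on that pattern might look like the following, with all names drawn from the existing enterprise API rather than the unannounced one:
```swift
import ARKit

// Speculative sketch: uses the existing visionOS enterprise camera API
// (CameraFrameProvider); the new accessibility-focused API may differ.
func streamCameraFrames() async {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    do {
        // Prompts for camera authorization and starts the data provider.
        try await session.run([provider])
    } catch {
        print("Camera access unavailable: \(error)")
        return
    }

    // Choose a supported video format for the left main camera.
    guard let format = CameraVideoFormat
            .supportedVideoFormats(for: .main, cameraPositions: [.left])
            .first,
          let updates = provider.cameraFrameUpdates(for: format)
    else { return }

    // Consume frames as an async sequence.
    for await frame in updates {
        guard let sample = frame.sample(for: .left) else { continue }
        // sample.pixelBuffer is a CVPixelBuffer that could be handed to
        // a Vision request or an on-device model for object recognition.
        _ = sample.pixelBuffer
    }
}
```
Whether the accessibility API keeps this shape, or ships with its own entitlement and review process, is something Apple will presumably spell out in developer documentation.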
On Apple Watch, Live Captions are being introduced for the first time. When paired with an iPhone running Live Listen, the watch can display real-time captions of the audio the iPhone's microphone picks up. Users can also control Live Listen sessions directly from the watch, making it easier to follow conversations or media in settings like meetings and classrooms.
🌀 Tom's Take:
Wearable technology lends itself well to advancing accessibility, given how directly it taps into our senses. While immersive interfaces have proven to be great entertainment and productivity devices, their most powerful feature may just be making it easier for everyone to interact with the world on their own terms.
Source: Apple