How does Apple’s AI-powered ‘Eye Tracking’ feature help users with physical disabilities?


Apple has announced a new accessibility feature called “Eye Tracking” that will let users with physical disabilities control an iPad or iPhone with their eyes.

Powered by artificial intelligence (AI), Eye Tracking will give users a built-in option for navigating an iPad or iPhone with just their eyes.

Designed for users with physical disabilities, the feature uses the front-facing camera to set up and calibrate in seconds. Because it relies on on-device machine learning, all data used to set up and control the feature is kept securely on the device and is not shared with Apple, the company explained.

Eye Tracking works across iPadOS and iOS apps and doesn’t require additional hardware or accessories. The tech giant will roll out the feature later this year.

The feature will also let users navigate through the elements of an app and use Dwell Control to activate each element, giving them access to additional functions such as physical buttons, swipes, and other gestures solely with their eyes. With Dwell Control, an element is activated when the user’s gaze rests on it for a set amount of time.
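Apple has not published a developer API for Eye Tracking, but the dwell mechanism itself is straightforward to picture. The Swift sketch below is a hypothetical illustration, not Apple’s implementation: the `DwellDetector` type, its parameters, and its callback are all invented names. It receives a stream of gaze points and fires once the gaze has rested within a small radius for a threshold duration.

```swift
import Foundation

/// Hypothetical dwell detector, illustrating how Dwell Control might decide
/// when to activate an element: the gaze must rest within `radius` points of
/// one spot for at least `dwellThreshold` seconds.
final class DwellDetector {
    private let dwellThreshold: TimeInterval  // seconds the gaze must rest
    private let radius: Double                // jitter tolerance, in points
    private var anchor: CGPoint?              // where the current dwell began
    private var dwellStart: Date?

    /// Called with the dwell point once a dwell completes.
    var onDwell: ((CGPoint) -> Void)?

    init(dwellThreshold: TimeInterval = 1.0, radius: Double = 20) {
        self.dwellThreshold = dwellThreshold
        self.radius = radius
    }

    /// Feed one gaze sample, e.g. once per camera frame.
    func process(gaze: CGPoint, at time: Date = Date()) {
        if let anchor,
           hypot(Double(gaze.x - anchor.x), Double(gaze.y - anchor.y)) <= radius {
            // Gaze is still resting near the anchor; check the elapsed time.
            if let start = dwellStart, time.timeIntervalSince(start) >= dwellThreshold {
                onDwell?(anchor)
                self.anchor = nil             // require a fresh dwell next time
                dwellStart = nil
            }
        } else {
            // Gaze moved outside the tolerance: start a new dwell here.
            anchor = gaze
            dwellStart = time
        }
    }
}

// Example usage: activate whatever lies under a steady gaze.
let detector = DwellDetector(dwellThreshold: 1.5, radius: 15)
detector.onDwell = { point in print("Activate element at \(point)") }
detector.process(gaze: CGPoint(x: 100, y: 200))
```

In a real system the gaze samples would come from the calibrated camera pipeline, and the activation callback would trigger the tap, swipe, or button press the user has selected.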

Along with Eye Tracking, the company also introduced several other accessibility updates, including features for users who are blind or have low vision.

VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customise VoiceOver keyboard shortcuts on Mac. For users with low vision, Hover Typing will show larger text when typing in a text field, in a user’s preferred font and colour.
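Hover Typing is a system-level feature, but its behaviour is easy to picture. The SwiftUI sketch below is a hypothetical mock-up, not Apple’s implementation; the view name, font, and colour are invented. It simply mirrors a text field’s contents in larger type as the user types:

```swift
import SwiftUI

/// Hypothetical mock-up of the Hover Typing idea: mirror what is being typed
/// in larger text, in a chosen font and colour, above the field itself.
struct HoverTypingMockup: View {
    @State private var text = ""

    var body: some View {
        VStack(spacing: 16) {
            if !text.isEmpty {
                // Enlarged live preview of the field's contents.
                Text(text)
                    .font(.system(size: 34, weight: .semibold, design: .rounded))
                    .foregroundColor(.blue)
            }
            TextField("Type here", text: $text)
                .textFieldStyle(.roundedBorder)
        }
        .padding()
    }
}
```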

Users with physical disabilities will also be able to control their device with a Virtual Trackpad for AssistiveTouch, which turns a small, resizable region of the screen into a trackpad that drives the full display.
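Apple has not described how the Virtual Trackpad is implemented, but the core of any trackpad-style control is a coordinate mapping from the small region to the full screen. The Swift sketch below is an illustrative guess, not Apple’s code; the function name and the example geometry are invented:

```swift
import Foundation

/// Hypothetical coordinate mapping behind a virtual trackpad: a touch at a
/// point inside a small on-screen rectangle is scaled to the full screen.
func mapTrackpadPoint(_ touch: CGPoint, trackpad: CGRect, screen: CGRect) -> CGPoint {
    // Normalise the touch to 0...1 within the trackpad region...
    let nx = (touch.x - trackpad.minX) / trackpad.width
    let ny = (touch.y - trackpad.minY) / trackpad.height
    // ...then scale back up to screen coordinates.
    return CGPoint(x: screen.minX + nx * screen.width,
                   y: screen.minY + ny * screen.height)
}

// Example: a 200x150-point trackpad in the corner drives a 390x844-point screen.
let trackpad = CGRect(x: 0, y: 694, width: 200, height: 150)
let screen = CGRect(x: 0, y: 0, width: 390, height: 844)
let cursor = mapTrackpadPoint(CGPoint(x: 100, y: 769), trackpad: trackpad, screen: screen)
// cursor == (195.0, 422.0), the centre of the screen
```

Making the trackpad resizable only changes the `trackpad` rectangle; the same mapping applies, with smaller regions trading precision for reach.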
