Apple To Roll Out New Accessibility Features
Apple is set to launch a range of new capabilities aimed at making life easier for people with disabilities, including an option to control a phone or tablet using only a person's eyes.
The technology giant said this week that it will introduce a built-in eye-tracking option to its iPad and iPhone later this year.
Using the front-facing camera, users will be able to set up and calibrate their device within seconds to navigate apps, swipe, gesture and more, solely by focusing with their eyes. The functionality, which relies on artificial intelligence, does not require any additional hardware or accessories. Apple said all data will be stored on the device and not shared with the company.
In addition to eye tracking, iPhone and iPad will also include more options for those with speech issues. With Vocal Shortcuts, users will be able to tell Siri to launch shortcuts or complete various tasks using custom sounds. Meanwhile, a feature called Listen for Atypical Speech uses machine learning to understand a wider range of speech patterns resulting from conditions like cerebral palsy or stroke.
Apple is also adding live captioning to FaceTime and other applications, and will give those who are deaf or hard of hearing the ability to experience music through taps, textures and refined vibrations, among other accessibility enhancements.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, CEO of Apple. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”