Apple Announces New AI and ML Powered Eye Tracking and Music Haptics Accessibility Features
Apple announced several new accessibility-focused features for its iPhone and iPad devices on Wednesday. The company regularly introduces accessibility features to make its devices easier to use for people with disabilities. This year, the tech giant is bringing a new Eye Tracking feature that will allow users to control their device with eye movements alone. Additionally, Music Haptics will let users experience music via vibrations, and Vocal Shortcuts will let them perform tasks with custom sounds.
The features were announced via a post on the company’s newsroom. Apple’s senior director of Global Accessibility Policy and Initiatives, Sarah Herrlinger, said, “Each year, we break new ground when it comes to accessibility. These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”
First, the Eye Tracking feature gives users a built-in option for operating an iPhone or iPad with eye movements alone. Powered by artificial intelligence (AI), the feature uses the front-facing camera, which is calibrated with the user’s eyes, while on-device machine learning (ML) tracks eye movement so that people with physical disabilities can navigate the device. The company says it does not have access to this user data.
Music Haptics is another new feature, offering users with hearing impairments a new way to experience music. On iPhone, it leverages the Taptic Engine to play taps, textures, and vibrations matched to the audio of a track. Apple says the feature works with millions of songs in the Apple Music catalogue. It will also be available as an API for developers to integrate into their own music apps.
Next, Vocal Shortcuts for iPhone and iPad is designed to help people with speech-related disabilities. It allows users to set custom utterances that Siri can understand to launch shortcuts and complete tasks. Further, a new feature dubbed Vehicle Motion Cues adds animated dots to the edges of the screen to reduce the sensory conflict between what a person sees and what they feel. Citing research, Apple said this conflict is one of the main causes of motion sickness, and that the feature can reduce such symptoms.
Apart from these, CarPlay is also getting voice control, sound recognition, and colour filters to help users with various disabilities. Apple’s newest product, the Vision Pro, is also getting a system-wide live caption feature for those with hearing difficulties.