Apple Introduces Accessibility Features to Control Devices With Eyes and Music Haptics


While Google I/O was underway, Apple managed to steal the show by announcing several accessibility features that will allow people with disabilities to use their devices more comfortably. The feature that caught our attention the most was controlling the device with eye movements.

Thanks to the Eye Tracking feature, the iPhone and iPad can be controlled with eye movements tracked by the device's front camera. According to Apple, artificial intelligence is used to follow the user's eyes, with a calibration process that takes just a few seconds. Eye control is already supported on the Vision Pro, and it can now finally be used on iPhones and iPads as well, though not on macOS computers.

Next is Music Haptics, which allows the hearing impaired to enjoy music. Some may call this a strange feature, but the fact is that deaf people also want to enjoy music in their own way. The Taptic Engine on the iPhone vibrates in time with music played through Apple Music. For now, this feature will only be offered on the iPhone.

For your information, the Vocal Shortcuts feature can teach the device to understand a user's speech that has changed due to a stroke, ALS, and other conditions. The device learns how certain phrases are spoken by the owner, which can then be used to quickly launch apps. This feature is similar to Project Relate (originally Project Euphonia), introduced by Google in 2019.

In addition, Personal Voice now supports Mandarin Chinese for the first time. With this feature, people who are losing the ability to speak can record their voice, and the recording can then be used to speak on their behalf in the future. Previously, only English was supported.

Apart from that, several more accessibility features have also been added to Apple CarPlay and the Vision Pro. Hopefully, these features will arrive in Malaysia soon.

What are your thoughts about this news? Stay tuned for more news and updates like this at TechNave!