Apple Introduces AI-Powered Eye and Voice Controls for iPhone and iPad

Apple is introducing eye- and voice-based commands for controlling iPhones and iPads, improving accessibility by removing the need for additional hardware.

Eye Tracking uses the front-facing camera to track and calibrate eye movements, letting users navigate by focusing on different areas of the screen.

Includes "Dwell Control" which activates screen components where the user's gaze rests.

Vocal Shortcuts provide hands-free control, with a simpler setup than the existing manual Shortcuts system.

Features the "Listen for Atypical Speech" setting to accommodate speech variations from users with conditions like cerebral palsy, stroke, or ALS.

Music Haptics transforms audio into tactile feedback through the iPhone's Taptic Engine, designed for users who are deaf or hard of hearing.
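Apple's implementation is not public, but audio-to-haptic pipelines generally follow the audio's amplitude envelope: louder passages become stronger vibrations. A rough Python sketch of that mapping (the `audio_to_haptics` helper and the sample values are made up for illustration):

```python
def audio_to_haptics(samples, frame_size=4, max_intensity=255):
    """Map an audio waveform to per-frame haptic intensity levels.

    samples: floats in [-1.0, 1.0], one per audio sample.
    Returns one intensity (0..max_intensity) per frame, based on the
    peak amplitude within that frame -- a crude envelope follower.
    """
    intensities = []
    for start in range(0, len(samples), frame_size):
        frame = samples[start:start + frame_size]
        peak = max(abs(s) for s in frame)   # frame's amplitude envelope
        intensities.append(round(min(peak, 1.0) * max_intensity))
    return intensities

# A quiet frame yields a weak pulse; a loud frame yields a strong one.
print(audio_to_haptics([0.1, -0.05, 0.08, 0.0,    # quiet frame
                        0.9, -0.95, 0.7, 0.2]))   # loud frame
```

A production system would work on frequency bands and transients as well as raw amplitude, but the envelope-to-intensity mapping above captures the basic idea of "feeling" the music.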

Vehicle Motion Cues aids users prone to motion sickness by aligning on-screen motion with vehicle movement, reducing sensory conflict.

– Sound Recognition alerts users to sirens and car horns.
– Voice Control allows easier app navigation and operation.

These updates reflect Apple's ongoing commitment to inclusive design and using technology to enhance lives, as stated by CEO Tim Cook.
