Apple announced important new accessibility features coming to iOS 17, which is set for release later this year. Two of the most significant are “Live Speech” and “Personal Voice,” both of which can substantially help users who face speech difficulties. Live Speech lets users type what they want to say and have it spoken aloud.
Personal Voice, on the other hand, is a feature that helps people who have difficulty speaking or are at risk of losing their voice due to conditions such as amyotrophic lateral sclerosis (ALS). With this function, they can create and save a synthesized voice that sounds like their own.
That voice is then integrated with Live Speech, so users can speak with their Personal Voice on FaceTime calls and during in-person conversations.
Additionally, Apple is introducing streamlined versions of its core apps as part of a feature called Assistive Access, meant to support users with cognitive disabilities. The feature is designed to “distill apps and experiences to their essential features in order to lighten cognitive load.”
There’s also a new detection mode in Magnifier for users who are blind or have low vision, designed to help them interact with physical objects that carry numerous text labels.
Taken together, these features have the potential to break down barriers, facilitate communication, and empower individuals with diverse needs.