Ahead of Global Accessibility Awareness Day on May 18, Apple on Tuesday previewed software features for cognitive, vision, hearing, and mobility accessibility, along with tools for individuals who are non-speaking or at risk of losing their ability to speak.
"Today, we're excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love," said Tim Cook, chief executive officer (CEO) at Apple.
"These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways," said Sarah Herrlinger, senior director of Global Accessibility Policy and Initiatives at Apple.
Below are the details of the new features:
With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken out loud during phone and FaceTime calls, as well as in in-person conversations. Users can also save commonly used phrases. The feature is aimed primarily at those who are unable to speak or who have lost their speech over time.
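Apple has not published implementation details, but the basic flow described here — typed text handed to a speech engine, plus a bank of saved phrases that can be replayed with one tap — can be sketched roughly as follows. The `PhraseBank` class and `speak` callback are purely illustrative, not Apple APIs:

```python
# Illustrative sketch of a Live Speech-style flow: typed text is passed
# to a text-to-speech callback, and commonly used phrases can be saved
# and replayed. All names here are hypothetical, not Apple APIs.

class PhraseBank:
    """Stores commonly used phrases so they can be spoken with one tap."""

    def __init__(self, speak):
        self.speak = speak   # callback that voices a string
        self.saved = []      # commonly used phrases

    def say(self, text):
        """Speak freshly typed text during a call or conversation."""
        self.speak(text)

    def save_phrase(self, text):
        """Add a phrase to the bank of commonly used phrases."""
        if text not in self.saved:
            self.saved.append(text)

    def say_saved(self, index):
        """Replay a saved phrase by its position in the list."""
        self.speak(self.saved[index])


# Usage: collect everything "spoken" in a list for demonstration.
spoken = []
bank = PhraseBank(spoken.append)
bank.save_phrase("I'll call you back in five minutes.")
bank.say("Hello, can you hear me?")
bank.say_saved(0)
```

In a real app, the `speak` callback would drive the platform's speech synthesiser; here it simply records the output so the flow can be followed.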
Users can also create a personal voice by reading along with randomly generated text prompts to record 15 minutes of audio on iPhone or iPad. Apple said the feature uses on-device machine learning to keep users' information private and secure, and that it integrates with Live Speech. It is designed for users with conditions that can progressively affect speaking ability, such as a recent diagnosis of amyotrophic lateral sclerosis (ALS).
Point and Speak in Magnifier
The Point and Speak feature in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. For example, while using a household appliance such as a microwave, the feature combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their fingers across the keypad.
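Apple has not detailed the pipeline, but the core lookup step described above — resolving a fingertip position against text regions recognised by the camera so the right label is announced — amounts to simple geometry. The region format and function below are hypothetical, intended only to illustrate the idea:

```python
# Hypothetical sketch of the label-lookup step in a Point and Speak-style
# feature: given text regions recognised on a device (bounding boxes with
# labels) and a fingertip position, find the label under the finger.

def label_under_finger(regions, finger_xy):
    """Return the label whose bounding box contains the fingertip, if any.

    regions: list of (label, (x_min, y_min, x_max, y_max)) tuples,
             e.g. from on-device text recognition (illustrative format).
    finger_xy: (x, y) fingertip position in the same coordinate space.
    """
    fx, fy = finger_xy
    for label, (x0, y0, x1, y1) in regions:
        if x0 <= fx <= x1 and y0 <= fy <= y1:
            return label
    return None  # finger is not over any recognised text


# Usage: a microwave keypad with three recognised buttons.
keypad = [
    ("Start", (0, 0, 40, 20)),
    ("Stop", (0, 30, 40, 50)),
    ("+30 sec", (0, 60, 40, 80)),
]
print(label_under_finger(keypad, (10, 35)))    # "Stop"
print(label_under_finger(keypad, (100, 100)))  # None
```

In the shipping feature, the label returned by a step like this would be passed to the screen reader to be spoken aloud.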
Point and Speak is built into the Magnifier app on iPhone and iPad and works with VoiceOver. It can also be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.
The Assistive Access feature uses innovations in design to distil apps and experiences to their essential features, lightening the cognitive load. It offers a distinct interface with high-contrast buttons and large text labels, as well as tools that help trusted supporters tailor the experience for the individuals they support.
This includes a customised experience for the Phone and FaceTime apps, which have been combined into a single Calls app, as well as for Messages, Camera, Photos, and Music.
For example, for users who prefer communicating visually, the Messages app includes an emoji-only keyboard and the option to record a video message to share with others. Users can also choose between a more visual, grid-based layout for the home screen and apps, or a row-based layout for those who prefer text.
Deaf or hard-of-hearing users can pair hearing devices directly to a Mac and customise them for their hearing comfort.
Moreover, the Voice Control feature adds phonetic suggestions for text editing, so users who type with their voice can choose the right word from among several that sound alike.
Users with physical and motor disabilities can use the Switch Control feature to turn any switch into a virtual game controller and play their favourite games on iPhone and iPad.
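Switch access generally works by scanning: a highlight cycles through on-screen controls, and pressing the physical switch activates whichever control is currently highlighted. The sketch below illustrates that general technique for a virtual game controller; the class and method names are hypothetical, not Apple APIs:

```python
# Illustrative sketch of switch scanning, the general technique behind
# switch-driven input: a highlight cycles through virtual controller
# buttons, and a single switch press activates the highlighted one.
# Names are hypothetical, not Apple APIs.

class SwitchScanner:
    def __init__(self, buttons):
        self.buttons = buttons  # virtual controller buttons, in scan order
        self.cursor = 0         # index of the currently highlighted button

    def advance(self):
        """Move the highlight to the next button (one auto-scan tick)."""
        self.cursor = (self.cursor + 1) % len(self.buttons)

    def press_switch(self):
        """Activate the highlighted button, as a switch press would."""
        return self.buttons[self.cursor]


# Usage: scan across four face buttons, then press the switch.
scanner = SwitchScanner(["A", "B", "X", "Y"])
scanner.advance()             # highlight moves from "A" to "B"
pressed = scanner.press_switch()
print(pressed)                # "B"
```

With a single input and timed scanning like this, one switch is enough to reach every button on the virtual controller.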