Apple Adds Live Speech and AI-Based Voice Generation in iOS 17

Apple is now previewing several iOS 17 features focused on enhancing speech, vision, and cognitive accessibility on iPhones and iPads. Among the improvements are text-to-speech and AI-based personal voice generation, which are expected to roll out along with Assistive Access later this year.

Apple continues to build on existing accessibility features while introducing new capabilities at the same time. Live Speech and Assistive Access are entirely new functions, while Point and Speak is an update to Apple’s built-in Magnifier app, which also received Door Detection in iOS 16 last year.

Live Speech and voice generation come to iOS 17

Live Speech is the iPhone maker’s take on text-to-speech for people who are unable to speak, integrated into FaceTime and regular phone calls. Users can type text and the feature reads it aloud to the person or group they’re on a call with. Apple says it will arrive on iPhones, iPads, and Macs.

iOS 17’s Live Speech and Personal Voice advanced speech features / © Apple, Edit by NextPit
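For context, the underlying capability here, on-device text-to-speech, is already exposed to developers through Apple’s public AVSpeechSynthesizer API. The sketch below is not Apple’s Live Speech implementation, just a minimal, hypothetical illustration of the same basic idea: typed text read aloud on an iPhone.

```swift
import AVFoundation

// Minimal sketch of on-device text-to-speech with Apple's public
// AVSpeechSynthesizer API. This is not Apple's Live Speech feature,
// only an illustration of reading typed text aloud.
final class TypedTextSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    /// Speaks the given text with a system voice.
    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Example usage: TypedTextSpeaker().speak("I'll be a few minutes late.")
```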

Meanwhile, the other speech-related feature is Personal Voice, which relies on on-device machine learning. It lets users create a synthesized version of their own voice by reading prompted text for 15 minutes. This helps those who are at risk of losing their voice due to conditions like ALS (amyotrophic lateral sclerosis). The technology behind it resembles Amazon’s AI voice generation announced last year.

Assistive Access simplifies the Apple app experience

Apple’s new Assistive Access provides a customized app experience. Rather than expanding controls and input options, the feature simplifies the use of apps for seniors and people with cognitive and visual disabilities by presenting enlarged buttons and text. It supports apps like Camera, Messages, and Phone as well as Apple Music and Photos.

iOS 17’s Assistive Access shows a simplified interface for several iPhone and iPad apps. / © Apple, Edit by NextPit

Magnifier gets Point and Speak

In addition to detecting people and doors and reading out image descriptions, the Magnifier on iPhones can now identify more objects via Point and Speak, which should further aid visually impaired individuals. It works with the short text labels found on appliances. For example, a user can point at a button on a microwave and the app will read out its label.

There are also other accessibility updates and additions. For those with motor disabilities, Apple’s Switch Control will turn a switch into a virtual game controller. Furthermore, Apple said that Made for iPhone hearing devices will now be compatible with Macs. Voice Control is gaining phonetic suggestions for text editing, while the speed at which Siri speaks can now be customized.

Apple didn’t confirm whether all of these improvements will debut with iOS 17 or arrive separately in a later minor update. Likewise, we want to know: which of these features do you find most helpful?

Source
