Apple details upcoming AI-driven iOS 17 accessibility features

by
Samuel Axon
from Ars Technica

Apple plans to roll out new accessibility features to its devices in iOS 17. [credit: Apple]

Around this time last year, Apple previewed several of the accessibility features that would be added to iOS 16, which launched last fall. Now it seems that has become a tradition; today, Apple published details about several upcoming features in iOS 17 that are meant to help users with speech, vision, and cognitive disabilities use the company's devices more effectively.

For example, nonspeaking people will be able to type what they want to say and have it converted to synthesized speech on a call. Many of the new features like this rely on machine learning. In another example, "those at risk of losing their ability to speak can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends," Apple writes.

Other features are purely design-oriented. For users with cognitive disabilities, Apple will roll out Assistive Access, which redesigns apps such as Photos, Camera, and Music to reduce cognitive load and make them easier to use. There will also be ways to focus communication on visual media, like recording short videos.

