Hands-on with 5 powerful accessibility features in iOS 17

If you haven’t looked into any of Apple’s accessibility features because you aren’t blind or deaf and don’t think they would make your life easier, you might be surprised.

Apple has built a handful of accessibility features into iOS 17 that allow people with various disabilities to use the iPhone in new and unexpected ways. However, absolutely anyone can take advantage of these tools, which prove to be surprisingly useful in certain situations.

You can already get live subtitles to watch videos silently, lock your phone in one app to stop people from snooping, play soothing ocean or forest sounds, and more.

In iOS 17, five accessibility features take things even further. Assistive Access pares your phone down to its most basic functions to make it easier to use; Live Speech and Personal Voice let you type on the keyboard to speak aloud, even in a synthesized version of your own voice; Detection Mode and Point and Speak help you navigate the world using the iPhone camera.

Our hands-on demo will show you what these features can do for you.

Hands-on with 5 new accessibility features in iOS 17

Apple previewed these five new accessibility features earlier this year for Global Accessibility Awareness Day. Now that we’re digging into the iOS 17 developer betas, we finally have a chance to see how they work.

See all these features in action in this video:

Please note that Apple has not yet released iOS 17 to the public. Since this is a beta release of the software, things may change as Apple engineers tweak iOS 17 before its final release.

You can get the latest developer beta by turning on beta software updates in Settings. Be warned, though: beta software is unstable, may drain your battery faster and carries a risk of data loss. If you can wait, iOS 17 should see a public release this September.

#1: Assistive Access

Assistive Access Home screen and camera
Assistive Access is a pretty foolproof interface that you can switch to.
Screenshot: D. Griffin Jones/Cult of Mac

The Assistive Access feature simplifies some of the standard iPhone apps down to their most basic functions. It lightens the cognitive load of keeping in touch with loved ones on a modern device.

I remember watching TV ads for cell phones like the Jitterbug flip phone, marketed to older users with huge buttons and a limited, easy-to-use feature set. Assistive Access looks like a re-imagining of that idea for 2023.

Messaging and NetNewsWire running in Assistive Access
A simple iMessage interface, and all your regular apps too.
Screenshot: D. Griffin Jones/Cult of Mac

If someone you know would be overwhelmed by the full capabilities of a smartphone, you can turn on Assistive Access to make the iPhone easier to use. It transforms the iPhone’s user interface, showing large buttons for the main functions like Messages, Calls and Camera.

With this feature turned on, taking pictures, making calls and sending messages become much easier. And you can still enable any of the regular apps from the App Store if you need to install something else, like a health-tracking app.

#2: Live Speech

Live Speech running on iPhone
You never know when you might get sick and lose your voice. Getting familiar with this feature might help you in a pinch.
Screenshot: D. Griffin Jones/Cult of Mac

If you’re losing your voice, Live Speech gives you the ability to turn text into speech, in person or on a call. To find the feature, go to Settings > Accessibility > Live Speech (near the bottom of the list). Turn on Live Speech and tap Favorite Phrases to create some shortcuts for phrases you say often.

Later, you can bring up Live Speech by triple-clicking the iPhone’s side button or from the Accessibility Shortcuts in Control Center.

Then just type using the keyboard in the popup and hit Send, and your iPhone speaks the text aloud. For longer sentences, it highlights each word as it’s spoken. Tap Favorite Phrases to access your saved shortcuts.
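Incidentally, the same type-to-speak machinery is available to developers through AVFoundation’s speech-synthesis API. Here’s a minimal Swift sketch of the idea; it uses only the public AVSpeechSynthesizer API, not Live Speech itself, and the sample phrase is made up:

```swift
import AVFoundation

// Minimal sketch: system text-to-speech via the public AVFoundation API.
// (Live Speech is a system feature; this just shows the same
// type-to-speak idea an app could implement.)
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

speak("Hi, I've lost my voice. Could I get a black coffee, please?")
```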

#3: Personal Voice

Setting up Personal Voice on iPhone. One of the example sentences is “Did you write anything else?”
Setup prompts you to read a series of random, non-sequitur sentences, like this one I never, ever hear from my editors.
Screenshot: D. Griffin Jones/Cult of Mac

The Personal Voice feature builds on Live Speech, letting it speak in a synthesized version of your own voice. You just need to spend some time reading a series of sentences aloud, and afterward your phone can recreate your voice. Apple designed Personal Voice for users at risk of losing the ability to speak, such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability.

However, as with any accessibility feature, anyone can use it. To set it up, go to Settings > Accessibility > Personal Voice and tap Create a Personal Voice. You’ll need to find a quiet space and hold the phone about 15 cm (6 inches) from your face while you read the prompts aloud. This can take from 15 minutes up to an hour, according to Apple.

Once you’re done recording, you need to leave your iPhone plugged in for a while as it churns through the recordings and creates a synthesized recreation of your voice. Then you’ll see your Personal Voice as an option under Live Speech.

In practice, I can definitely tell it was built from recordings of my voice. It sounds a bit like me, but it isn’t as emotive and expressive as some of the newer, more advanced Siri voices. Listen for yourself.
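Developers get a piece of this, too: iOS 17 adds an API that lets third-party apps speak with your Personal Voice, but only after you explicitly grant access. A minimal Swift sketch of that flow (the sample phrase and the fallback comment are my own):

```swift
import AVFoundation

// Sketch: speaking with a user's Personal Voice from a third-party app
// (authorization API new in iOS 17). The user must grant access first.
let synthesizer = AVSpeechSynthesizer()

AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }

    // Find a Personal Voice the user has created and shared with apps.
    let personalVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.voiceTraits.contains(.isPersonalVoice) }

    let utterance = AVSpeechUtterance(string: "Hello from my Personal Voice.")
    utterance.voice = personalVoice  // if nil, the system default voice is used

    synthesizer.speak(utterance)
}
```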

#4: Detection Mode

Open the Magnifier app and tap the Detection Mode button.
The Magnifier app is a great Swiss army knife of accessibility features.
Screenshot: D. Griffin Jones/Cult of Mac

With Detection Mode, you can use your iPhone’s camera to identify things around you, like people, doors, appliances and other objects. It’s available within Apple’s Magnifier app, which you can download for free from the App Store if it’s not already on your iPhone.

To use Detection Mode, open the Magnifier app and tap the Detection Mode button on the right (it looks like a square). On the left, tap the icons to activate the detection features:

Door Detection and Image Descriptions
Let the power of your iPhone’s camera help you see what’s around you.
Screenshot: D. Griffin Jones/Cult of Mac

  • People Detection shows you how close you are to someone else.
  • Door Detection tells you how close you are to a door, how to open it, what’s written on it and various other attributes. Haptic feedback and clicking noises get faster and louder the closer you get.
  • Image Descriptions tells you what your camera is pointing at using object detection. I got mixed results: if you live in a house with hardwood floors, be prepared for “Wood Processed” to stay on screen the entire time. Sometimes the feature stalled during my tests, not updating for a while as I pointed it around the room. But when it works, it works really well. (A rough DIY version of the idea appears in the sketch after this list.)
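If you’re wondering how far you could get building something like Image Descriptions yourself, Apple’s public Vision framework does whole-image classification. A hedged Swift sketch, not Detection Mode’s actual implementation; the 0.5 confidence cutoff is an arbitrary assumption:

```swift
import UIKit
import Vision

// Sketch: a rough DIY stand-in for Image Descriptions using the public
// Vision framework. (Detection Mode is a system feature; this is not
// its implementation, just the same idea in miniature.)
func describeObjects(in image: UIImage) throws -> [String] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])

    // Keep only reasonably confident labels (threshold is arbitrary),
    // e.g. "wood" or "hardwood" for that ever-present floor.
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```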

#5: Point and Speak

Using Point and Speak on an oven.
Point at tiny, low-contrast text and have it read aloud to you.
Screenshot: D. Griffin Jones/Cult of Mac

Point and Speak works in tandem with Detection Mode. In Detection Mode, tap the Point and Speak icon at the bottom. Then you can reach out, point at something, and have your iPhone read it to you.

If you find the markings on your oven dial hard to see, for example, your iPhone can read you the numbers so you don’t burn your roast (or your house). You have to hold your hand out quite distinctly in front of the camera, but Point and Speak worked well enough in my limited tests.
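Conceptually, a Point and Speak-style feature combines two public building blocks: on-device text recognition and speech synthesis. Here’s a minimal Swift sketch of that combination; it is not Apple’s implementation, and the finger-pointing part is omitted entirely:

```swift
import UIKit
import Vision
import AVFoundation

// Sketch: recognize text in a camera frame, then speak it aloud.
// (Point and Speak also tracks where your finger points; that part
// is omitted here.)
let synthesizer = AVSpeechSynthesizer()

func readTextAloud(from image: UIImage) throws {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])

    // Take the best candidate string from each detected text region.
    let lines = (request.results ?? [])
        .compactMap { $0.topCandidates(1).first?.string }

    synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
}
```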

More features and coverage of iOS 17

Stay tuned as we continue to cover new features coming to iOS 17.

