Apple makes iOS more accessible with new updates

KUALA LUMPUR, May 22 — Global Accessibility Awareness Day (GAAD) falls on the third Thursday of May each year — this year, May 18 — focusing on inclusivity for the disabled in the digital space as well as spreading awareness.

As the disabled activists I know frequently say: Designing for the disabled means designing for everyone.

If something is accessible to the disabled, it is accessible to every single person, and it’s always great to see how accessibility in tech is highlighted each year.

Apple has made accessibility a core feature in its software design and the next iOS update continues in that vein.

A challenge for users with cognitive disabilities is dealing with all the various icons and options on a screen, and so the upcoming iOS 17 makes that easier with Assistive Access.

Instead of struggling with icons, apps and experiences are instead minimised to a highly simplified interface with larger buttons and labels.

Users can even choose whether they prefer a row-based text layout or a visual grid. The Calls app, for instance, combines Phone and FaceTime into a single app, alongside streamlined versions of Messages, Camera, Photos and Music.

An interesting new feature is Personal Voice, where you can create a digital voice that sounds like you.

Creating one takes about 15 minutes: you read a set of text prompts aloud to record the audio, and on-device processing then generates your own individual digital voice, with no further training needed.

While the feature is aimed at those at risk of losing the ability to speak, such as people with degenerative diseases, I could see it also being used by people with chronic anxiety or speech impediments.

Personally I have benefited from vision accessibility on iOS — my ageing eyes mean I often need to magnify text on my screen, so I use the Magnifier app frequently.

It’s a lot less embarrassing to tap my screen with three fingers to enlarge items on my display than to bring my phone up to my face (though I still do that sometimes).

With the update, the Magnifier app will include Point and Speak — it uses the Camera app, LiDAR scanner and on-device processing to identify physical objects with text labels, making them easier to interact with.

Point your phone at, say, a microwave and the Magnifier app will read out the text on the buttons.

Seeing as my own microwave has only icons instead of text buttons, I guess I won’t be using my phone to help me not squint, but perhaps I will stick text labels on it in preparation for the inevitable deterioration of my eyes.

These and many other new features will be coming to iOS later this year so I guess I’ll have plenty of time to put labels on things for convenience or just to hear my phone say out loud “dog” when I put a sticker on my mutt’s forehead.

You can learn more about Apple’s accessibility features across its platforms at https://www.apple.com/my/accessibility/