Accessibility ensures everyone has equal access to resources, tools, and opportunities. In the world of technology, strides are being made every day to improve the accessibility of digital tools and services. Artificial intelligence is playing a massive role in making technology more equitable and in meeting accessibility needs.

There are permanent, temporary, and situational disabilities, each with specific accessibility needs. Someone in a wheelchair may have permanent or temporary accessibility needs. But even something less drastic, like twisting an ankle while playing soccer or being colorblind and unable to see important labels on a website, can count as a disability.

Ensuring we design our technology to be more accessible doesn’t just improve the experience for a select few; it improves experiences for everyone. Millions of people are affected by all kinds of disabilities, and many of us have experienced a situational disability at one point or another. Perhaps you’ve just had a baby and are pushing a stroller, so you need a path other than the stairs; that would constitute a situational disability.

Stephen Hawking is a great example of the power of AI: a brilliant scientist with very limited muscle movement, Hawking used a switch activated by his cheek to type. The switch was paired with predictive text, an AI feature that helped him compose messages faster, much like the AutoCorrect and SwiftKey features our smartphones are equipped with today.
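To illustrate the idea behind predictive text, here is a minimal sketch of a frequency-based next-word suggester in Python. This is a toy bigram model, not Hawking's actual system or any commercial keyboard's algorithm; the function names and the tiny training corpus are invented for the example.

```python
from collections import Counter, defaultdict

def build_bigram_model(text):
    """Count which word tends to follow each word in a training corpus."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        model[current][following] += 1
    return model

def predict_next(model, word, k=3):
    """Suggest the k most frequent followers of `word`."""
    return [w for w, _ in model[word.lower()].most_common(k)]

# A tiny made-up corpus; real systems train on far larger text.
corpus = (
    "thank you very much thank you for your help "
    "thank you for your time thank you very much indeed"
)
model = build_bigram_model(corpus)
print(predict_next(model, "thank"))  # ['you']
```

Real predictive keyboards use much richer models (longer context, personalization, neural networks), but the core idea is the same: rank likely continuations so the user taps a suggestion instead of typing every letter.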

What are some other common use cases of AI for accessibility?

Live captions

Live captions are especially helpful for deaf, hard-of-hearing, and neurodivergent viewers who learn best through visuals. They’re also useful for the modern-day TV watcher who's always multitasking. Adding captions makes it easier to follow what’s going on and makes content more accessible.

Content reading

There are a lot of ways AI can be built into content creation to help accessibility. Image recognition can describe visual information for those with visual accessibility needs, and AI-generated voice-overs provide an auditory experience that helps visually impaired or neurodivergent users who struggle with reading.

Tools like Be My Eyes use AI to read text, recognize faces, and identify objects in the user's surroundings. JAWS (Job Access With Speech) is another great tool, a screen reader that reads on-screen content aloud to give visually impaired users access to digital information.

Translation services

Translation services are especially helpful for anyone trying to understand video, text, or audio created in a language other than their own. The capability is even built into daily tools like PowerPoint, whose subtitle feature can automatically translate spoken text.

Voice control and input

From Alexa and Siri to voiceovers and speech recognition, the sky is the limit when it comes to AI’s ability to create accessible digital experiences.

Voice control and input can be crucial for those with motor impairments, letting them direct their computer by voice. Even Stephen Hawking incorporated this technology into his home office so he could communicate virtually. Voice control is also great for situational moments when your hands are full, letting you use technology without any touch or body movement.

Dragon NaturallySpeaking is speech recognition software that lets you use your computer hands-free, converting spoken words into written text so that anyone with dexterity challenges can control their device.

Eventually, AI will be able to adapt its tools to suit our individual needs in a one-stop shop. We're getting close: digital interfaces can now be built to custom design specs that center the individual’s needs and preferences. There is no one-size-fits-all way to present information to someone with or without an accessibility need, because every impairment varies from person to person and can change with circumstances, from the time of day to a person's mood.

By incorporating accessibility features like larger font sizes for readers with dyslexia or a modified color palette for those with color blindness, technology can be personalized to suit your needs, preferences, and desires. The Accessibility Insights for Web tool analyzes web pages for accessibility failures (think missing alt text on images, low-contrast text, and improperly labeled forms). Google Assistant is another personalized tool: it takes voice input, responds with personalized answers, and can be customized for hands-free use.
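Automated checks like the missing-alt-text rule mentioned above can be surprisingly simple. The sketch below is a hypothetical example using only Python's standard library, not how Accessibility Insights actually works internally; it flags `<img>` tags that lack alt text.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the src of every <img> tag missing a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing_alt.append(attr_map.get("src", "<no src>"))

# A small made-up page fragment for the example.
page = """
<img src="chart.png" alt="Quarterly sales, rising from Q1 to Q4">
<img src="logo.png">
<img src="photo.jpg" alt="">
"""
checker = AltTextChecker()
checker.feed(page)
print(checker.missing_alt)  # ['logo.png', 'photo.jpg']
```

A production checker would be more nuanced; for instance, an intentionally empty `alt=""` is the correct markup for purely decorative images, while this toy version flags it to keep the example short.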

That's not to say that AI is perfect. These systems still make mistakes and are known to carry ableist biases, and designers should be careful not to repeat those patterns. But from cognitive support, like setting reminders and organizing tasks, to more tactile support, like screen readers and built-in image recognition, there are many ways the technology can be built for good.
