Hearing is seeing - AI for better UX accessibility
Microsoft has launched an AI-powered solution for visually impaired people, creating new possibilities to better interpret the world around them and to access online services and information.
There is a lot happening around AI, but so far most of it has been large-scale industrial and commercial automation; we haven't seen much in the way of UX development. (Well, except some awful marketing chatbots.) One of the most valuable examples of how AI can be used is in improving accessibility.
A genuinely life-changing example of AI in service development is Seeing AI, Microsoft's app for the visually impaired. In a demo earlier this year it was introduced by Jenny Lay-Flurrie, Microsoft's Chief Accessibility Officer, who is herself deaf, and the app was then demonstrated by a visually impaired solution architect.
Seeing AI works by describing people, objects, and even text to "narrate the world around you." The app leverages AI to recognise friends and describe their emotions. With strangers, it can describe their gender, estimated age, and what they're wearing. It will also recognise short text snippets and full documents and read them back to you.
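A pipeline behind a feature like this typically sends a camera frame to an image-captioning service and turns the best caption into speech. Here is a minimal sketch of that last step, assuming a response shaped like Azure Computer Vision's "describe" output; Seeing AI's internal API is not public, so the shape and the `speak_description` helper are illustrative assumptions:

```python
def speak_description(analysis: dict) -> str:
    """Turn an image-captioning response into one spoken sentence.

    Assumes a dict shaped like Azure Computer Vision's "describe"
    result: {"description": {"captions": [{"text": ..., "confidence": ...}]}}.
    """
    captions = analysis.get("description", {}).get("captions", [])
    if not captions:
        return "I could not describe this image."
    # Pick the caption the service is most confident about.
    best = max(captions, key=lambda c: c.get("confidence", 0.0))
    # Hedge out loud when the model itself is unsure.
    prefix = "Probably" if best.get("confidence", 0.0) < 0.7 else "I see"
    return f"{prefix} {best['text']}."

sample = {
    "description": {
        "captions": [
            {"text": "a person smiling at the camera", "confidence": 0.91}
        ]
    }
}
print(speak_description(sample))  # I see a person smiling at the camera.
```

The hedging prefix mirrors how such apps communicate model uncertainty to the user rather than stating a low-confidence guess as fact.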
Like many tools powered by AI, it's not perfect, but it will learn and improve over time as it gains access to more data and more use.