People who are blind and visually impaired use their mobile phones on the go, just like their sighted counterparts. But it’s difficult to do, and sometimes even dangerous. Davide Bolchini, chair of the human-centered computing department in the Indiana University School of Informatics and Computing at IUPUI, is trying to fix that.
His efforts are connected to a research interest he has pursued since he was a graduate student in Switzerland two decades ago: “How can we help blind and visually impaired people navigate the web when they cannot see the screen? How can we make navigation more efficient?”
When Bolchini joined the faculty at the School of Informatics and Computing in 2008, he saw an opportunity to dive even deeper into designing navigation for auditory systems by collaborating with the Indiana School for the Blind and Bosma Enterprises, which helps Hoosiers with vision loss achieve independence.
Over the last 10 years, Bolchini has involved about 40 graduate and undergraduate students in his research, which has been awarded more than $500,000 in external funding. That includes two Google Faculty Research awards totaling $96,079 as well as a $472,311 National Science Foundation grant.
The Google Faculty Research award program funds world-class technical research in computer science, engineering and related fields. The NSF grant funded Bolchini’s proposal to establish advanced design strategies for the aural navigation of complex web information architectures, for which users exclusively or primarily listen to, rather than look at, content and navigational prompts.
“What really interests me in working with the blind and visually impaired is that they have a different perception ability,” Bolchini said. They can comprehend text-to-speech at five or six times the rate of a standard conversation.
“That forced me to look at this from a different viewpoint,” he said. “This, to me, is the essence of research – to look at things from a different angle.”
With his latest Google Faculty Research award, Bolchini is investigating auditory keyboards that use rapid streams of text-to-speech characters, leveraging the exceptional auditory processing abilities of the blind and visually impaired.
Currently, blind and visually impaired mobile phone users type on accessible keyboards that read keys aloud upon touch. But that forces users to keep their phones out at all times, increasing the risk of the device being dropped or stolen. Moreover, users must hold the device with one hand as they use the other to slide their fingers in search of symbols they cannot see. That’s impractical when a blind or visually impaired person is on the move and often is holding onto a cane, a guide dog or another person.
With an auditory keyboard, users would be able to keep their phones in a pocket while letters selectable by a hand or arm motion are read aloud to them through earphones at a very fast rate.
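The core idea can be illustrated with a minimal sketch. This is not Bolchini’s implementation; it is a hypothetical simulation in which characters are streamed aloud at a fixed rate and a gesture at a given moment selects whichever character is currently being spoken (the character set, stream rate, and function names here are all assumptions for illustration):

```python
import string

# Illustrative sketch only: an auditory keyboard loops over a character
# stream at a fast text-to-speech rate; a "select" gesture at time t
# picks the character being spoken at that instant.

CHARS = string.ascii_lowercase   # hypothetical character stream
RATE = 6.0                       # characters per second (assumed fast TTS rate)

def char_at(t: float) -> str:
    """Return the character being read aloud t seconds into the stream."""
    index = int(t * RATE) % len(CHARS)
    return CHARS[index]

def type_word(select_times) -> str:
    """Simulate typing by selecting at a list of gesture timestamps."""
    return "".join(char_at(t) for t in select_times)
```

In this toy model the user never looks at or touches a screen: timing a hand or arm motion against the audio stream is the entire input channel, which is the property that lets the phone stay in a pocket.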
“We are interested in understanding the fundamental language of ‘screenless’ interaction for typing and browsing, an approach that goes beyond voice or touch interfaces and can untether the blind or visually impaired from continuous attachment to a screen while fully leveraging their outstanding ability to listen to extremely fast text-to-speech,” Bolchini said.
Working prototypes of auditory keyboards are being iteratively developed and tested with blind and visually impaired people both at IUPUI and at Bosma Enterprises. A second round of testing is expected to occur within a few weeks.
Bolchini has disclosed the invention to the IU Innovation and Commercialization Office, which protects, markets and licenses intellectual property developed at Indiana University so it can be commercialized by industry.