This funded project seeks to make coding more accessible for developers with physical impairments.
People with physical impairments who are unable to use traditional input devices (i.e. a mouse and keyboard) are often excluded from technical professions (e.g. web development).
Alternative input methods such as eye gaze tracking and speech recognition have become more readily available in recent years, and each has been explored independently to support people with physical impairments in coding activities.
However, little work to date has explored the potential of a multimodal approach (where both interaction methods are combined) to make coding more accessible.
The project has been funded by Microsoft's AI for Accessibility scheme.
We want to explore the potential of combining multiple alternative methods of input (i.e. gaze interaction, speech recognition, and large mechanical switches) to make coding more accessible for people with physical impairments. Our longer-term aim is to create a platform that supports disabled people in developing professional-level skills in software/application development, thus opening up new career opportunities.
How has the research been carried out?
The system uses voice input for verbal commands such as selecting, navigating, and removing code, as well as for dictating longer-form non-code text (e.g. comments). Gaze is used to provide a more controlled approach to writing code via an on-screen keyboard. To address slow typing speeds, we integrated "Emmet" support, enabling users to write HTML/CSS code via a shorthand notation that expands into full markup.
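As an illustration of the kind of shorthand Emmet provides (this is a standard Emmet abbreviation, not specific to Voiceye), typing the abbreviation `ul>li*3` and triggering expansion produces a complete list structure:

```html
<!-- Emmet abbreviation: ul>li*3 -->
<!-- ">" nests an element; "*3" repeats it three times -->
<ul>
  <li></li>
  <li></li>
  <li></li>
</ul>
```

A few keystrokes (or gaze selections) thus generate many characters of markup, which is why shorthand expansion is well suited to slower input methods such as gaze typing.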
To evaluate "Voiceye", we have worked closely with both non-disabled and disabled developers in a series of user studies investigating the potential of this type of approach.
Our research has resulted in the development of the Voiceye application, which is freely available for download. We are continuing to investigate and extend the capabilities of the application and will be releasing future updates.
More info on the project can be found here.