
Inclusive Coding for Disabled Developers

This funded project seeks to make coding more accessible for developers with physical impairments.

Research background

People with physical impairments who are unable to use traditional input devices (i.e. mouse and keyboard) are often excluded from technical professions (e.g. web development).

Alternative input methods such as eye gaze tracking and speech recognition have become more readily available in recent years, and both have been explored independently to support people with physical impairments in coding activities.

However, there has been a lack of work to date exploring the potential of a multimodal approach (where both interaction approaches are combined) to make coding more accessible.

The project has been funded by Microsoft's AI for Accessibility scheme.

Research aims

We want to explore the potential of combining multiple alternative methods of input (i.e. gaze interaction, speech recognition, and large mechanical switches) to make coding more accessible for people with physical impairments. Our longer-term aim is to create a platform that supports disabled people in developing professional-level skills in software/application development, thus presenting new career opportunities.

How has the research been carried out?

We have built a new development application (“Voiceye”) that combines eye gaze, voice, and mechanical switches as an approach for writing HTML, CSS, and JavaScript code.

The system uses voice input for verbal commands such as selecting, navigating, and removing code, as well as for dictating longer forms of non-code text (e.g. comments). Gaze is used to provide a more controlled approach to writing code via an on-screen keyboard. To address issues with slow typing speeds, we integrated “Emmet”, a novel approach that enables users to write HTML/CSS code via a shorthand notation.
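To illustrate the kind of shorthand Emmet supports (this is a standard Emmet abbreviation shown for illustration, not an example drawn from the Voiceye studies), typing the abbreviation `ul>li.item*3` and expanding it produces:

```html
<!-- Emmet abbreviation: ul>li.item*3 -->
<ul>
    <li class="item"></li>
    <li class="item"></li>
    <li class="item"></li>
</ul>
```

Because a short abbreviation expands into many characters of markup, this style of input can substantially reduce the number of keystrokes needed on an on-screen keyboard.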

To evaluate “Voiceye”, we have worked closely with non-disabled and disabled developers in a series of user studies to investigate the potential of this type of approach.

Research outcomes

Our research has resulted in the development of the Voiceye application which is freely available for download. We are actively continuing our work in investigating and extending the capabilities of the application and will be releasing future updates.
