Inclusive Coding for Disabled Developers

This funded project seeks to make coding more accessible for developers with physical impairments.

People with physical impairments who may experience challenges in using traditional input devices (i.e., mouse, keyboard, and touch) are often excluded from technical professions (e.g., software engineering). Alternative input methods such as eye gaze tracking and speech recognition have become more readily available in recent years, although there has been little work exploring the potential of these technologies to make coding more accessible.

To address this gap, we are exploring the potential of combining multiple alternative methods of input (i.e., gaze interaction, speech recognition, and large mechanical switches) to make coding more accessible for developers with physical impairments. This work has resulted in a new development platform (Voiceye) that facilitates multimodal input as an approach for writing HTML, CSS, and JavaScript code (further details are available in the following paper: Voiceye: A Multimodal Inclusive Development Environment).
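As a rough illustration of this multimodal command-to-code idea (a minimal sketch only: the command names and snippet templates below are illustrative assumptions and do not reflect Voiceye's actual grammar or API), a recognised spoken phrase can be resolved to a code snippet, with a second modality such as a switch press or a gaze dwell confirming the insertion:

    // Minimal sketch (assumed command grammar, not Voiceye's): map a recognised
    // spoken phrase to a code snippet and a caret position for the editor.

    type Snippet = { code: string; caretOffset: number };

    // "|" marks where the caret should land after insertion.
    const commands: Record<string, string> = {
      "new function": "function |() {\n}\n",
      "new paragraph": "<p>|</p>",
      "style rule": "| {\n}\n",
    };

    function resolveCommand(phrase: string): Snippet | null {
      const template = commands[phrase.trim().toLowerCase()];
      if (!template) return null;
      return { code: template.replace("|", ""), caretOffset: template.indexOf("|") };
    }

    // A speech recogniser supplies the phrase; a large mechanical switch
    // (or a gaze dwell) acts as the confirmation step before insertion.
    const result = resolveCommand("New Function");
    if (result) {
      console.log(result.code, "caret at", result.caretOffset);
    }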

We have also been investigating the working practices of voice coders with physical impairments, as well as the strengths and limitations of different multimodal speech coding approaches. Our paper on voice coding experiences for developers with physical impairments provides additional detail on the work we have undertaken in this area.

Our longer-term aim is to create a customisable development platform that supports people with a range of impairments (using different input modalities) to write and manage code efficiently and effectively to a professional standard. We are also particularly interested in further examining how intelligent coding assistants can support disabled developers and enhance their coding experiences.

Project Team

Funders

This work has been funded through Microsoft AI for Accessibility and Google Inclusive Research Program awards. 

Contact

For more information on the project, contact Professor Chris Creed (chris.creed@bcu.ac.uk).