AR4AV: Towards Accessible Pedestrian-Autonomous Vehicle Interaction for Sensory-Disabled People Using Augmented Reality

Doctoral Training Grant Funding Information 

This funding model includes a 36-month fully funded PhD Studentship, set in line with UK Research & Innovation (UKRI) rates. For 2025/26, this will be £20,780 per year. The tax-free stipend will be paid monthly. This PhD Studentship also includes a Full-Time Fee Scholarship for up to 3 years. The funding is subject to your continued registration on the research degree, satisfactory progression within your PhD, and attendance on and successful completion of the Postgraduate Certificate in Research Practice.

All applicants will receive the same stipend irrespective of fee status. 

Application Closing Date: 


Midday (UK Time) on Wednesday 17th September 2025 for a start date of 2nd February 2026.  

How to Apply 

To apply, please follow the steps below:

  1. Complete the BCU Online Application Form. 
  2. Complete the Doctoral Studentship Proposal Form in full, ensuring that you quote the project ID. You will be required to upload your proposal in place of a personal statement on the BCU online application form.  
  3. Upload two references to your online application form (at least one of which must be an academic reference). 
  4. Upload your qualification(s) for entry onto the research degree programme. These will be your Bachelor's/Master's certificate(s) and transcript(s). 
  5. International applicants must also provide a valid English language qualification. Please see the list of accepted English language qualifications, and check the individual research degree course page for the required scores. 

Frequently Asked Questions 

To help you complete your application, please consult the frequently asked questions below: 

Project title: AR4AV: Towards Accessible Pedestrian-Autonomous Vehicle Interaction for Sensory-Disabled People Using Augmented Reality 

Project Lead: Dr Wenge Xu

Project ID: 15 - 45391463 

Project description:

Road traffic injuries are a leading cause of death and disability worldwide. Road traffic crashes injure 50 million people and cause 1.2 million deaths each year, with pedestrians accounting for 21% (around 250,000) of these deaths. Road traffic crashes create enormous social costs for individuals, families, and communities, and burden economies and health systems.

Autonomous vehicles (AVs) hold promise to play a key role in the urban environment, interacting with other road users safely and smoothly, preventing crashes, limiting injury, and ultimately reducing the number of deaths [1]. However, the absence of a human driver poses significant risks for pedestrians, as it fundamentally changes the existing road communication system from pedestrian-human driver communication to pedestrian-AV communication [2]. Despite growing research on using external human-machine interfaces (eHMIs) to compensate for the lack of human driver cues, current mainstream on-vehicle eHMIs cannot address the needs of all sensory-disabled people, owing to the potential risk of sensory overload and their inability to deliver haptic feedback [3]. Inaccessible pedestrian-AV communication causes confusion and hesitation in decision-making, putting sensory-disabled people at risk of dangerous situations, accidents, and even fatalities. 

With recent advancements in hardware and software, augmented reality (AR) has received significant attention as an assistive technology that can unlock the potential of disabled people [4]. This postgraduate research (PGR) project aims to develop wearable AR solutions that remove accessibility barriers for sensory-disabled people in pedestrian-AV communication. The project addresses gaps whose closure could benefit the UK's 14 million sensory-disabled people [GOV.UK], helping to keep them out of dangerous situations in future road traffic involving AVs. Furthermore, this work aims to improve the accessibility of future transport, which is a legal obligation in countries such as the UK (Equality Act 2010) and within the EU (European Accessibility Act). Ultimately, this project benefits assistive technology, researchers, the automotive industry, sensory-disabled people, and standards and regulatory authorities, contributing to an equal and inclusive society. 

Following our recent work funded by the Royal Society on Inclusive External Human-Machine Interface for Automated Vehicles - d/Deaf and Hard of Hearing (DHH) Pedestrian, this project will employ participatory design methods to develop wearable AR solutions that assist communication between sensory-disabled people and AVs. Specifically, this project will: 

  1. Co-design wearable AR solutions for pedestrian-AV communication with sensory-disabled people and other relevant stakeholders. 
  2. Validate, with sensory-disabled people, the impact of the wearable AR solutions on crossing behaviour and experience through virtual reality-based simulation studies. 
  3. Produce (1) design guidelines for AR-based pedestrian-AV communication solutions for sensory-disabled people and (2) a development plugin that can be adapted for use on different AR devices. 

Anticipated findings and contributions to knowledge:

Anticipated outcomes: a dataset [Stages 1-3], research papers [Stages 1-3], and prototypes, a plugin, and design guidelines [Stage 3] 

Stage 1: Review of the Literature – A review of wearable AR solutions for disabled people and current on-vehicle eHMIs will be carried out to (1) identify, evaluate, and summarise the existing solutions, (2) determine what traffic scenarios have been considered, and (3) provide a dispassionate synthesis of the best available resources. 

Stage 2: User Requirements – Observation studies will be undertaken to understand how sensory-disabled people interact with existing wearable interfaces designed to support pedestrian-AV communication. The resulting dataset will be vital in understanding sensory-disabled people's needs and requirements for AR solutions that support pedestrian-AV communication. 

Stage 3: Solution Development and Design Guidelines Production – We will use the findings obtained from Stage 1 and the user requirements identified in Stage 2 to design and develop AR solutions that allow sophisticated interaction between sensory-disabled people and AVs. The primary outputs will be (1) findings on novel AR solutions that enable accessible pedestrian-AV communication for sensory-disabled people, (2) AR-based prototypes for pedestrian-AV communication, (3) an adaptable plugin that different AR devices can use, and (4) a set of design guidelines for accessible pedestrian-AV communication. 

Additional Information:

Reference list for project description: 

[1] HM Government, "Connected & Automated Mobility 2025: Realising the benefits of self-driving vehicles in the UK". [Online]. Available: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1099173/cam-2025-realising-benefits-self-driving-vehicles.pdf 

[2] J. Uttley et al., "Road user interactions in a shared space setting: Priority and communication in a UK car park", Transportation Research Part F: Traffic Psychology and Behaviour, vol. 72, pp. 32-46, 2020. doi: https://doi.org/10.1016/j.trf.2020.05.004 

[3] Md Enam et al., "Are the External Human-Machine Interfaces (eHMI) Accessible for People with Disabilities? A Systematic Review", 2024 IEEE 4th International Conference on Human-Machine Systems (ICHMS), Toronto, ON, Canada, 2024, pp. 1-6. doi: https://doi.org/10.1109/ICHMS59971.2024.10555703 

[4] C. Creed, M. Al-Kalbani, A. Theil, S. Sarcar, and I. Williams, "Inclusive Augmented and Virtual Reality: A Research Agenda", International Journal of Human–Computer Interaction, 2023. doi: https://doi.org/10.1080/10447318.2023.2247614 

Person Specification:

Essential:  

  • A first degree in Computer Science, Human-Computer Interaction, or Engineering 
  • Strong object-oriented programming skills 

Desirable: 

  • Prior knowledge in Human-Computer Interaction  
  • Experience with Unity 

Overseas applicants:

International applicants must also provide a valid English language qualification, such as the International English Language Testing System (IELTS) or equivalent, with an overall score of 6.5 and no band below 6.0. 

Contact:

If you have any questions or need further information, please use the contact details below: 

- For enquiries about the funding or project proposal, please contact: wenge.xu@bcu.ac.uk

- For enquiries about the application process, please contact: research.admissions@bcu.ac.uk