Deep Learning–Based Multi-Sensor Change Detection for Monitoring Land-Cover Dynamics
Doctoral Training Grant Funding Information
This funding model includes a 36-month, fully funded PhD Studentship, with the stipend set in line with UK Research & Innovation (UKRI) rates; for 2025/26, this will be £20,780 per year. The tax-free stipend will be paid monthly. This PhD Studentship also includes a Full-Time Fee Scholarship for up to three years. The funding is subject to your continued registration on the research degree, satisfactory progression within your PhD, and attendance on and successful completion of the Postgraduate Certificate in Research Practice.
All applicants will receive the same stipend irrespective of fee status.
Application Closing Date:
Midday (UK Time) on Wednesday 17th September 2025 for a start date of 2nd February 2026.
How to Apply
To apply, please follow the steps below:
- Complete the BCU Online Application Form.
- Complete the Doctoral Studentship Proposal Form in full, ensuring that you quote the project ID. You will be required to upload your proposal in place of a personal statement on the BCU online application form.
- Upload two references to your online application form (at least one of which must be an academic reference).
- Upload your qualification(s) for entry onto the research degree programme: your Bachelor's/Master's certificate(s) and transcript(s).
- International applicants must also provide a valid English language qualification. Please see the university's list of accepted English language qualifications, and check the individual research degree course page for the required scores.
Frequently Asked Questions
To help you complete your application, please consult the frequently asked questions below:
Project title: Deep Learning–Based Multi-Sensor Change Detection for Monitoring Land-Cover Dynamics
Project Lead: Dr. Mohamed Ihmeida
Project ID: 03 - 45819641
Project description:
Recent disasters, from the Santa Ana wind–driven Palisades and Eaton wildfires in Los Angeles County (January 2025) to the magnitude-6.8 earthquake in Morocco's High Atlas region (September 2023), have underlined the urgent need for rapid, accurate Earth change mapping to save lives and limit property damage. Current emergency response relies on "before and after" maps derived largely from single-sensor optical imagery, which can be obscured by cloud cover, seasonal changes and varying illumination. Radar approaches, while weather-independent, lack the spectral detail required for precise classification. This project will overcome these limitations by developing a novel, multi-sensor deep learning framework that fuses optical (Sentinel-2), synthetic aperture radar (Sentinel-1) and LiDAR-derived elevation data to deliver near-real-time change detection across heterogeneous landscapes.
Objectives
- Data Integration and Preprocessing
  Assemble and harmonise time-series stacks from Sentinel-2 optical scenes, Sentinel-1 backscatter maps and LiDAR-derived digital elevation models across diverse environments (tropical rainforests, suburban fringes, agricultural regions). Preprocessing steps will include radiometric calibration, spatial resampling and co-registration to ensure seamless fusion of multimodal data (a co-registration sketch follows this list).
- Model Development
  Design a bespoke deep neural network incorporating:
  - Convolutional layers for spatial feature extraction
  - Recurrent modules to track temporal dynamics
  - Attention mechanisms that dynamically weight each sensor input, prioritising radar data under cloudy or low-light conditions and optical bands when visibility allows
  This architecture will deliver robust, context-aware change detection in rapidly evolving environments (an architecture sketch follows this list).
- Uncertainty Quantification
  Integrate Bayesian deep learning techniques or model ensembles to produce calibrated confidence scores for every detected change. Providing not only "what changed" but "how certain we are" will empower emergency managers and policymakers to direct resources more effectively (an uncertainty sketch follows this list).
- Benchmarking and Validation
- Software Delivery and Dissemination
  Package the framework into a user-friendly software suite with comprehensive documentation, step-by-step tutorials and exemplar workflows. This delivery will facilitate rapid adoption by environmental agencies, humanitarian organisations and the research community, ensuring easy adaptation to new monitoring challenges.
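To make the co-registration step concrete, the sketch below warps a Sentinel-1 backscatter raster onto the pixel grid of a Sentinel-2 scene with rasterio, one of the geospatial tools named in the person specification. The file names and the choice of bilinear resampling are illustrative assumptions, not project specifications.

```python
# Minimal co-registration sketch: warp a Sentinel-1 backscatter GeoTIFF onto
# the grid of a Sentinel-2 scene so the two rasters align pixel-for-pixel.
# File names are placeholders; bilinear resampling is one reasonable choice
# for continuous backscatter values (nearest would suit categorical data).
import numpy as np
import rasterio
from rasterio.warp import reproject, Resampling

with rasterio.open("sentinel2_scene.tif") as ref, \
     rasterio.open("sentinel1_backscatter.tif") as src:
    aligned = np.zeros((ref.height, ref.width), dtype=np.float32)
    reproject(
        source=rasterio.band(src, 1),
        destination=aligned,
        src_transform=src.transform,
        src_crs=src.crs,
        dst_transform=ref.transform,
        dst_crs=ref.crs,
        resampling=Resampling.bilinear,
    )
    # 'aligned' now shares the Sentinel-2 grid and can be stacked with its bands.
```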
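The next sketch illustrates, in PyTorch, how the three architectural ingredients listed under Model Development could fit together: per-sensor convolutional encoders, a softmax attention weighting over sensors, and a recurrent module over the fused time series. The class name, channel counts (four optical bands, two SAR polarisations, one elevation layer) and layer sizes are illustrative assumptions, and the single-layer blocks stand in for the deeper encoders a real model would use.

```python
import torch
import torch.nn as nn

class FusionChangeNet(nn.Module):
    """Illustrative CNN + GRU + sensor-attention sketch, not the final design."""

    def __init__(self, sensor_channels=(4, 2, 1), feat=32, hidden=64):
        super().__init__()
        # One small convolutional encoder per sensor (optical, SAR, elevation).
        self.encoders = nn.ModuleList(
            nn.Sequential(nn.Conv2d(c, feat, 3, padding=1), nn.ReLU())
            for c in sensor_channels
        )
        # Scores each sensor's pooled features; softmax over sensors lets the
        # model lean on SAR when optical frames are cloudy, and vice versa.
        self.attn = nn.Linear(feat, 1)
        self.gru = nn.GRU(feat, hidden, batch_first=True)  # temporal dynamics
        self.head = nn.Linear(hidden, 1)                   # per-pixel change logit

    def forward(self, inputs):
        # inputs: list of (batch, time, channels, H, W) tensors, one per sensor.
        B, T, _, H, W = inputs[0].shape
        fused_steps = []
        for t in range(T):
            feats = [enc(x[:, t]) for enc, x in zip(self.encoders, inputs)]
            pooled = torch.stack([f.mean(dim=(2, 3)) for f in feats], dim=1)
            weights = torch.softmax(self.attn(pooled), dim=1)   # (B, sensors, 1)
            stacked = torch.stack(feats, dim=1)                 # (B, sensors, F, H, W)
            fused_steps.append((weights[..., None, None] * stacked).sum(dim=1))
        seq = torch.stack(fused_steps, dim=1)                   # (B, T, F, H, W)
        # Run the GRU independently per pixel by folding space into the batch.
        seq = seq.permute(0, 3, 4, 1, 2).reshape(B * H * W, T, -1)
        _, h = self.gru(seq)
        return self.head(h[-1]).reshape(B, 1, H, W)  # per-pixel change logits
```

Calling the model on a list of tensors shaped (batch, time, channels, height, width), e.g. FusionChangeNet()([s2, s1, dem]), yields one change logit per pixel. Folding space into the batch keeps the recurrent step simple, at the cost of the cross-pixel temporal context that a convolutional recurrent unit would preserve.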
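For the uncertainty-quantification objective, Monte Carlo dropout is one standard approximation to the Bayesian techniques the description alludes to: keep dropout active at inference and treat the spread of repeated stochastic forward passes as a per-pixel confidence signal. The helper below is a hypothetical sketch that assumes the network contains dropout layers; a deep ensemble would follow the same averaging pattern across several independently trained models.

```python
import torch

def mc_dropout_predict(model, inputs, n_samples=20):
    """Hypothetical helper: mean change probability and its spread per pixel."""
    model.train()  # keeps dropout stochastic; freeze BatchNorm separately if present
    with torch.no_grad():
        samples = torch.stack(
            [torch.sigmoid(model(inputs)) for _ in range(n_samples)]
        )
    # A high standard deviation flags pixels where the detected change is uncertain.
    return samples.mean(dim=0), samples.std(dim=0)
```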
Impact and Innovation:
By pioneering a fully automated, uncertainty-aware, multi-sensor approach, this project will significantly advance the state of the art in Earth change mapping. Emergency responders will gain access to more reliable, timely "before and after" maps, enhancing evacuation planning, resource allocation and damage assessment. Beyond disaster response, the framework's adaptability to diverse landscapes will support ongoing environmental monitoring, informing policy decisions on deforestation, urban growth and climate-change resilience. Ultimately, this research will equip decision-makers with clearer, more trustworthy insights into how our world is evolving.
Anticipated findings and contributions to knowledge:
We anticipate three principal contributions:
Anticipated Findings:
• Uncertainty‑aware deep‑learning fusion pipeline for multi‑sensor change detection, combining SAR despeckling, LiDAR registration and attention mechanisms to deliver more reliable change maps.
• Open‑source data libraries and codebase, including pre‑processed multi‑temporal SAR–LiDAR benchmark datasets, model implementations and evaluation scripts.
• Comprehensive tutorials and example workflows that demonstrate end‑to‑end training, inference and uncertainty quantification on real‑world satellite data.
Contribution to New Knowledge:
1. Technological Innovation & Funding Impact:
Delivering a robust fusion pipeline will underpin a future NERC Standard Grant or UKRI Future Leaders Fellowship, extending the methods to disaster‑response scenarios and enabling real‑time mapping partnerships with emergency agencies.
2. Educational & Community Resource:
The codebase, tutorials and workflows will form the heart of an MSc‑level teaching module in remote sensing and machine learning. They will also feed directly into BSc and MSc dissertations, enabling students to apply cutting‑edge fusion methods without building pipelines from scratch.
3. Curriculum Integration:
We will supply ready‑made case studies and lab exercises for:
• BSc Computer Science with AI (Computer Vision topics)
• BSc Computer Science
• MSc Artificial Intelligence
• MSc Computer Science
• MSc Big Data Analytics
These materials will slot into existing modules, enriching project options and practical assessments.
4. Scholarly Output & Reproducibility:
We will publish our algorithm and benchmarking results in a leading journal (e.g. Remote Sensing of Environment or IEEE T‑GRS) and host the full implementation on GitHub under an open licence, ensuring transparent evaluation, community uptake and accelerated follow‑on research.
Person Specification:
Applicants should hold a merit-level master's degree (or international equivalent) in Computer Science, Artificial Intelligence/Machine Learning, Remote Sensing, Mathematics, Statistics, Engineering or a closely related discipline; candidates without a master's may exceptionally be considered if they hold a first-class undergraduate degree and can demonstrate readiness for doctoral-level research. Applicants must have demonstrable experience of, or a keen interest in, deep learning and generative models applied to multi-sensor Earth change detection: fusing optical (e.g. Sentinel-2), SAR (e.g. Sentinel-1) and LiDAR data to map wildfires, earthquakes, deforestation and urban growth. Proven proficiency in Python (with PyTorch or TensorFlow) and scientific libraries (NumPy, pandas, SciPy) is essential.
Hands-on familiarity with geospatial processing tools (GDAL, Rasterio, QGIS/ArcGIS) and with techniques for radiometric calibration, co-registration and time-series stacking is desirable.
Overseas applicants:
International applicants must also provide a valid English language qualification, such as the International English Language Testing System (IELTS) or equivalent, with an overall score of 6.5 and no band below 6.0.
Contact:
If you have any questions or need further information, please use the contact details below:
- For enquiries about the funding or project proposal, please contact: mohamed.ihmeida@bcu.ac.uk
- For enquiries about the application process, please contact: research.admissions@bcu.ac.uk