Research Opportunities
Research Fellow in Artificial Intelligence for Imaging
We have a vacancy for a Research Fellow to join the Department of Space and Climate Physics (Mullard Space Science Laboratory), University College London (UCL), and to collaborate further with the STFC Scientific Machine Learning research group. The Department is inviting applications from excellent candidates for a Research Fellowship position in an exciting new multi-disciplinary project on Learned Exascale Computational Imaging (LEXCI).
The LEXCI project will develop a new paradigm of exascale computational imaging, integrating hybrid deep learning and model-based approaches with uncertainty quantification, at large scale and in distributed environments. The methodology developed will have widespread application in many fields. During the project it will be applied to help unlock the secrets of the Universe by imaging observations from the next generation of radio interferometric telescopes, and to image the neuronal pathways in the human brain through diffusion MRI.
LEXCI is driven by a multi-disciplinary team of experts in machine learning, statistics, applied mathematics, physics, high-performance computing, and research software engineering, led by Prof. Jason McEwen (UCL MSSL), Assoc. Prof. Marta Betcke (UCL CS), Rev. Dr Jeremy Yates (UCL CS), and Assoc. Prof. Marcelo Pereyra (Heriot-Watt University). In addition, the successful candidate will collaborate closely with the AI for Science initiatives and Scientific Machine Learning Research Group within the Rutherford Appleton Laboratory (RAL), Science and Technology Facilities Council, led by Dr Jeyan Thiyagalingam, and will have the opportunity for extended visits to RAL to cement and grow collaborations between different communities.
This post is available to start as soon as possible and is funded for 24 months in the first instance.
Further details and the application link (submission deadline 13 March 2023) are available here.
I very much encourage young researchers to apply for postdoctoral fellowships. I am happy to support and assist strong candidates who would like to apply for fellowships with MSSL as the host institution.
If you are interested in discussing this further then please email me, including '[Fellowship enquiry]' in the subject of your email. I receive many enquiries and so will only reply if your expertise is well matched to my research interests and there is a high chance of submitting a successful application.
More information on various fellowships can be found here:
- Royal Society (RS) University Research Fellowship (URF)
- Royal Society (RS) Newton International Fellowships (for researchers coming from abroad)
- Royal Society (RS) Dorothy Hodgkin Fellowships
- STFC Ernest Rutherford Fellowships (ERF)
- Royal Astronomical Society (RAS) Fellowships
- Leverhulme Trust Early Career Fellowships (ECF)
- Royal Commission for the Exhibition of 1851 Fellowships
- Marie Curie Fellowships (for researchers coming from Europe)
- Daphne Jackson Fellowships (for researchers returning from a career break)
The PhD projects that I offer are typically multi-disciplinary and include a combination of cosmology, statistics, and informatics (e.g. machine learning, signal processing, harmonic analysis, etc.). A relatively strong mathematical background is usually required for these types of projects. Strong programming skills are also an advantage.
If you are interested in discussing PhD projects further then please email me, including '[PhD enquiry]' in the subject of your email, and attach a CV. I receive many enquiries and so will only reply if your expertise is well matched to my research interests and there is a high chance of submitting a successful application.
Further information on how to submit an official application to MSSL can be found here.
Further information on how to submit an official application via the UCL CDT in Data Intensive Science can be found here.
Brief overviews of the current projects on offer are given below.
PhD project: Probabilistic deep learning for cosmology and beyond
The current evolution of our Universe is dominated by the influence of dark energy and dark matter, which constitute 95% of its content. However, an understanding of the fundamental physics underlying the dark Universe remains critically lacking. Forthcoming experiments have the potential to revolutionise our understanding of the dark Universe. Both the ESA Euclid satellite and the Rubin Observatory Legacy Survey of Space and Time (LSST) will come online imminently: Euclid is scheduled for launch in 2023 and the Rubin Observatory has recently achieved first light. Furthermore, the Simons Observatory is in an advanced stage of construction. Sensitive statistical and deep learning techniques are required to extract cosmological information from the weak observational signatures of dark energy and dark matter.
The classical approach in deep learning is to make a single prediction: a single estimate of a quantity of interest, such as an image. For robust scientific studies, however, single estimates are not sufficient, and a principled statistical assessment is critical in order to quantify uncertainties. Bayesian inference provides a principled statistical framework in which to perform scientific analyses; in cosmology, in particular, it is the bedrock of most cosmological analyses. While such approaches provide a complete statistical interpretation of observations, which is critical for robust and principled scientific studies, they are typically computationally slow, in many cases prohibitively so. Furthermore, in such analyses prior information typically cannot be injected in a deep, data-driven manner.
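The contrast between a single point estimate and a full Bayesian treatment can be seen in a minimal toy example (entirely illustrative, not part of the project itself): inferring the mean of noisy Gaussian measurements, where conjugacy gives the posterior in closed form and a credible interval quantifies the uncertainty that the point estimate alone does not convey.

```python
import math
import random

random.seed(0)

# Toy data: noisy measurements of an unknown quantity mu_true,
# with known noise standard deviation sigma.
mu_true, sigma = 2.0, 1.0
data = [random.gauss(mu_true, sigma) for _ in range(20)]

# Point estimate: the maximum-likelihood estimate is just the sample mean.
mle = sum(data) / len(data)

# Bayesian inference: with a Gaussian prior N(mu0, tau0^2) on the mean,
# the posterior is also Gaussian (conjugacy), with closed-form parameters.
mu0, tau0 = 0.0, 10.0
n = len(data)
post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
post_mean = post_var * (mu0 / tau0**2 + sum(data) / sigma**2)

# A 95% credible interval quantifies the uncertainty a point estimate lacks.
half_width = 1.96 * math.sqrt(post_var)
interval = (post_mean - half_width, post_mean + half_width)
```

With a weak prior the posterior mean is close to the maximum-likelihood estimate, but unlike the point estimate it comes with a calibrated measure of uncertainty; the project's hybrid deep learning approaches aim to retain this statistical interpretation at far greater scale.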
In the proposed project we will develop probabilistic deep learning approaches, in which probabilistic components are integral parts of deep learning models, and, conversely, statistical analysis techniques in which deep learning components are integral. This deep hybrid approach, where statistical and deep learning components are tightly coupled rather than bolted on as add-ons, will allow us to realise the complementary strengths of the two approaches simultaneously.
Specifically, we will develop novel probabilistic deep learning models, variational inference techniques, and simulation-based inference approaches. These new methodologies will be applied to a variety of cosmological problems and probes, focusing on the cosmic microwave background and weak gravitational lensing. Applications will include generative models for emulation, and inference approaches for estimating not only the parameters of cosmological models but also which models and physical theories best describe our Universe.
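The core idea behind simulation-based inference is that a forward simulator replaces an explicit likelihood: parameters drawn from the prior are kept only when their simulated data resemble the observation. A minimal sketch of this idea, via classic rejection sampling on a hypothetical toy problem (modern approaches use neural density estimators, but the logic is the same):

```python
import random

random.seed(1)

def simulate(theta, n=50):
    """Forward simulator: draw n Gaussian samples with mean theta."""
    return [random.gauss(theta, 1.0) for _ in range(n)]

# 'Observed' data generated at an unknown true parameter value.
theta_true = 1.5
observed = simulate(theta_true)
obs_mean = sum(observed) / len(observed)

# Rejection sampling: draw theta from a uniform prior, simulate data,
# and keep theta only if the simulation resembles the observation
# (compared here through the sample mean as a summary statistic).
posterior_samples = []
for _ in range(20000):
    theta = random.uniform(-5.0, 5.0)   # prior draw
    sim_mean = sum(simulate(theta)) / 50
    if abs(sim_mean - obs_mean) < 0.1:  # accept if summaries match
        posterior_samples.append(theta)

post_mean = sum(posterior_samples) / len(posterior_samples)
```

The accepted samples approximate the posterior without ever evaluating a likelihood; the project will develop deep learning methods that make this inference far more sample-efficient than naive rejection.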
The student should have a strong mathematical background and be proficient in coding, particularly in Python. The student will gain extensive expertise during the project in deep learning, going far beyond the straightforward application of existing deep learning techniques, instead focusing on the construction of novel probabilistic deep learning approaches and their application to new problems in cosmology and beyond. The expertise gained in foundational deep learning will prepare the student well for a future career in either academia or industry. In particular, the emerging field of probabilistic deep learning is a speciality highly sought after in industry by many companies, such as Google/DeepMind, Facebook, Amazon and many others.
PhD project: Global weather forecasting with geometric deep learning
Numerical weather prediction has historically focussed on the simulation of atmospheric physics across the Earth. Classical numerical weather forecasting methods are physically motivated and highly interpretable, but they are prohibitively computationally expensive and can induce parameterisation biases. These biases can often be severe, particularly in the forecasting of extreme precipitation events, which can lead to flash flooding. Recently, deep learning techniques have emerged as an alternative approach that is far more efficient computationally, avoids parameterisation biases, and can model non-linear dynamics in a data-driven manner. Importantly, deep learning approaches also facilitate the generation of prediction ensembles, from which one may consider probabilistic forecasting and the construction of digital twins. However, existing deep learning approaches to global weather prediction are based on standard planar deep learning techniques and do not account for the spherical geometry of the Earth.
Deep learning has been remarkably successful in the interpretation of standard (Euclidean) data, such as 1D time series data, 2D image data, and 3D video or volumetric data, now exceeding human accuracy in many cases. However, standard deep learning techniques fail catastrophically when applied to data defined on other domains, such as data defined over networks, 3D objects, or other manifolds such as the sphere. This has given rise to the field of geometric deep learning (Bronstein et al. 2017; Bronstein et al. 2021).
Geometric deep learning techniques constructed natively on the sphere are essential for next-generation global weather prediction models. McEwen and collaborators have recently developed efficient generalised spherical convolutional neural networks (Cobb et al. 2021) and spherical scattering networks (McEwen et al. 2022) that have shown exceptional performance. In their latest work they have developed the DISCO framework, which is for the first time scalable to high-resolution data (Ocampo et al. 2022), opening up dense prediction tasks such as weather prediction. Their DISCO framework provides a saving in computation requirements of 9 orders of magnitude and a saving in memory requirements of 4 orders of magnitude. Moreover, it provides state-of-the-art accuracy in all benchmark problems considered to date.
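The key property these spherical networks provide is equivariance: a convolution defined natively on the geometry commutes with rotations of the input. As a toy analogue (not the DISCO framework itself), the same property can be demonstrated in one dimension on the circle, where "rotation" is a cyclic shift and convolution respects the periodic geometry:

```python
# Toy illustration of group-equivariant convolution on the circle, the
# one-dimensional analogue of convolution on the sphere. A convolution
# defined natively on the geometry commutes with rotations: rotating the
# input then convolving equals convolving then rotating the output.

def circular_conv(signal, kernel):
    """Convolve a periodic signal with a kernel, respecting the geometry."""
    n = len(signal)
    return [
        sum(signal[(i - j) % n] * kernel[j] for j in range(len(kernel)))
        for i in range(n)
    ]

def rotate(signal, shift):
    """Rotate (cyclically shift) a signal on the discretised circle."""
    n = len(signal)
    return [signal[(i - shift) % n] for i in range(n)]

signal = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 0.0]
kernel = [0.25, 0.5, 0.25]

# Equivariance: conv(rotate(f)) == rotate(conv(f)).
lhs = circular_conv(rotate(signal, 3), kernel)
rhs = rotate(circular_conv(signal, 3 * 0 + kernel.__len__() * 0 + 0), 3) if False else rotate(circular_conv(signal, kernel), 3)
```

A standard planar convolution applied to a projected signal breaks this property at the projection boundaries, which is precisely why convolutions constructed natively on the sphere are needed for geographically unbiased forecasts.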
In this project we will develop deep learning networks that can forecast weather systems natively over the spherical globe, without the need for projections, leveraging the very recent developments discussed above for the construction of scalable geometric deep learning approaches on the sphere. Such networks will be geographically unbiased, scalable to sub-kilometre resolution, and robust, with the potential to dramatically improve weather predictions. Due to the changing climate, extreme weather events are becoming increasingly common. While we must address the root causes of climate change, it is also essential to better predict extreme weather events in order to reduce their harmful impact on the planet and humanity. Often it is societies in the developing world, those least responsible for climate change and least able to deal with its effects, that are most impacted. Given the importance of weather prediction, these next-generation geometric deep learning techniques will have significant societal and scientific impact in years to come.
See the poster below for further details.
PhD project: Neural Bayesian model selection
While deep learning has undergone remarkable progress in recent years, it typically lacks a principled statistical basis. Probabilistic deep learning has emerged recently but the field remains nascent. And while the outputs of probabilistic deep learning approaches may exhibit the mathematical properties of probabilities, they usually do not satisfy the rigorous statistical interpretation of either frequentist or Bayesian statistical frameworks. Moreover, in science the most pertinent questions are often those of model selection, e.g. what is the nature of dark energy and dark matter, what is the origin of structure in our Universe, what is the best physical model to describe gravitational wave signals from black hole mergers? Model selection is typically neglected in probabilistic deep learning frameworks. In this project we will develop deep learning approaches to Bayesian model selection.
Bayesian model selection requires the computation of the marginal likelihood (also known as the model evidence), the average likelihood of a model over its prior probability space, which is an extremely challenging computational problem. We will develop deep learning techniques integrated into Markov chain Monte Carlo (MCMC) sampling approaches in order to: (i) accelerate the computation of the marginal likelihood to scale to high-dimensional settings; and (ii) incorporate learned data-driven priors. To achieve these goals we will develop techniques leveraging normalizing flows and generative diffusion models. Furthermore, we will develop techniques to compute the marginal likelihood in variational inference frameworks in order to provide dramatically accelerated model selection. While the focus of the project is on methodological developments, we will showcase the application of the techniques developed on cosmological problems to address the science questions posed above.
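The quantity at the heart of the project can be made concrete with a toy conjugate model (purely illustrative, far simpler than the high-dimensional settings targeted): for coin flips under a uniform prior, the marginal likelihood is the likelihood averaged over the prior, and a naive Monte Carlo average over prior draws can be checked against the analytic answer.

```python
import math
import random

random.seed(0)

# Toy model: n coin flips with h heads, Bernoulli likelihood p^h (1-p)^(n-h),
# and a uniform prior on p. The marginal likelihood (model evidence) is the
# likelihood averaged over the prior: Z = integral of p^h (1-p)^(n-h) dp.
h, n = 7, 10

def likelihood(p):
    return p**h * (1.0 - p)**(n - h)

# Naive Monte Carlo: average the likelihood over draws from the prior.
samples = 200000
z_mc = sum(likelihood(random.random()) for _ in range(samples)) / samples

# Analytic value for this conjugate toy model: the Beta function B(h+1, n-h+1).
z_exact = math.factorial(h) * math.factorial(n - h) / math.factorial(n + 1)
```

This naive estimator degrades catastrophically as dimension grows, since prior draws almost never land where the likelihood is large; that is precisely the failure mode the learned sampling and variational techniques in this project are designed to overcome.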
The student should have a strong mathematical background and be proficient in coding, particularly in Python. Expertise in either statistics or deep learning is an advantage (cosmological background not required).
I am not offering any additional Masters projects at present but when I am they will appear here.
I am not offering any specific internship projects at present. However, if you are interested in discussing internship possibilities further then please email me, including '[Internship enquiry]' in the subject of your email, and attach a CV. I receive many enquiries and so will only reply if your expertise is well matched to my research interests and there is a high chance of a placement.