About me

I am a PhD researcher in Machine Learning at Imperial College London, supervised by Mark van der Wilk, and a visiting student at the University of Oxford. Previously, I worked as a research scientist at Babylon Health and interned at Amazon Research.

Research

I am interested in how machine learning systems can learn useful inductive biases for predicting well under varying conditions, a property that is particularly valuable for scientific discovery. Causal models impose structure that allows for accurate prediction under distribution shifts, so learning causal models from data is an important problem. I am excited by the Bayesian approach to this problem: its inherent Occam's razor allows for flexible models and principled reasoning under uncertainty, both attractive properties for data-driven causal modelling. In parallel, I investigate how deep learning and scaling theory can amplify these ideas and make them work in complex, real-world settings.

A non-exhaustive list of topics I am interested in:

  • Causality: causal discovery, effect estimation, causal representation learning
  • Model selection: Bayesian model selection, MDL
  • Bayesian methods: Bayesian deep learning, Gaussian processes, probabilistic models, inference methods
  • Generative models: neural processes, diffusion models, flow matching
  • Deep learning: local learning rules, generalisation, scaling theory, NTK, Tensor Programs
  • Information theory: Kolmogorov complexity, information bottleneck

News

  • May 2025: “Continuous Bayesian Model Selection for Multivariate Causal Discovery” accepted at ICML 2025. We show that, in the multivariate setting, Bayesian model selection outperforms other methods on observational causal discovery tasks.
  • March 2025: Won the G-Research Early Career Researcher Grant.
  • Jan 2025: “A Meta-Learning Approach to Bayesian Causal Discovery” accepted at ICLR 2025. We propose a neural process that enables a foundation-model-style approach to causal discovery, trained entirely on synthetic data.
  • May - Nov 2024: Completed an internship at Amazon AGI Foundations working on scaling neural network architectures while preserving feature learning.