About
I am a fourth-year DPhil student in the Machine Learning Research Group at the University of Oxford, and a member of the Autonomous Intelligent Machines and Systems Centre for Doctoral Training (AIMS CDT). I am supervised by Xiaowen Dong and Michael Bronstein.
My research interests include:
- Graph neural networks (message passing NNs and graph Transformers), particularly long-range interactions and the over-smoothing and over-squashing problems
- LLMs, particularly in-context learning and long-context reasoning
- AI for document processing
- Deep learning for biology
- Inductive biases in deep learning
I previously worked on the HumBug project, which uses deep learning to detect and identify mosquito species from their flight tones, recorded on budget smartphones. I completed my MEng in Engineering Science, specialising in information engineering, at the University of Oxford, and was supervised by Michael Osborne for my fourth-year project.
In my spare time I enjoy reading, playing D&D, and exploring new pubs, especially those with real ale. I maintain a table of ratings and useful information about every pub in Oxford here.
News
17/12/2024: “Judge a Book by Its Cover: Investigating Multi-Modal LLMs for Multi-Page Handwritten Document Transcription” accepted to the AAAI-25 Workshop on Document Understanding and Intelligence (DocUI@AAAI-25) in Philadelphia.
10/06/2024: Started internship in the Deep Learning team at QuantCo, working on DocAI and LLM long-context reasoning.
03/03/2024: Started Visiting Data Scientist internship at BCG X, working on the ‘Pathfinder’ flight schedule optimiser at British Airways.
06/08/2023: DRew covered in round-up blog post by @michael_galkin: Graph Machine Learning @ ICML 2023.
28/07/2023: DRew discussed in keynote talk by @mmbronstein at TAG-ML workshop at ICML 2023.
19/06/2023: Presented DRew to Learning on Graphs and Geometry (LoGG) reading group (recording here). Blog post based on DRew published on Towards Data Science.
23/04/2023: First-author paper, “DRew: Dynamically Rewired Message Passing with Delay”, accepted to ICML 2023.