Hi 👋, I’m Luis, a second-year PhD student at RWTH Aachen University under the supervision of Christopher Morris.

I’m interested in studying the capabilities and limitations of general-purpose machine learning architectures in the context of graph learning. My current research focuses on deriving a principled understanding of graph transformers and their potential benefits over GNNs. Email: luis.mueller@cs.rwth-aachen.de.

Twitter GitHub LinkedIn

Preprints

Towards Principled Graph Transformers
Luis Müller, Daniel Kusuma, Christopher Morris
We show that the Edge Transformer, a model originally proposed for improved systematic generalization over standard transformers, has provable 3-WL expressivity. We then demonstrate, through experiments on expressivity, molecular prediction, and neural algorithmic reasoning benchmarks, that the Edge Transformer matches or outperforms state-of-the-art graph learning models in predictive performance.
Preprint Code

Publications

Attending to Graph Transformers
Luis Müller, Michael Galkin, Christopher Morris, Ladislav Rampasek
We propose a taxonomy of graph transformers, survey their theoretical properties, and experimentally investigate how well graph transformers recover graph structure and mitigate over-smoothing and over-squashing. Accepted at TMLR.
arXiv Talk

Towards Foundational Models for Molecular Learning on Large-Scale Multi-Task Datasets
Dominique Beaini, Shenyang Huang, Joao Alex Cunha, Zhiyi Li, Gabriela Moisescu-Pareja, Oleksandr Dymov, Samuel Maddrell-Mander, Callum McLean, Frederik Wenkel, Luis Müller, Jama Hussein Mohamud, Ali Parviz, Michael Craig, Michał Koziarski, Jiarui Lu, Zhaocheng Zhu, Cristian Gabellini, Kerstin Klaser, Josef Dean, Cas Wognum, Maciej Sypetkowski, Guillaume Rabusseau, Reihaneh Rabbany, Jian Tang, Christopher Morris, Ioannis Koutis, Mirco Ravanelli, Guy Wolf, Prudencio Tossou, Hadrien Mary, Therence Bois, Andrew Fitzgibbon, Błażej Banaszewski, Chad Martin, Dominic Masters
We release a large-scale dataset for supervised multi-level, multi-task molecular pre-training with more than 13 billion prediction targets, accompanied by Graphium, a new library for training and fine-tuning GNNs at scale on both GPUs and IPUs. Accepted at ICLR 2024.
arXiv

Representation Change in Model-Agnostic Meta-Learning
Thomas Goerttler, Luis Müller
Blog post, published in the ICLR 2022 Blog Track.
Website

An Interactive Introduction to Model-Agnostic Meta-Learning
Luis Müller, Max Ploner, Thomas Goerttler, Klaus Obermayer
Interactive explainer, published at VISxAI 2021.
Website Code

Reachable Assignments on Cycles
Luis Müller, Matthias Bentert
Paper, published at ADT 2021 in Toulouse.
arXiv Conference Publication