Hi 👋, I’m Luis, a second-year PhD student at RWTH Aachen University under the supervision of Christopher Morris.

I’m interested in studying the capabilities and limitations of general-purpose machine learning architectures in the context of graph learning. My current research focus is on deriving a principled understanding of graph transformers and their potential benefits over GNNs. Email: luis.mueller@cs.rwth-aachen.de.

Twitter GitHub LinkedIn

Preprints

Towards Principled Graph Transformers Luis Müller, Daniel Kusuma, Christopher Morris
We show that the Edge Transformer, a model originally proposed for improved systematic generalization over standard transformers, has provable 3-WL expressivity. We then demonstrate, through a range of experiments on expressivity, molecular prediction, and neural algorithmic reasoning benchmarks, that the Edge Transformer matches or exceeds state-of-the-art graph learning models in predictive performance.
Preprint Code

Publications

Attending to Graph Transformers Luis Müller, Michael Galkin, Christopher Morris, Ladislav Rampasek
We propose a taxonomy of graph transformers, provide an overview of their theoretical properties, and investigate experimentally how well graph transformers can recover graph structure and mitigate issues with over-smoothing and over-squashing. Accepted at TMLR.
arXiv Talk

Representation Change in Model-Agnostic Meta-Learning Thomas Goerttler, Luis Müller
Blog post, published at ICLR 2022 Blog Track.
Website

An Interactive Introduction to Model-Agnostic Meta-Learning Luis Müller, Max Ploner, Thomas Goerttler, Klaus Obermayer
Interactive explainer, published at VISxAI 2021.
Website Code

Reachable Assignments on Cycles Luis Müller, Matthias Bentert
Paper, published at ADT 2021 in Toulouse.
Preprint Conference Publication