Aaron Klein


Machine learning is one of the most disruptive technologies of our time. However, a recurring criticism of machine learning is that, in practice, training a model is often more of an art than a science.

The goal of my research is to develop machine learning approaches that can configure themselves automatically by learning from past data. In the long term, I hope that my work will contribute to fulfilling the promise of artificial intelligence and lead to complete end-to-end learning systems.

More specifically, my research interests include:

  • Automated machine learning
  • Neural architecture search
  • Bayesian optimization
  • Resource-efficient large language models
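As a minimal illustration of the hyperparameter optimization setting (not tied to any particular library; `train_score` below is a hypothetical stand-in for a real training run), random search over a small configuration space can be sketched as:

```python
import random

def train_score(lr, batch_size):
    # Hypothetical stand-in for a training run that returns a validation
    # score; here the score peaks near lr=0.01 and batch_size=64.
    return -((lr - 0.01) ** 2) - ((batch_size - 64) / 256) ** 2

def random_search(n_trials=50, seed=0):
    # Draw configurations at random and keep the best-scoring one.
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {
            "lr": 10 ** rng.uniform(-4, -1),  # log-uniform learning rate
            "batch_size": rng.choice([16, 32, 64, 128, 256]),
        }
        score = train_score(**cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = random_search()
```

More sample-efficient methods such as Bayesian optimization replace the random sampling step with a model that proposes promising configurations based on past evaluations.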

short bio

I am a senior scientist at AWS, where I work on research topics related to automated machine learning and large language models. Alongside my collaborators, I lead the development of the open-source library SyneTune for large-scale hyperparameter optimization and neural architecture search.

Prior to that, I finished my PhD at the University of Freiburg under the supervision of Frank Hutter in 2019. My collaborators from the University of Freiburg and I won the ChaLearn AutoML Challenge in 2015. I co-organized the workshop on neural architecture search at ICLR 2020 and ICLR 2021, and served as local chair for the AutoML Conference in 2023. I also regularly serve as a reviewer for ICML, ICLR, NeurIPS, TMLR, and JMLR.

Together with Giovanni Zappella, Arber Zela, and Jovita Lukasik, I run the virtual AutoML Seminar as part of the ELLIS units in Berlin and Freiburg. The seminar is a platform for discussing recent advances in automated machine learning, with regular bi-weekly online talks and discussions. A particular focus is giving aspiring young researchers the opportunity to present their work to a larger audience of renowned experts in the field.

I strongly believe in open source as a foundation for reproducible research. Throughout my career as a scientist, I have contributed to and maintained several open-source libraries, such as auto-sklearn, EmuKit, RoBO, and SyneTune. Additionally, I authored a chapter on hyperparameter optimization for the well-known open book “Dive into Deep Learning.”

news

Feb 01, 2024 I have been invited as a keynote speaker at the AutoML Conference 2024.
Jan 15, 2024 I will give a tutorial on hyperparameter search and neural architecture search at the MESS 2024 summer school.
Dec 12, 2023 I will give a talk at BLISS about Automated Machine Learning.
Jun 18, 2023 Our paper on Optimizing Hyperparameters with Conformal Quantile Regression got accepted at ICML 2023.
Feb 18, 2023 I will give a talk at the SIAM23 Minisymposia on Bayesian Optimization in the Real World.