
Max Zimmer

Fourth-year PhD candidate in Mathematics at TU Berlin

Research Area Lead of iol.LEARN at the IOL Lab of Zuse Institute Berlin
Advisor: Prof. Dr. Sebastian Pokutta

Research: My research focuses on Deep Learning, Efficient ML, and Optimization. Specifically, I work on enhancing the efficiency of large Neural Networks through sparsity, pruning, quantization, and low-rank optimization. I am also interested in Federated Learning, Explainability & Fairness, and applying ML to problems in pure mathematics. Please take a look at my list of publications and feel free to reach out with questions or for potential collaborations!

Previously: Before joining IOL as a student researcher in 2020, I worked on Nash Flows over Time with Leon Sering in the COGA Group at TU Berlin. During my BSc and MSc in Mathematics at TU Berlin, I had the chance to intern in the research groups of Prof. Sergio de Rosa at Università degli Studi di Napoli Federico II and Prof. Marco Mondelli at IST Austria. Since 2022, I have been a member of the Berlin Mathematical School (BMS), part of the MATH+ Cluster of Excellence. You can find my full CV here.

latest news [see all]

11/2024 Our work on the Hadwiger-Nelson problem has been featured in the Spanish newspaper El País!
10/2024 Happy to share that I am now the Research Area Lead of the Machine Learning subgroup of IOL, iol.LEARN, alongside Sai Ganesh Nagarajan!
07/2024 Join our team in Berlin - We are seeking highly motivated PhD students to work on (efficient) Deep Learning, preferably with a strong math/CS background and PyTorch experience. I am happy to answer questions via email or at ICML 2024 in Vienna! Apply directly here!

selected publications [see all]

  1. Estimating Canopy Height at Scale
    Jan Pauls, Max Zimmer, Una M. Kelly, Martin Schwartz, Sassan Saatchi, Philippe Ciais, Sebastian Pokutta, Martin Brandt, and Fabian Gieseke
    In Forty-first International Conference on Machine Learning, 2024
  2. Extending the Continuum of Six-Colorings
    Geombinatorics Quarterly, 2024
  3. Neural Parameter Regression for Explicit Representations of PDE Solution Operators
    Konrad Mundinger, Max Zimmer, and Sebastian Pokutta
    ICLR 2024 Workshop on AI4DifferentialEquations in Science, 2024
  4. On the Byzantine-Resilience of Distillation-Based Federated Learning
    Christophe Roux, Max Zimmer, and Sebastian Pokutta
    arXiv preprint arXiv:2402.12265, 2024
  5. PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs
    Max Zimmer, Megi Andoni, Christoph Spiegel, and Sebastian Pokutta
    arXiv preprint arXiv:2312.15230, 2023
  6. Sparse Model Soups: A Recipe for Improved Pruning via Model Averaging
    Max Zimmer, Christoph Spiegel, and Sebastian Pokutta
    In International Conference on Learning Representations, 2024
  7. Interpretability Guarantees with Merlin-Arthur Classifiers
    In International Conference on Artificial Intelligence and Statistics, 2024
  8. How I Learned To Stop Worrying And Love Retraining
    Max Zimmer, Christoph Spiegel, and Sebastian Pokutta
    In International Conference on Learning Representations, 2023
  9. Compression-aware Training of Neural Networks using Frank-Wolfe
    Max Zimmer, Christoph Spiegel, and Sebastian Pokutta
    arXiv preprint arXiv:2205.11921, 2022
  10. Deep Neural Network Training with Frank-Wolfe
    Sebastian Pokutta, Christoph Spiegel, and Max Zimmer
    arXiv preprint arXiv:2010.07243, 2020