
Max Zimmer

Third-year PhD candidate in Mathematics at TU Berlin

Doctoral Researcher at the IOL Lab of Zuse Institute Berlin (ZIB)
Advisor: Prof. Dr. Sebastian Pokutta

Research: My research interests center on Machine Learning and Optimization. In particular, I am interested in Compression of Neural Networks, Explainability (XAI), Fairness in AI, and first-order optimization methods.

Previously: Before joining IOL as a student researcher in 2020, I worked on Nash Flows over Time with Leon Sering at the COGA Group at TU Berlin. During my BSc and MSc in Mathematics at TU Berlin, I got the chance to intern in the research groups of Prof. Sergio de Rosa at Università degli Studi di Napoli Federico II and Prof. Marco Mondelli at IST Austria. Since 2022, I have been a member of the BMS graduate school, part of the MATH+ Cluster of Excellence.

04/2024 Our group was featured on German local TV regarding the use of generative AI! You can watch the interview with Sebastian and me here (in German, available until 04/2026).
02/2024 Two papers accepted as conference posters: Sparse Model Soups: A Recipe for Improved Pruning via Model Averaging at ICLR 2024 and Interpretability Guarantees with Merlin-Arthur Classifiers at AISTATS 2024! I will upload the camera-ready versions soon and will be in Vienna for ICLR.
12/2023 Our new preprint is on arXiv: PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs!
07/2023 We published a new paper: Sparse Model Soups: A Recipe for Improved Pruning via Model Averaging! You can find the corresponding blog post here.
05/2023 I presented our poster “How I Learned to Stop Worrying and Love Retraining” at ICLR 2023 in Kigali, Rwanda! You can find all the details in the list of publications below.
  1. Extending the Continuum of Six-Colorings
    arXiv preprint arXiv:2404.05509, 2024
  2. Neural Parameter Regression for Explicit Representations of PDE Solution Operators
    Konrad Mundinger, Max Zimmer, and Sebastian Pokutta
    ICLR 2024 Workshop on AI4DifferentialEquations In Science, 2024
  3. On the Byzantine-Resilience of Distillation-Based Federated Learning
    Christophe Roux, Max Zimmer, and Sebastian Pokutta
    arXiv preprint arXiv:2402.12265, 2024
  4. PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs
    Max Zimmer, Megi Andoni, Christoph Spiegel, and Sebastian Pokutta
    arXiv preprint arXiv:2312.15230, 2023
  5. Sparse Model Soups: A Recipe for Improved Pruning via Model Averaging
    Max Zimmer, Christoph Spiegel, and Sebastian Pokutta
    In International Conference on Learning Representations, 2024
  6. How I Learned To Stop Worrying And Love Retraining
    Max Zimmer, Christoph Spiegel, and Sebastian Pokutta
    In International Conference on Learning Representations, 2023
  7. Compression-aware Training of Neural Networks using Frank-Wolfe
    Max Zimmer, Christoph Spiegel, and Sebastian Pokutta
    arXiv preprint arXiv:2205.11921, 2022
  8. Interpretability Guarantees with Merlin-Arthur Classifiers
    In International Conference on Artificial Intelligence and Statistics, 2024
  9. Flows over time as continuous limits of packet-based network simulations
    Theresa Ziemke, Leon Sering, Laura Vargas Koch, Max Zimmer, Kai Nagel, and Martin Skutella
    Transportation Research Procedia, 2021
  10. Deep Neural Network training with Frank-Wolfe
    Sebastian Pokutta, Christoph Spiegel, and Max Zimmer
    arXiv preprint arXiv:2010.07243, 2020