
Max Zimmer
4th-year PhD candidate in Mathematics at TU Berlin
Advisor: Prof. Dr. Sebastian Pokutta
Research: My research primarily focuses on improving the compute and memory efficiency of large Neural Networks through techniques such as sparsity, pruning, and quantization. While this work aims to make Deep Learning more sustainable, I also leverage large models to address sustainability challenges, such as creating global canopy height maps for monitoring deforestation and forest degradation. I am further interested in using Deep Learning for scientific discovery, particularly for solving problems in pure mathematics. I have also worked on Federated Learning, Interpretability, and Optimization. Please take a look at my list of publications and feel free to reach out with questions or potential collaborations!
Previously: Before joining IOL as a student researcher in 2020, I worked on combinatorial optimization problems with Leon Sering at the COGA Group at TU Berlin. During my BSc and MSc in Mathematics at TU Berlin, I had the opportunity to intern in the research groups of Prof. Sergio de Rosa at Università degli Studi di Napoli Federico II and Prof. Marco Mondelli at IST Austria. Since 2022, I have been a member of the BMS graduate school, part of the MATH+ Cluster of Excellence. You can find my full CV here.
latest news [see all]
02/2025 | Two new preprints on arXiv!
01/2025 | "On the Byzantine-Resilience of Distillation-Based Federated Learning" has been accepted at ICLR 2025!
11/2024 | Our work on the Hadwiger-Nelson problem has been featured in the Spanish newspaper El País!
selected publications [see all]
- Workshop: ICLR 2025 Workshop on Sparsity in LLMs (SLLM), 2025
- Preprint: arXiv:2502.17066, 2025
- Preprint: arXiv:2501.19328, 2025
- Preprint: arXiv:2501.18527, 2025
- ICML 2024: Forty-first International Conference on Machine Learning, 2024
- Workshop: ICLR 2024 Workshop on AI4DifferentialEquations in Science, 2024
- Preprint: arXiv:2312.15230, 2023
- Preprint: arXiv:2205.11921, 2022