Table of Contents

Math and Machine Learning: Theory and Applications (Fall 2024)

Registration

Organization

Schedule

Week | Date | Time | Location | Speaker | Topic
Week 45 (2024) | Mon, 04.11.2024 | 15:00–16:30 | MIS, E2 10 | Diaaeldin Taha | Graph and Topological Neural Networks I
Week 46 (2024) | Mon, 11.11.2024 | 15:00–16:30 | MIS, E2 10 | Diaaeldin Taha | Graph and Topological Neural Networks II
Week 47 (2024) | Mon, 18.11.2024 | 15:00–16:30 | MIS, E2 10 | Parvaneh Joharinad | Group Equivariant Neural Networks I
Week 48 (2024) | Mon, 25.11.2024 | 15:00–16:30 | MIS, E2 10 | Parvaneh Joharinad | Group Equivariant Neural Networks II
Week 49 (2024) | Mon, 02.12.2024 | 15:00–16:30 | MIS, G3 10 | Nico Scherf | Deep Generative Models
Week 51 (2024) | Mon, 16.12.2024 | 15:00–16:30 | MIS, E2 10 | Jan Ewald | On the (Underestimated) Importance of Objective/Loss Functions
Week 3 (2025) | Mon, 13.01.2025 | 14:00–15:30 | MIS, E2 10 | Jan Ewald | Autoencoders and Their Variants for Biomedical Data
Week 4 (2025) | Mon, 20.01.2025 | 14:00–15:30 | MIS, E2 10 | Duc Luu | Learning Dynamical Systems I
Week 5 (2025) | Mon, 27.01.2025 | 14:00–15:30 | MIS, E2 10 | Duc Luu | Learning Dynamical Systems II
Week 6 (2025) | Mon, 03.02.2025 | 14:00–15:30 | MIS, E2 10 | Robert Haase | Large Language Models for Code Generation
Week 7 (2025) | Mon, 10.02.2025 | 14:00–15:30 | MIS, E2 10 | Guido Montufar | Foundations of Feature Learning I
Week 8 (2025) | Mon, 17.02.2025 | 14:00–15:30 | MIS, E2 10 | Guido Montufar | Foundations of Feature Learning II
Week 9 (2025) | Mon, 24.02.2025 | 14:00–15:30 | MIS, E2 10 | Paul Breiding | Computing with Varieties I
Week 10 (2025) | Mon, 03.03.2025 | 14:00–15:30 | MIS, E2 10 | Paul Breiding | Computing with Varieties II
Week 11 (2025) | Mon, 10.03.2025 | 14:00–15:30 | MIS, E2 10 | Angelica Torres | Varieties in Machine Learning I
Week 12 (2025) | Mon, 17.03.2025 | 14:00–15:30 | MIS, E2 10 | Angelica Torres | Varieties in Machine Learning II
Week 13 (2025) | Mon, 24.03.2025 | 14:00–15:30 | MIS, E2 10 | Marzieh Eidi | Geometric Machine Learning

Information

Weeks 45 & 46 (2024)

Speaker: Diaaeldin Taha (Max Planck Institute for Mathematics in the Sciences, Germany)

Description: In these two sessions, we will provide an overview of deep learning with a focus on graph and topological neural networks. We will begin by reviewing neural networks, parameter estimation, and the universal approximation theorem. Then, we will discuss graphs and motivate graph convolutional neural networks by tracing their origins from spectral filters in signal processing. Lastly, we will review recent progress in topological deep learning, particularly focusing on simplicial, cellular, and hypergraph neural networks as extensions of graph neural networks. We will assume a basic familiarity with linear algebra and calculus; all relevant concepts from graph theory and topology will be introduced.
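As a concrete illustration of the graph convolution operation mentioned above, here is a minimal sketch of a single GCN-style layer. It is not taken from the seminar materials; the function name, toy graph, and random weights are purely illustrative, and the layer implements the standard symmetrically normalised propagation rule H' = σ(D^{-1/2}(A + I)D^{-1/2} H W).

```python
# Illustrative sketch (not part of the seminar materials): one graph
# convolution layer with the symmetric normalisation
# H' = sigma(D^{-1/2} (A + I) D^{-1/2} H W).
import numpy as np

def gcn_layer(adjacency: np.ndarray, features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Normalise the adjacency, aggregate neighbour features,
    apply a linear map, and pass through a ReLU nonlinearity."""
    a_hat = adjacency + np.eye(adjacency.shape[0])       # add self-loops
    deg = a_hat.sum(axis=1)                              # node degrees of A + I
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))             # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt             # symmetric normalisation
    return np.maximum(a_norm @ features @ weights, 0.0)  # aggregate, transform, ReLU

# Toy example (hypothetical data): a path graph on 3 nodes with 2-d node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.random.randn(3, 2)   # node feature matrix
W = np.random.randn(2, 4)   # weight matrix (random here, learned in practice)
H = gcn_layer(A, X, W)      # new node representations, shape (3, 4)
```

Stacking a few such layers, with learned weights in place of the random ones, gives the basic graph convolutional network architecture that the sessions motivate from spectral filters.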

References:

Weeks 47 & 48 (2024)

Speaker: Parvaneh Joharinad

Week 49 (2024)

Speaker: Nico Scherf

Weeks 51 (2024) & 3 (2025)

Speaker: Jan Ewald

Weeks 4 & 5 (2025)

Speaker: Duc Luu

Week 6 (2025)

Speaker: Robert Haase

Weeks 7 & 8 (2025)

Speaker: Guido Montufar

Weeks 9 & 10 (2025)

Speaker: Paul Breiding

Weeks 11 & 12 (2025)

Speaker: Angelica Torres

Week 13 (2025)

Speaker: Marzieh Eidi