====== Mathematics and Machine Learning Internship (Summer 2025) ======
  
===== Organization =====
  * **Organizer**: Diaaeldin Taha (''taha [at] mis [dot] mpg [dot] de'')
  * **Time and Location**: Two internship-wide organizational meetings will take place at **MPI MIS A3 01, Thursdays 11:15 - 12:45**. Instructions for accessing A3 01 are available [[https://labwiki.mis.mpg.de/lib/exe/fetch.php?media=a3_01_coordinates.pdf|here]]; doors will automatically open 15 minutes before the meetings. Subsequent meetings will be arranged individually with each project mentor.
  * **Moodle**: TBA
  * **Module Description**: Available [[https://labwiki.mis.mpg.de/lib/exe/fetch.php?media=math_and_ml_praktikum_description.pdf|here]].
  * **Course Plan Entry**: [[https://www.informatik.uni-leipzig.de/~stundenplan/modul.html#studium/MPI.MaML|MPI.MaML]]
  * **Study Programs**:
    * B.Sc. Informatik 6. Semester [Kernmodul]
    * B.Sc. Mathematik [Projektpraktikum]
    * M.Sc. Data Science 2. Semester [Wahlpflichtbereich Datenanalyse]
    * M.Sc. Informatik 2. Semester [Kernmodul]
    * Diplom Mathematik [Seminarschein]
  
===== Overview =====
  
This is the first iteration of the "Mathematics and Machine Learning Praktikum," organized jointly by the [[https://www.mis.mpg.de/|Max Planck Institute for Mathematics in the Sciences (MPI MIS)]], the [[https://scads.ai/|Center for Scalable Data Analytics and Artificial Intelligence (ScaDS.AI)]], and [[https://www.uni-leipzig.de/en|Leipzig University]]. The internship involves designing, analysing, and implementing algorithms and models at the intersection of mathematics and machine learning. Several projects will be offered, which are worked on in small groups of 1–participants.
  
The deliverables include:
===== Calendar =====
  
  * **Organizational meeting 1/2**: MPI MIS A3 01, Thu 10.04.2025, 11:15 - 12:45 (tentative)
  * **Organizational meeting 2/2**: MPI MIS A3 01, Thu 17.04.2025, 11:15 - 12:45 (tentative)
  * **Mid-semester presentations**: TBA
  * **End-of-semester presentations**: TBA
===== Topics (Tentative) =====
  
This list may be updated with more projects before the organizational meeting. Participants who want to propose projects that fit the scope of the internship can contact the organizer no later than the first organizational meeting.
  
==== Project 1: AI 4 Mathematics ====
**References**:
  * Davies, Alex, et al. "Advancing mathematics by guiding human intuition with AI." Nature 600.7887 (2021): 70-74.
  * Wagner, Adam. "Finding counterexamples with reinforcement learning." (2021).
  
==== Project 2: Causal Deep Learning ====
**Members**: TBA
  
**Description**: Whereas correlation is concerned with patterns between variables in data, causation is concerned with how changes in one variable influence another. While correlations can be learned directly from data, uncovering causal structure often requires subtle assumptions and careful reasoning. Causal deep learning aims to offer tools to navigate these challenges by combining data-driven deep models with causal discovery and effect estimation. In this project, participants will get familiar with the basics of causal deep learning, implement selected methods from recent literature, and gain hands-on experience reasoning about cause and effect in data.
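To make the contrast between correlation and causation concrete, here is a minimal, self-contained sketch (not part of the project materials; the simulated model, coefficients, and sample size are all illustrative). In a simulated structural causal model with a confounder, the naive regression slope of the outcome on the treatment is biased, while including the confounder in the regression (a simple backdoor adjustment) recovers the true causal effect.

```python
import numpy as np

# Simulated structural causal model: Z -> T, Z -> Y, T -> Y.
# The true causal effect of T on Y is 2.0; the confounder Z
# biases the naive association between T and Y.
rng = np.random.default_rng(0)
n = 100_000
z = rng.normal(size=n)                       # confounder
t = 1.5 * z + rng.normal(size=n)             # treatment
y = 2.0 * t + 3.0 * z + rng.normal(size=n)   # outcome

# Naive estimate: OLS slope of y on t (with intercept), ignoring z.
X_naive = np.column_stack([np.ones(n), t])
naive = np.linalg.lstsq(X_naive, y, rcond=None)[0][1]

# Adjusted estimate: include the confounder z (backdoor adjustment).
X_adj = np.column_stack([np.ones(n), t, z])
adjusted = np.linalg.lstsq(X_adj, y, rcond=None)[0][1]

print(f"naive slope:    {naive:.2f}")     # biased away from 2.0
print(f"adjusted slope: {adjusted:.2f}")  # close to the true effect 2.0
```

Here the correct adjustment set is known by construction; in practice, deciding *which* variables to adjust for is exactly where causal discovery and careful reasoning come in.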
  
**Prerequisites**:
  * Kaddour, Jean, et al. "Causal machine learning: A survey and open problems." arXiv preprint arXiv:2206.15475 (2022).
  
==== Project 3: Optimization Landscapes in Machine Learning ====

**Mentor**: Diaaeldin Taha

**Members**: TBA

**Description**: Modern machine learning models are trained by optimizing highly non-convex loss functions, yet in practice, simple gradient-based methods often work remarkably well. This project investigates the geometry of these optimization landscapes: how structure, symmetry, and overparameterization shape the behavior of gradient descent. Participants will explore recent theoretical and empirical work connecting optimization dynamics to generalization and model performance. The goal is to implement simple model families, visualize their loss surfaces, and analyze how different training regimes (e.g., width, initialization, learning rate) interact with the landscape geometry.

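As a small taste of the kind of experiment involved, the following sketch evaluates a tiny network's loss along a random direction in parameter space, producing a 1-D slice of the loss surface. The model size, data, and per-tensor rescaling of the direction are illustrative choices, loosely in the spirit of the filter normalization of Li et al. (2018), not a prescribed project setup.

```python
import numpy as np

# Toy 1-D regression data and a tiny two-layer tanh network.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(64, 1))
y = np.sin(3 * x)

def loss(params):
    """Mean squared error of the two-layer net with the given parameters."""
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    return float(np.mean((pred - y) ** 2))

# Random initialization of the parameters.
shapes = [(1, 16), (16,), (16, 1), (1,)]
params = [rng.normal(scale=0.5, size=s) for s in shapes]

# A random direction in parameter space, rescaled per tensor to match
# the norm of the corresponding parameter tensor (cf. Li et al., 2018).
direction = [rng.normal(size=s) for s in shapes]
direction = [d * np.linalg.norm(p) / (np.linalg.norm(d) + 1e-12)
             for d, p in zip(direction, params)]

# Evaluate the loss along the 1-D slice  alpha -> L(theta + alpha * d).
alphas = np.linspace(-1.0, 1.0, 21)
slice_values = [loss([p + a * d for p, d in zip(params, direction)])
                for a in alphas]

for a, l in zip(alphas[::5], slice_values[::5]):
    print(f"alpha = {a:+.1f}   loss = {l:.3f}")
```

Repeating this with two random directions instead of one gives the familiar 2-D loss-surface contour plots from the literature.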
**Prerequisites**:
  * Familiarity with gradient descent and basic optimization theory.
  * Curiosity about the interplay between learning dynamics and mathematical structure.

**References**:
  * Li, Hao, et al. "Visualizing the loss landscape of neural nets." NeurIPS (2018).
  * Sagun, Levent, et al. "Eigenvalues of the Hessian in deep learning: Singularity and beyond." arXiv preprint arXiv:1611.07476 (2016).
  * Chizat, Lénaïc, and Francis Bach. "On the global convergence of gradient descent for over-parameterized models using optimal transport." NeurIPS (2018).
  * Fort, Stanislav, et al. "Deep learning versus kernel learning: an empirical study of loss landscape geometry and the time evolution of the neural tangent kernel." NeurIPS (2020).

==== Project 4: Learning on Manifolds ====

**Mentor**: Diaaeldin Taha

**Members**: TBA

**Description**: In many real-world applications, data lie on non-Euclidean spaces, such as spheres, tori, or hyperbolic surfaces, rather than in flat, high-dimensional vector spaces. This project explores how to model such data: participants will implement models that respect or exploit the underlying geometric structure (e.g., Riemannian gradient descent, manifold-aware neural networks, or geodesic convolution). Depending on interest, the project can lean toward visualization or toward other machine learning problems.

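As a minimal illustration of manifold-aware optimization (the matrix, step size, and iteration count below are illustrative, not project specifications): Riemannian gradient descent on the unit sphere minimizes f(x) = xᵀAx by projecting the Euclidean gradient onto the tangent space at the current point and retracting back onto the sphere by normalization. For a symmetric A, the minimizer is a unit eigenvector for the smallest eigenvalue.

```python
import numpy as np

# Minimize f(x) = x^T A x over the unit sphere in R^3.
# For this diagonal A, the minimum value is 1.0, attained at +/- e3.
A = np.diag([3.0, 2.0, 1.0])

rng = np.random.default_rng(0)
x = rng.normal(size=3)
x /= np.linalg.norm(x)          # start on the sphere

step = 0.1
for _ in range(500):
    egrad = 2 * A @ x                    # Euclidean gradient of f
    rgrad = egrad - (x @ egrad) * x      # project onto tangent space at x
    x = x - step * rgrad                 # move along the tangent direction
    x /= np.linalg.norm(x)               # retraction: renormalize onto sphere

print("x =", np.round(x, 3), "  f(x) =", round(float(x @ A @ x), 3))
```

The projection-then-retract pattern is the core idea behind the Riemannian optimizers in the references below; libraries generalize it to other manifolds by swapping in the appropriate tangent projection and retraction.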
**Prerequisites**:
  * Some exposure to differential geometry or the calculus of curves and surfaces is a bonus.
  * Interest in geometry, optimization, or non-Euclidean machine learning.

**References**:
  * Bécigneul, Gary, and Octavian-Eugen Ganea. "Riemannian adaptive optimization methods." ICLR (2019).
  * Bronstein, Michael M., et al. "Geometric deep learning: Grids, groups, graphs, geodesics, and gauges." arXiv preprint arXiv:2104.13478 (2021).
  * Sanborn, Sophia, et al. "Beyond Euclid: An illustrated guide to modern machine learning with geometric, topological, and algebraic structures." arXiv preprint arXiv:2407.09468 (2024).
 • mathematics_machine_learning_internship_summer_2025
 • Last modified: 2025/04/02 14:46
 • by Diaaeldin Taha