Proseminar Differentiable Programming
Differentiable programming is the idea that many programs describe differentiable functions whose derivatives can be computed algorithmically through automatic differentiation. Combined with gradient-based optimization, this yields a powerful method for learning parameters, generalizing machine-learning approaches from neural networks to arbitrary differentiable programs. In this seminar we take differentiable programming as a starting point to explore related ideas in optimization and automatic differentiation, as well as applications in control, physics, simulation, and computer graphics, among others.
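As a minimal sketch of this idea (using JAX; the toy model, data, and learning rate below are illustrative assumptions, not seminar material), automatic differentiation turns a program into its derivative, which gradient descent then uses to fit parameters:

    import jax
    import jax.numpy as jnp

    # An arbitrary differentiable "program": here, the mean squared error
    # of a tiny linear model. Any composition of differentiable operations works.
    def loss(params, x, y):
        w, b = params
        return jnp.mean((w * x + b - y) ** 2)

    # Automatic differentiation: jax.grad computes d(loss)/d(params) algorithmically.
    grad_loss = jax.grad(loss)

    # Plain gradient descent on the parameters.
    params = (jnp.array(0.0), jnp.array(0.0))
    x = jnp.array([0.0, 1.0, 2.0, 3.0])
    y = 2.0 * x + 1.0  # data generated by w = 2, b = 1

    for _ in range(200):
        g = grad_loss(params, x, y)
        params = tuple(p - 0.1 * gp for p, gp in zip(params, g))

    print(params)  # converges towards (2.0, 1.0)

The same recipe applies unchanged when the "program" is, for example, a controller, a physics simulator, or a renderer, which is what connects the applications listed below.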
The proseminar is divided into four areas: optimization, automatic differentiation, applications, and advanced topics. Below is a list of papers we intend to discuss from each of these areas. When signing up, please select your area of interest. We will then randomly distribute the spots within each area, and you will be assigned one of the talks from that area to research and present. The papers in the advanced topics area were chosen to be a serious challenge; before signing up, please make sure you are happy to read them. Advanced topics are likely to require quite a bit of background reading and are expected to be a significantly bigger time commitment than the rest. (Some more help from us can be expected, too.)
Structure:
- Two initial meetings at the beginning of the semester, with an introductory talk on differentiable programming and some useful tips for the seminar
- Weekly meetings with your presentations towards the end of the semester (roughly the last five weeks)
- Two preparation meetings with one of the organizers. The first should take place early on, once you have become more familiar with your topic, and the second about two weeks before your talk, to discuss your slides and handout.
Requirements:
- Giving a 25-minute presentation on your topic to the seminar audience.
- Submitting a 2-page handout for your talk, which will be distributed to the seminar audience.
Language: The seminar will be held in English. If you are very interested but uncomfortable giving your presentation in English, exceptions are possible.
List of Papers:
- Optimization
- Topic 1: Gradient Descent and Nesterov’s fast gradient descent
- Topic 2: Stochastic gradient descent (Chapter 14)
- Topic 3: ‘Adam: A Method for Stochastic Optimization’
- Automatic Differentiation
- Topic 4: Automatic Differentiation
- Topic 5: Implementations of AD
- Topic 6: ‘On Correctness of Automatic Differentiation for Non-Differentiable Functions’
- Applications
- Topic 7: ‘Differentiable MPC for End-to-end Planning and Control’
- Topic 8: ‘SATNet: Bridging deep learning and logical reasoning using a differentiable satisfiability solver’
- Topic 9: ‘Differentiable Monte Carlo ray tracing through edge sampling’
- Topic 10: ‘Learning to Control PDEs with Differentiable Physics’
- Advanced Topics
- Topic 11: ‘Reverse-mode AD in a functional framework: Lambda the ultimate backpropagator’
- Topic 12: ‘∂ is for Dialectica’
Dates:
- Tue, 18.04.2023 15:45 - 17:15, 50.34 Room 301
- Tue, 02.05.2023 15:45 - 17:15, 50.34 Room 301
- Wed, 05.07.2023 14:00 - 15:30, 50.34 Room 148
- Wed, 12.07.2023 14:00 - 15:30, 50.34 Room 148
- Wed, 26.07.2023 14:00 - 15:30, 50.34 Room 148