Mathematics of Deep Learning
BERLIN, 19 - 30 August 2019, at Zuse Institute Berlin
Deep learning (DL) methodologies are currently showing tremendous success in a variety of applications, and in many cases DL-based methods outperform traditional approaches by far. At the same time, they lack a rigorous mathematical foundation. Recently, various researchers in the mathematical community have begun developing mathematical theories for DL from different angles: approaches analyzing the abstract approximation power of deep neural networks, approaches to understanding the convergence of the numerical minimization methods used in DL, and mathematical frameworks for convolutional neural networks such as the scattering transform and convolutional sparse coding, the latter opening a door to the methodology of compressed sensing.
The summer school will offer lectures both on the theory of deep neural networks and related questions such as generalization, expressivity, and explainability, and on applications of deep neural networks (e.g. to PDEs, inverse problems, or specific real-world problems).
The school runs for two weeks: the first week is devoted to the theory of deep neural networks, and the second week focuses on applications. The format centers on 1.5-hour lectures by international experts.
You can download the slides of the talks here once the speakers have provided them to the BMS: Summer School Talks (password protected area)
Leonid Berlyand (Penn State): PDE techniques in deep learning: convergence & stability of neural net classifiers
Taco Cohen (Qualcomm): learning of equivariant representations for data-efficient deep learning, medical imaging
François Fleuret (IDIAP, EPF Lausanne): statistical learning techniques mainly for computer vision
Eldad Haber (University of British Columbia): Using PDEs for designing stable DNN architectures
Robert Jenssen (Tromsø): next-generation machine learning and data analytics methodology, health data analytics
Andreas Krause (ETH Zurich): Large-scale Machine Learning, Probabilistic Modeling and Inference, Sequential Decision Making, Crowds, Learning and Incentives
Gitta Kutyniok (TU Berlin): Theory of DL, Explainability, Applications to Inverse Problems
Ben Leimkuhler (U Edinburgh): MD, Bayesian parameterisation of complex models
Klaus-Robert Müller (TU Berlin): Machine Learning
Frank Noé (FU Berlin)
Christof Schütte (FU Berlin, ZIB): Applications of DL to MD
Vladimir Spokoiny (HU Berlin, WIAS)
René Vidal (Johns Hopkins University): Optimization and DNNs
The application deadline was 8 April 2019. There are no more free spots.
There is no registration fee.
Funding and Reimbursement
Participants who receive partial funding from the BMS for their travel and/or accommodation costs will need to submit their original receipts and boarding passes to the organizers and provide the following information: home address, date of birth, and bank account details. All payments will be made after the summer school has taken place. More detailed information will be provided via email in July.
How to get there
Berlin has two airports: Tegel (TXL) and Schönefeld (SXF). TXL is closer to the city, but both airports are easily accessible by public transport. You can find detailed information for your journey on this website: www.bvg.de/en
A weekly ticket for fare zones A and B costs €30. The ZIB is located in zone B.
The Konrad Zuse Institute Berlin (ZIB) is located on the campus of the Freie Universität Berlin in Dahlem. It can be reached by U-Bahn line U3 (station: Dahlem-Dorf) and bus X83 (stop: Arnimallee). Bus X83 departs from the U-/S-Bahn station Rathaus Steglitz (U9, S1).
The ZIB is accessible via two footpaths from Arnimallee 6 and 10. Takustraße is closed due to construction. You can find a map here: https://www.zib.de/sites/default/files/page_attachments/Plan_new_0.png
Coffee breaks will be provided.