Welcome to the GitHub repository for the summer 2019 school on Deep Learning for Computational Mathematics. In this repository, you will find all of the necessary files for the Monday, Tuesday, and Wednesday tutorials that accompany this summer school.
The purpose of this summer school is to introduce students in applied and computational mathematics to neural networks and deep learning. It will feature three days of lectures and hands-on tutorials, and be followed by a one-day workshop showcasing current research directions and applications, in particular those relating to computational science and engineering. Lectures will cover the foundational mathematics of deep learning, and the tutorials will expose students to their practical implementation in standard software (TensorFlow) on a variety of tasks, including image classification, function approximation, and restoration/superresolution.
This workshop is aimed at students in applied mathematics or related areas. It assumes no prior knowledge of neural networks. Experience in calculus, linear algebra, analysis, and numerical analysis is essential.
To check out this repository into your home directory on https://hdda2019.syzygy.ca/jupyter, please click the following link.
Monday's tutorial will focus on the basics of using the JupyterHub interface, the commands exposed to the user through magic keywords, and Python and its software packages for data science, and will conclude with a simple function-approximation model built with Google's TensorFlow package (a minimal sketch of such a model follows the schedule below).
- 01:30 pm - 03:00 pm: Tutorial - “Introduction to Neural Networks with TensorFlow”
- 03:00 pm - 03:30 pm: Coffee break
- 03:30 pm - 05:00 pm: Tutorial - “Introduction to Neural Networks with TensorFlow, continued”
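
The function-approximation example at the end of Monday's session is built in TensorFlow. The following is a minimal sketch of that kind of model, not the tutorial's actual notebook; the architecture, data, and hyperparameters are illustrative assumptions. It trains a small fully connected network to approximate sin(x).

```python
# A minimal sketch (illustrative, not the tutorial's notebook) of function
# approximation with TensorFlow's Keras API: a small fully connected network
# fit to noisy samples of y = sin(x).
import numpy as np
import tensorflow as tf

# Synthetic training data: noisy samples of sin(x) on [-pi, pi]
x_train = np.random.uniform(-np.pi, np.pi, size=(1000, 1)).astype(np.float32)
y_train = np.sin(x_train) + 0.05 * np.random.randn(1000, 1).astype(np.float32)

# Two hidden layers with ReLU activations; the layer widths are illustrative choices
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Mean-squared error is the natural loss for regression / function approximation
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, epochs=50, batch_size=32, verbose=0)

# Evaluate the fitted network on a fresh grid of points
x_test = np.linspace(-np.pi, np.pi, 200, dtype=np.float32).reshape(-1, 1)
y_pred = model.predict(x_test)
```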
Tuesday's tutorial will provide hands-on experience with some of the more advanced aspects of deep learning featured in the lectures. The goal of this session is to reinforce the concepts and related challenges covered in the day's lectures through several examples of applying deep learning to tasks in data science.
- 01:30 pm - 03:00 pm: Tutorial - “Data-driven Modelling with TensorFlow, Part I”
- 03:00 pm - 03:30 pm: Coffee break
- 03:30 pm - 05:00 pm: Tutorial - “Data-driven Modelling with TensorFlow, Part I, continued”
Wednesday's tutorials will continue Tuesday's theme, covering some of the more interesting aspects of approximation with deep neural networks, issues with convergence, and methods of regularization (a small regularization sketch follows the schedule below).
- 01:30 pm - 03:00 pm: Tutorial - “Data-driven Modelling with TensorFlow, Part II”
- 03:00 pm - 03:30 pm: Coffee break
- 03:30 pm - 05:00 pm: Tutorial - “Data-driven Modelling with TensorFlow, Part II, continued”
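
As a small taste of the regularization methods mentioned above, the sketch below adds L2 weight penalties and dropout to a Keras model. It is illustrative only; the penalty strength, dropout rate, and layer widths are assumptions, not values taken from the tutorial.

```python
# A minimal sketch (illustrative, not the tutorial's notebook) of two common
# regularization techniques in TensorFlow/Keras: L2 weight decay and dropout.
import tensorflow as tf

l2 = tf.keras.regularizers.l2(1e-4)  # penalty strength chosen for illustration

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=l2,
                          input_shape=(1,)),
    tf.keras.layers.Dropout(0.2),    # randomly zero 20% of activations during training
    tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=l2),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1),
])

# The L2 penalties are added to the training loss automatically at compile time
model.compile(optimizer="adam", loss="mse")
```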
Confirmed Speakers
Talk 1 - Max Libbrecht (SFU), Understanding human gene regulation using deep neural networks.
Talk 2 - Paul Tupper (SFU), Which Learning Algorithms Can Generalize Identity Effects to Novel Inputs?
Talk 3 - Aaron Berk (UBC), A deep learning approach to retinal fundus imaging.
Talk 4 - Ben Adcock (SFU), Instabilities in deep learning.
- 09:30 am - 10:15 am: Talk 1
- 10:15 am - 10:45 am: Coffee break
- 10:45 am - 11:30 am: Talk 2
- 11:30 am - 01:45 pm: Lunch break
- 01:45 pm - 02:30 pm: Talk 3
- 02:30 pm - 03:15 pm: Talk 4