
ee046211-deep-learning


Technion ECE 046211 - Deep Learning

Tal Daniel, Daniel Soudry

Jupyter Notebook tutorials for the Technion's ECE 046211 course "Deep Learning"

Open In Colab | Open In NBViewer | Open In Binder

Student Projects Website | Video Tutorials (Winter 2024)

Agenda

| File | Topics Covered | Video |
|------|----------------|-------|
| `Setting Up The Working Environment.pdf` | Guide for installing Anaconda locally with Python 3 and PyTorch, integration with PyCharm, and using a GPU on Google Colab | - |
| `ee046211_tutorial_01_machine_learning_recap.ipynb/pdf` | Supervised and Unsupervised Learning, Model Evaluation, Bias-Variance Tradeoff, Feature Scaling, Linear Regression, Gradient Descent, Regularization (Ridge, LASSO) | Video Link |
| `ee046211_tutorial_02_single_neuron_recap.ipynb/pdf` | Discriminative Models, Perceptron, Logistic Regression (also in PyTorch), Softmax Regression, Activation Functions | Video Link |
| `ee046211_tutorial_03_optimization_gradient_descent.ipynb/pdf` | Unimodal Functions, Convexity, Hessian, Gradient Descent, SGD, Learning Rate, LR Scheduling/Annealing, Momentum, Nesterov Momentum, Adaptive Learning Rate Methods, Adagrad, RMSprop, Adam, AdaBelief, MADGRAD, Adan, Schedule-free Optimization (SGD, Adam) | Video Link - Part 1, Video Link - Part 2 |
| `ee046211_tutorial_04_differentiation_autograd.ipynb/pdf` | Lagrange Multipliers, Automatic Differentiation (AutoDiff) Forward Mode and Reverse Mode, PyTorch Autograd | Video Link |
| `ee046211_tutorial_05_multilayer_nn.ipynb/pdf` | Multi-Layer Perceptron (MLP), Backpropagation, Neural Networks in PyTorch, Weight Initialization - Xavier (Glorot), Kaiming (He), Deep Double Descent | Video Link |
| `ee046211_tutorial_06_convnets_visual_tasks.ipynb/pdf` | 2D Convolution (Cross-correlation), Convolution-based Classification, Convolutional Neural Networks (CNNs), Regularization and Overfitting, Dropout, Data Augmentation, the CIFAR-10 dataset, Visualizing Filters, Applications of CNNs, the problems with CNNs (adversarial attacks, poor generalization, fairness, undesirable biases) | Video Link - Part 1, Video Link - Part 2 |
| `ee046211_tutorial_07_sequential_tasks_rnn.ipynb/pdf` | Sequential Tasks, Natural Language Processing (NLP), Language Models, Perplexity, BLEU, Recurrent Neural Networks (RNN), Backpropagation Through Time (BPTT), Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), RWKV, xLSTM, Multi-head Self-Attention, Transformer, BERT and GPT, Teacher Forcing, torchtext, Sentiment Analysis, Transformer Warmup, Initialization, GLU Variants, Pre-norm and Post-norm, RMSNorm, SandwichNorm, ReZero, Rectified Adam (RAdam), Relative Positional Encoding/Embedding | Video Link - Part 1, Video Link - Part 2, Video Link - Part 3 |
| `ee046211_tutorial_08_training_methods.ipynb/pdf` | Feature Scaling, Normalization, Standardization, Batch Normalization, Layer Normalization, Instance Normalization, Group Normalization, Vanishing Gradients, Exploding Gradients, Skip-Connections, Residual Blocks, ResNet, DenseNet, U-Net, Hyper-parameter Tuning: Grid Search, Random Search, Bayesian Tuning, Optuna with PyTorch | Video Link, Video Link - Optuna Tutorial |
| `ee046211_tutorial_09_self_supervised_representation_learning.ipynb/pdf` | Transfer Learning, Domain Adaptation, Pre-trained Networks, Sim2Real, BERT, Low-rank Adaptation (LoRA), DoRA, Representation Learning, Self-Supervised Learning, Autoencoders, Contrastive Learning, Contrastive Predictive Coding (CPC), Simple Framework for Contrastive Learning of Visual Representations (SimCLR), Momentum Contrast (MoCo), Bootstrap Your Own Latent (BYOL), DINO, CLIP | Video Link - Part 1 - Transfer Learning, Video Link - Part 2 - Self-supervised Learning |
| `ee046211_tutorial_10_compression_pruning_amp.ipynb/pdf` | Resource Efficiency in DL, Automatic Mixed Precision (AMP), Quantization (Dynamic, Static), Quantization-Aware Training (QAT), LLM Quantization, Pruning, the Lottery Ticket Hypothesis | Video Link |
| `pytorch_maximize_cpu_gpu_utilization.ipynb/pdf` | Tips and Tricks for efficient coding in PyTorch, Maximizing CPU and GPU utilization, nvidia-smi, PyTorch Profiler, AMP, Multi-GPU training, HF Accelerate, RL libraries | Video Link |

Running The Notebooks

You can view the tutorials online, or download and run them locally.

Running Online

| Service | Usage |
|---------|-------|
| Jupyter Nbviewer | Render and view the notebooks (cannot edit) |
| Binder | Render, view, and edit the notebooks (limited time) |
| Google Colab | Render, view, edit, and save the notebooks to Google Drive (limited time) |

Jupyter Nbviewer:

nbviewer

Press the "Open in Colab" button below to use Google Colab:

Open In Colab

Or press the "launch binder" button below to launch in Binder:

Binder

Note: creating the Binder instance takes about 5-10 minutes, so be patient.

Running Locally

Press "Download ZIP" under the green button Clone or download or use git to clone the repository using the following command: git clone https://github.com/taldatech/ee046211-deep-learning.git (in cmd/PowerShell in Windows or in the Terminal in Linux/Mac)

Open the folder in Jupyter Notebook (it is recommended to use Anaconda). Installation instructions can be found in Setting Up The Working Environment.pdf.

Installation Instructions

For the complete guide, with step-by-step images, please consult Setting Up The Working Environment.pdf

  1. Get Anaconda with Python 3; follow the instructions for your OS (Windows/Mac/Linux) at https://www.anaconda.com/download
  2. Install the basic packages using the provided environment.yml file by running conda env create -f environment.yml, which creates a new conda environment named deep_learn. If you use this option, you only need to install PyTorch separately (see the table below).
  3. Alternatively, you can create a new environment for the course and install the packages from scratch: on Windows, open Anaconda Prompt from the Start menu; on Mac/Linux, open the terminal; then run conda create --name deep_learn. Full guide at https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#creating-an-environment-with-commands
  4. To activate the environment, open the terminal (or Anaconda Prompt in Windows) and run conda activate deep_learn. A quick verification snippet follows this list.
  5. Install the required libraries according to the table below (to search for a specific library and its corresponding install command, you can also look at https://anaconda.org/).
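
To confirm that the right environment is active before installing anything, a quick check like the following can help (a minimal sketch; the `CONDA_DEFAULT_ENV` variable is set by `conda activate`):

```python
# Quick check that the deep_learn environment is the active one.
# CONDA_DEFAULT_ENV is set by `conda activate`; sys.executable shows
# which Python interpreter is actually being used.
import os
import sys

print("Active conda env:", os.environ.get("CONDA_DEFAULT_ENV"))
print("Python executable:", sys.executable)
```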

Libraries to Install

| Library | Command to Run |
|---------|----------------|
| Jupyter Notebook | `conda install -c conda-forge notebook` |
| numpy | `conda install -c conda-forge numpy` |
| matplotlib | `conda install -c conda-forge matplotlib` |
| pandas | `conda install -c conda-forge pandas` |
| scipy | `conda install -c anaconda scipy` |
| scikit-learn | `conda install -c conda-forge scikit-learn` |
| seaborn | `conda install -c conda-forge seaborn` |
| tqdm | `conda install -c conda-forge tqdm` |
| opencv | `conda install -c conda-forge opencv` |
| optuna | `pip install optuna` |
| pytorch (cpu) | `conda install pytorch torchvision torchaudio cpuonly -c pytorch` (get the up-to-date command from PyTorch.org) |
| pytorch (gpu) | `conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia` (get the up-to-date command from PyTorch.org) |
| torchtext | `conda install -c pytorch torchtext` |
| torchdata | `conda install -c pytorch torchdata` + `pip install portalocker` |
  6. To open the notebooks, open Anaconda Navigator or run jupyter notebook in the terminal (or Anaconda Prompt in Windows) while the deep_learn environment is activated.
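
After completing the steps above, a short sanity check can confirm that the core libraries import and whether PyTorch sees a GPU (a minimal sketch, assuming the deep_learn environment is active; not part of the course materials):

```python
# Sanity check for the installation: import the core libraries and
# report whether PyTorch can see a CUDA GPU (False is expected for
# the CPU-only build).
import numpy as np
import torch

print("NumPy:", np.__version__)
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```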
