- 📚 Godoy, Daniel. Deep Learning with PyTorch Step-by-Step: A Beginner's Guide. [Link]
- 📚 Tam, Adrian. Deep Learning with PyTorch. [Link]
- 📚 Cristina, Stefania; Saeed, Mehreen. Building Transformer Models with Attention. [Link]
- 📚 Huyen, Chip. Designing Machine Learning Systems. [Link]
- Detailed breakdown of the course structure and content, exploring various aspects and applications of Machine Learning.
- Motivation, syllabus, and course logistics.
- 🎉 GitHub Education Benefits
- GitHub Education Pro: Get access to the GitHub Education Pro pack by visiting GitHub Education
- 📖 Learning Resources
- GitHub Learning Game: Check out the interactive Git learning game at GitHub Learning Game
- Basic Python: Enhance your Python skills through the Kaggle Python course.
- AI Python for Beginners: Learn Python programming fundamentals and how to integrate AI tools for data manipulation, analysis, and visualization. Taught by Andrew Ng.
Week 02: Machine Learning Fundamentals
- Motivation: how advances in Machine Learning are helping bridge the gap between AI's current capabilities and human cognitive abilities, highlighting limitations and future directions for AI systems.
- Overview of Machine Learning fundamentals, including an exploration of semi-supervised learning, active learning, and weak supervision.
- Discussion on Moravec's Paradox: examining the difference in cognitive complexity between tasks easily handled by AI versus tasks natural to humans.
- Self-supervised learning: Introduction to pretext tasks, where models are trained on unlabeled data, and their application in Natural Language Processing (NLP).
Key Concepts:
- Semi-supervised Learning: Training a model using both labeled and unlabeled data.
- Active Learning: The model selects the most informative unlabeled samples and queries a human to label them, improving accuracy with fewer labels.
- Weak Supervision: Using weakly labeled data generated through heuristics or external knowledge sources.
- Self-Supervised Learning: Training models on pretext tasks to build representations from unlabeled data, with applications in NLP.
Week 03: Visualizing Gradient Descent
- In this week's lesson, we explore the Gradient Descent algorithm, a fundamental method for optimizing machine learning models. The focus is on understanding how gradient descent works and its application in training a linear regression model. We also examine the use of PyTorch for implementing these concepts, visualizing the steps, and critically evaluating key aspects of gradient-based optimization.
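The loop described above can be sketched in a few lines of PyTorch; the synthetic data (y = 1 + 2x plus noise), seed, and learning rate are illustrative choices, not part of the lesson:

```python
import torch

# Synthetic data for y = 1 + 2x with a little Gaussian noise
torch.manual_seed(42)
x = torch.rand(100, 1)
y = 1 + 2 * x + 0.1 * torch.randn(100, 1)

# Randomly initialized parameters, tracked by autograd
b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)

lr = 0.1
for epoch in range(1000):
    yhat = b + w * x                  # forward pass
    loss = ((yhat - y) ** 2).mean()   # MSE loss
    loss.backward()                   # compute gradients
    with torch.no_grad():             # gradient descent update
        b -= lr * b.grad
        w -= lr * w.grad
    b.grad.zero_()                    # reset gradients for next step
    w.grad.zero_()

print(b.item(), w.item())  # approaches the true values 1 and 2
```

Logging `loss` across epochs (or plotting the loss surface over a grid of `b`, `w` values) is a simple way to visualize the descent path discussed in the lesson.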
Week 04: Rethinking the training loop: a simple classification problem
- Rethinking the training loop:
- build a function to perform training steps, implement our own dataset class, use data loaders to generate mini-batches
- build a function to perform mini-batch gradient descent, evaluate our model
- save / checkpoint our model to disk
- load our model from disk to resume training or to deploy
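A minimal sketch of these pieces together, assuming a toy linear-regression setup; the function names and the `checkpoint.pt` filename are illustrative, not the course's exact code:

```python
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader

def make_train_step_fn(model, loss_fn, optimizer):
    # Higher-order function: returns a function that performs
    # one training step on a single mini-batch
    def train_step(x, y):
        model.train()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        return loss.item()
    return train_step

# Toy dataset and a DataLoader that yields mini-batches
torch.manual_seed(13)
x = torch.rand(64, 1)
y = 1 + 2 * x
loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

model = nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
train_step = make_train_step_fn(model, nn.MSELoss(), optimizer)

# Mini-batch gradient descent
for epoch in range(50):
    for x_batch, y_batch in loader:
        loss = train_step(x_batch, y_batch)

# Checkpoint to disk, then restore (to resume training or deploy)
torch.save({"model": model.state_dict(),
            "optimizer": optimizer.state_dict()}, "checkpoint.pt")
checkpoint = torch.load("checkpoint.pt")
model.load_state_dict(checkpoint["model"])
optimizer.load_state_dict(checkpoint["optimizer"])
```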
- Going Classy:
- define a class to handle model training
- implement the constructor method
- understand the difference between public, protected, and private methods of a class
- integrate the code we’ve developed so far into our class
- instantiate our class and use it to run a classy pipeline
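One way the pieces above might fit together as a class; the `Trainer` name and its methods are hypothetical stand-ins, not the course's actual class:

```python
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader

class Trainer:
    def __init__(self, model, loss_fn, optimizer):
        # Constructor stores the three moving parts of training
        self.model = model
        self.loss_fn = loss_fn
        self.optimizer = optimizer
        self.losses = []

    def _train_step(self, x, y):
        # Single leading underscore: "protected" by Python convention
        self.model.train()
        loss = self.loss_fn(self.model(x), y)
        loss.backward()
        self.optimizer.step()
        self.optimizer.zero_grad()
        return loss.item()

    def train(self, loader, n_epochs):
        # Public entry point: runs the whole training pipeline
        for _ in range(n_epochs):
            for x, y in loader:
                self.losses.append(self._train_step(x, y))

# Instantiate the class and run a "classy" pipeline on toy data
torch.manual_seed(11)
x = torch.rand(64, 1)
y = 1 + 2 * x
loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)
model = nn.Linear(1, 1)
trainer = Trainer(model, nn.MSELoss(),
                  torch.optim.SGD(model.parameters(), lr=0.1))
trainer.train(loader, n_epochs=50)
```

Python has no enforced access control: a single underscore marks a method as protected by convention, while a double underscore (`__name`) triggers name mangling, the closest thing to private.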
- A simple classification problem:
- build a model for binary classification
- understand the concept of logits and how they relate to probabilities
- use binary cross-entropy loss to train a model
- weight the loss function to handle imbalanced datasets
- understand the concepts of decision boundary and separability
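A short sketch of the logit/probability relationship and a class-weighted binary cross-entropy loss; the `pos_weight` value of 3.0 is an arbitrary illustration of a 3:1 negative-to-positive imbalance:

```python
import torch
import torch.nn as nn

# A logit is the model's raw output; the sigmoid maps it to a
# probability in (0, 1): sigmoid(log(p / (1 - p))) = p
logit = torch.tensor([1.0986])   # log(0.75 / 0.25)
prob = torch.sigmoid(logit)      # ~0.75

# BCEWithLogitsLoss consumes logits directly (numerically more
# stable than sigmoid followed by BCELoss); pos_weight up-weights
# the positive class to compensate for imbalance
loss_fn = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([3.0]))
logits = torch.tensor([[0.5], [-1.2]])
labels = torch.tensor([[1.0], [0.0]])
loss = loss_fn(logits, labels)
```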
Week 05: Machine Learning and Computer Vision - Part I
- From a shallow to a deep-ish classification model:
- data generation for image classification
- transformations using torchvision
- dataset preparation techniques
- building and training logistic regression and deep neural network models using PyTorch
- focusing on various activation functions like Sigmoid, Tanh, and ReLU
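As a sketch, swapping activation functions in a small fully connected classifier is a one-argument change; the layer sizes below are illustrative, assuming flattened 28×28 inputs and 10 classes:

```python
import torch
import torch.nn as nn

# A "deep-ish" classifier for flattened 28x28 images; passing
# nn.Sigmoid, nn.Tanh, or nn.ReLU switches the nonlinearity
def build_model(activation=nn.ReLU):
    return nn.Sequential(
        nn.Flatten(),              # (N, 1, 28, 28) -> (N, 784)
        nn.Linear(28 * 28, 64),
        activation(),
        nn.Linear(64, 10),         # one logit per class
    )

model = build_model(nn.Tanh)
logits = model(torch.rand(4, 1, 28, 28))  # batch of 4 fake images
```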
Week 06: Machine Learning and Computer Vision - Part II
- Kernels and convolutions:
- In this lesson, we’ve introduced convolutions and related concepts and built a convolutional neural network to tackle a multiclass classification problem.
- Activation functions, pooling layers, flattening, LeNet-5
- Softmax, cross-entropy
- Visualizing the convolutional filters, feature maps, and classifier layers
- Hooks in PyTorch
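A forward hook, for instance, can capture a convolutional layer's feature maps for later visualization; the tiny LeNet-ish block below is an illustrative sketch, not the course's model:

```python
import torch
import torch.nn as nn

# A small convolutional block in the spirit of LeNet-5
model = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5),  # 28x28 -> 24x24, 6 filters
    nn.ReLU(),
    nn.MaxPool2d(2),                 # 24x24 -> 12x12
)

captured = {}

def hook(module, inputs, output):
    # Forward hooks fire after the module's forward pass;
    # stash the feature maps for visualization later
    captured["conv"] = output.detach()

handle = model[0].register_forward_hook(hook)
model(torch.rand(1, 1, 28, 28))      # one fake grayscale image
handle.remove()                      # remove the hook when done

print(captured["conv"].shape)        # torch.Size([1, 6, 24, 24])
```

Plotting each of the six captured channels with `matplotlib` (one subplot per filter) reproduces the feature-map visualizations discussed in the lesson.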