Programme outline
Learning objectives
- Apply the TDIOML (task, dataset, input, output, model, loss) framework to define a typical machine learning problem.
- Implement neural networks using the PyTorch framework to simplify model development.
- Understand and implement the forward and backward propagation mechanisms that are used to evaluate and train neural networks.
- Tune neural networks by implementing advanced initialisers, optimisers, loss functions and regularisation.
- Understand the concepts of generalisation, over- and underfitting, and exploding and vanishing gradients.
- Implement good professional practices for training and evaluating deep learning models, such as train/validation/test splits and saver/loader functions.
Day 1
- Introduction to the course and mathematical reminders (linear algebra, probability, machine learning)
- Bringing all students up to speed on the mathematical operations required for this class. Connecting to the previous machine learning course, with a full implementation in NumPy. Introducing the softmax function and relating it to the linear regression model (see the softmax sketch after this list).
- Overfitting, underfitting, generalisation and ways to control them
- Recognising over- and underfitting; loss functions and regularisation terms added to losses (a regularised-loss sketch follows this list).
- Implementing our first Shallow Neural Network
- Understanding backpropagation and how it is used to train models. Implementing backpropagation in NumPy for our Shallow Neural Network (see the backpropagation sketch after this list).
- The backpropagation algorithm and advanced optimisers
- Revisiting the backpropagation implementation in NumPy for our Shallow Neural Network. The No Free Lunch theorem and the Universal Approximation theorem. The Stochastic Gradient Descent algorithm, and the AdaGrad, RMSProp and Adam optimisers with their implementation in NumPy (see the Adam sketch after this list). Combining these concepts into effective optimisers.
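As a taste of the Day 1 NumPy work, here is a minimal sketch of a numerically stable softmax. The function name and test values are illustrative, not taken from the course materials.

```python
import numpy as np

def softmax(z):
    """Map a vector of logits to a probability distribution."""
    # Subtracting the max keeps exp() from overflowing without
    # changing the result (softmax is shift-invariant).
    shifted = z - np.max(z, axis=-1, keepdims=True)
    exp_z = np.exp(shifted)
    return exp_z / np.sum(exp_z, axis=-1, keepdims=True)

logits = np.array([2.0, 1.0, 0.1])
print(softmax(logits))          # probabilities summing to 1
print(softmax(logits).sum())    # 1.0
```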
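For the session on over- and underfitting, this sketch shows one common way to regularise a loss: adding an L2 penalty to a mean-squared-error loss. All names and values are illustrative assumptions.

```python
import numpy as np

def mse_loss(y_pred, y_true):
    return np.mean((y_pred - y_true) ** 2)

def l2_penalty(weights, lam):
    # lam controls the strength of the regularisation: larger values
    # shrink the weights harder, which helps fight overfitting.
    return lam * np.sum(weights ** 2)

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))                     # toy weight matrix
y_pred, y_true = rng.normal(size=4), rng.normal(size=4)
total_loss = mse_loss(y_pred, y_true) + l2_penalty(W, lam=1e-2)
print(total_loss)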
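For the Shallow Neural Network sessions, a minimal sketch of the forward and backward passes for a one-hidden-layer network trained by gradient descent on a mean-squared-error loss. Shapes, names and hyperparameters are illustrative, not the course's code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))            # 32 samples, 4 features
y = rng.normal(size=(32, 1))            # toy regression targets

W1, b1 = rng.normal(size=(4, 8)) * 0.1, np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)) * 0.1, np.zeros(1)
lr = 0.1

for step in range(200):
    # Forward pass
    z1 = X @ W1 + b1
    h = np.tanh(z1)                     # hidden activations
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass (chain rule, layer by layer)
    dy = 2 * (y_hat - y) / len(X)       # dL/dy_hat
    dW2 = h.T @ dy
    db2 = dy.sum(axis=0)
    dh = dy @ W2.T
    dz1 = dh * (1 - np.tanh(z1) ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
```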
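For the optimiser session, a sketch of the Adam update rule in plain NumPy. The hyperparameter defaults follow the common values from the Adam paper; the toy objective and function names are illustrative.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient and its square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction compensates for the zero-initialised averages
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy use: minimise f(w) = (w - 3)^2, whose gradient is 2 (w - 3).
w = np.array(0.0)
m = v = np.zeros_like(w)
for t in range(1, 2001):
    grad = 2 * (w - 3.0)
    w, m, v = adam_step(w, grad, m, v, t, lr=0.01)
print(w)   # close to 3.0
```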
Day 2
- Breaking linearity and symmetries, other issues with neural networks, and good practices in deep learning
- The exploding/vanishing gradient problem. Using activation functions to break linearity and initialisers to break symmetries. Adding further performance metrics, implementing a saver/loader function for reproducibility, and implementing early stopping (an early-stopping sketch follows this list).
- The PyTorch library and the Tensor object
- Introducing the PyTorch library and the Tensor datatype. Basic operations on tensors, with practice exercises. Using PyTorch to rewrite our Shallow Neural Network and its forward method (see the PyTorch sketch after this list).
- Adding more features to our PyTorch Shallow Neural Network and using dataloaders
- Loss functions, backpropagation, optimisers, initialisers and regularisation in PyTorch. Writing custom Dataset and DataLoader objects for any data science project (a Dataset/DataLoader sketch follows this list). Demonstration on MNIST for multi-class classification.
- Quiz and lab session for implementing Shallow Neural Networks in PyTorch
- Lab session and assessment.
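For the good-practices session, a minimal sketch of the early-stopping logic: stop once the validation loss has not improved for a set number of epochs. The fake loss curve is illustrative; in practice each value would come from evaluating the model on the validation set.

```python
def early_stopping_epoch(val_losses, patience=3):
    """Return the index of the epoch at which training would stop."""
    best = float("inf")
    bad_epochs = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            bad_epochs = 0          # improvement: reset the counter
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                return epoch        # no improvement for `patience` epochs
    return len(val_losses) - 1

fake_curve = [1.0, 0.8, 0.7, 0.72, 0.71, 0.73, 0.74, 0.75]
print(early_stopping_epoch(fake_curve))   # stops at epoch 5
```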
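For the PyTorch sessions, a sketch of rewriting the Shallow Neural Network as a torch.nn.Module with an explicit forward method. The layer sizes and class name are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ShallowNet(nn.Module):
    def __init__(self, in_features=4, hidden=8, out_features=1):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)
        self.fc2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        # One hidden layer with a non-linearity to break linearity
        return self.fc2(torch.tanh(self.fc1(x)))

model = ShallowNet()
x = torch.randn(32, 4)          # a batch of 32 samples
print(model(x).shape)           # torch.Size([32, 1])
```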
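For the dataloader session, a hedged sketch of a custom Dataset wrapped in a DataLoader; the in-memory random tensors stand in for real project data.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ArrayDataset(Dataset):
    """Wrap feature/target tensors so a DataLoader can batch them."""
    def __init__(self, features, targets):
        assert len(features) == len(targets)
        self.features, self.targets = features, targets

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        return self.features[idx], self.targets[idx]

X = torch.randn(100, 4)
y = torch.randint(0, 3, (100,))             # 3-class toy labels
loader = DataLoader(ArrayDataset(X, y), batch_size=16, shuffle=True)

for xb, yb in loader:
    print(xb.shape, yb.shape)               # torch.Size([16, 4]) torch.Size([16])
    break
```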
Mode of assessment
- Class participation, 10%
- Lab sessions, 60%
- Quizzes, 30%