I am collecting here a series of (guided) exercises, implemented as Jupyter notebooks, that explore several topics in deep learning, from automatic differentiation to explainability. This is a long-term work in progress, but most notebooks can already be used in standalone form.

<aside> 📌

Most exercises are relatively simple, so I suggest disabling any code autocomplete features before starting. 😅 In Google Colab: Settings >> AI Assistance >> disable inline completions. You can easily turn them back on if you want suggestions or solutions.

</aside>

Basic notebooks

All notebooks in this category are assumed as essential prerequisites for the rest, so they are not repeated in the Preliminaries columns below.

| ID | Name | Libraries | Preliminaries |
| --- | --- | --- | --- |
| Py01 | Introduction to Python | - | - |
| NP01 | Introduction to NumPy | NumPy | - |
| NP02 | Linear regression in NumPy | NumPy | NP01 |
| NP03 | k-NN in NumPy | NumPy | NP01 |
| SK01 | Introduction to scikit-learn | scikit-learn | |
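To give a flavor of these basic notebooks, the sketch below fits a least-squares linear regression in NumPy on synthetic data, roughly the kind of exercise NP02 walks through (an illustrative sketch, not the notebook's actual code).

```python
import numpy as np

# Synthetic data: y ≈ 2x + 1 plus noise (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=100)

# Append a bias column and solve the least-squares problem in closed form.
X_b = np.hstack([X, np.ones((100, 1))])
w, *_ = np.linalg.lstsq(X_b, y, rcond=None)
print(w)  # approximately [2.0, 1.0]
```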

PyTorch

These are didactic notebooks for learning PyTorch.

| ID | Name | Libraries | Preliminaries |
| --- | --- | --- | --- |
| PT01 | Introduction to PyTorch | PyTorch | - |
| PT02 | Logistic regression from scratch | PyTorch | PT01 |
| PT03 | Convolutional networks from scratch | PyTorch | PT01 |
| PT04 | Building a Vision Transformer | PyTorch, PyTorch Lightning | PT01 |
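As a rough preview of what PT02 builds, here is a minimal logistic-regression training loop in plain PyTorch on toy data (a hedged sketch; the notebook's own implementation may differ).

```python
import torch

# Toy binary classification data (illustrative only).
torch.manual_seed(0)
X = torch.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 0).float()

# Logistic regression "from scratch": raw parameters and an explicit loop.
w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([w, b], lr=0.1)

for step in range(100):
    logits = X @ w + b
    loss = torch.nn.functional.binary_cross_entropy_with_logits(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(loss.item())  # loss decreases as the separating line is learned
```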

JAX

These are roughly equivalent to the PyTorch ones, but in JAX (and corresponding high-level libraries, such as Equinox).

| ID | Name | Libraries | Preliminaries |
| --- | --- | --- | --- |
| JAX01 | Logistic regression in JAX | JAX | - |
| JAX02 | CNNs in Keras and JAX | Keras, JAX | JAX01 |
| JAX03 | Introduction to Equinox | Equinox, JAX | JAX01 |
| JAX04 | Text classification in Equinox | Equinox, Optax, JAX | JAX01, JAX02 |
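For comparison with the PyTorch version, the sketch below trains the same kind of logistic regression with `jax.grad` and a manual parameter update, in the spirit of JAX01 (again an illustrative sketch, not the notebook's code).

```python
import jax
import jax.numpy as jnp

# Toy binary classification data (illustrative only).
key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(jnp.float32)

def loss_fn(params, X, y):
    logits = X @ params["w"] + params["b"]
    # Numerically stable binary cross-entropy with logits.
    return jnp.mean(jnp.maximum(logits, 0) - logits * y
                    + jnp.log1p(jnp.exp(-jnp.abs(logits))))

params = {"w": jnp.zeros(2), "b": jnp.array(0.0)}
grad_fn = jax.jit(jax.grad(loss_fn))

# Plain gradient descent on the parameter pytree.
for _ in range(100):
    grads = grad_fn(params, X, y)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)

print(loss_fn(params, X, y))
```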

Automatic differentiation

| ID | Name | Libraries | Preliminaries |
| --- | --- | --- | --- |
| AD01 | Operator overloading | NumPy | PT02 |
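To illustrate what "operator overloading" means in this context, the sketch below implements a tiny reverse-mode autodiff node in pure Python by overloading `__add__` and `__mul__`. It is only a conceptual example of the technique AD01 explores, not the notebook's implementation.

```python
class Scalar:
    """Minimal reverse-mode autodiff node built via operator overloading (sketch)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # pairs of (parent node, local gradient)

    def __add__(self, other):
        return Scalar(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Scalar(self.value * other.value,
                      [(self, other.value), (other, self.value)])

    def backward(self, upstream=1.0):
        # Accumulate the gradient and propagate it along every path to the inputs.
        self.grad += upstream
        for node, local in self._parents:
            node.backward(upstream * local)

x = Scalar(2.0)
y = Scalar(3.0)
z = x * y + x          # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```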

Post-training

These notebooks cover methods that act on one or more pre-trained models.

| ID | Name | Libraries | Preliminaries |
| --- | --- | --- | --- |
| M01 | Model merging | PyTorch | PT04 |
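As one concrete (and assumed) example of a post-training method, the sketch below merges PyTorch models of identical architecture by averaging their parameters, the simplest form of model merging; M01 may cover more sophisticated approaches.

```python
import torch

def average_state_dicts(state_dicts, weights=None):
    """Uniform (or weighted) parameter averaging of models sharing one architecture."""
    if weights is None:
        weights = [1.0 / len(state_dicts)] * len(state_dicts)
    return {
        key: sum(w * sd[key] for w, sd in zip(weights, state_dicts))
        for key in state_dicts[0]
    }

# Usage with two toy models of the same architecture (illustrative only):
m1, m2 = torch.nn.Linear(4, 2), torch.nn.Linear(4, 2)
merged = torch.nn.Linear(4, 2)
merged.load_state_dict(average_state_dicts([m1.state_dict(), m2.state_dict()]))
```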

Explainability

| ID | Name | Libraries | Preliminaries |
| --- | --- | --- | --- |
| XAI01 | Saliency maps | PyTorch, Captum | PT04 |
| XAI02 | Data attribution | PyTorch | PT04 |
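To show the core idea behind XAI01, the sketch below computes a vanilla-gradient saliency map with plain PyTorch autograd on a toy CNN; the notebook itself relies on Captum, so treat this only as an illustration.

```python
import torch
import torch.nn as nn

# Toy CNN standing in for a real classifier (illustrative only).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10),
).eval()

image = torch.rand(1, 3, 32, 32, requires_grad=True)  # placeholder input
logits = model(image)
logits[0, logits[0].argmax()].backward()  # gradient of the top-class score

# Saliency map: per-pixel magnitude of that gradient, maxed over channels.
saliency = image.grad.abs().max(dim=1).values  # shape (1, 32, 32)
print(saliency.shape)
```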