Foundation of Deep Learning & LLMs @ CREF

Main Info:

📖 Course topics: Deep Learning, Transformers, and LLMs

👥 Audience: PhD students, postdocs, and permanent researchers @ CREF

⏰ Duration: 16 hours

Course Abstract

The course will begin with a review of the basics of Deep Learning in PyTorch, followed by a more advanced model such as the Variational Autoencoder (VAE). The Transformer architecture and its self-attention mechanism will then be introduced and coded. A small but complete autoregressive generative language model in the style of GPT-2 will be built. This will allow us to understand several relevant aspects of more sophisticated pre-trained LLMs, such as GPT-4, Mistral, or Llama.
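As a preview of the kind of material covered, here is a minimal sketch of the scaled dot-product self-attention at the heart of the Transformer, written in PyTorch. The function name and dimensions are illustrative, not taken from the course materials:

```python
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention (illustrative sketch)."""
    # Project the input sequence into queries, keys, and values
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.size(-1)
    # Attention weights: softmax(Q K^T / sqrt(d_k))
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)
    # Each output token is a weighted mix of all value vectors
    return weights @ v

torch.manual_seed(0)
x = torch.randn(5, 8)                      # 5 tokens, model dimension 8
w_q, w_k, w_v = (torch.randn(8, 8) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # torch.Size([5, 8])
```

Real Transformer blocks add multiple heads, masking for autoregressive generation, and residual connections, which the course builds up to step by step.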

Lecture Material