Large Language Models @ MHPC

This course is part of the Master in High-performance Computing held by SISSA & ICTP (Trieste).

Main Info:

šŸ“– Course topics: Introduction to Large Language Models

šŸ‘„ Audience: Master's and PhD students of the MHPC

ā°Ā Duration: 20 hours

Course Abstract

The course begins with a review of the basics of Deep Learning in PyTorch, followed by a more advanced model, the Variational Autoencoder (VAE). The Transformer architecture and its self-attention mechanism will then be introduced and coded. Building on these components, a small but complete autoregressive generative language model in the style of GPT-2 will be built. This will shed light on several relevant aspects of more sophisticated pre-trained LLMs, such as GPT-4, Mistral, or Llama.
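As a taste of the material, the scaled dot-product self-attention at the heart of the Transformer can be sketched in a few lines. This is an illustrative single-head version in NumPy (the course itself uses PyTorch); the function name, shapes, and random projection matrices are chosen here for the example, not taken from the course material.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    x:          (seq_len, d_model) input embeddings
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices
    """
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq_len, seq_len) pairwise similarities
    # row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V               # each output is a weighted mix of values

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
x = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

An autoregressive model such as GPT-2 additionally masks the score matrix so that each position attends only to earlier positions; that extension is covered in the lectures.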

Lecture Material