Detailed Course Outline
Introduction
- Meet the instructor.
- Create an account at courses.nvidia.com/join.
From Deep Learning to Large Language Models
- Learn how large language models are structured and how to use them:
- Review deep learning and classification-based reasoning, and see how language modeling emerges from them.
- Discuss transformer architectures, interfaces, and intuitions, and how they scale up and adapt to produce state-of-the-art LLM solutions (a minimal attention sketch follows this list).
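Below is a minimal sketch of the scaled dot-product self-attention at the heart of the transformer architecture, assuming PyTorch; the single-head simplification and all names and shapes are illustrative, not any specific library's implementation.

```python
# Minimal single-head self-attention sketch (illustrative names and shapes).
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """x: (batch, seq_len, d_model); w_*: (d_model, d_model) projections."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Scores measure how strongly each token attends to every other token.
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5
    weights = F.softmax(scores, dim=-1)
    return weights @ v  # Weighted sum of value vectors.

x = torch.randn(2, 10, 64)                     # 2 sequences, 10 tokens each
w_q, w_k, w_v = (torch.randn(64, 64) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # torch.Size([2, 10, 64])
```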
Specialized Encoder Models
- Learn how encoder models address different task specifications:
- Explore cutting-edge HuggingFace encoder models.
- Use already-tuned models for interesting tasks such as token classification, sequence classification, span prediction, and zero-shot classification (see the pipeline sketch below).
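As a concrete sketch of the kinds of already-tuned encoder models covered here, the Hugging Face `pipeline` API can load them in a few lines. The checkpoints below are common public ones chosen for illustration and may not be the ones used in the course.

```python
from transformers import pipeline

# Token classification (named-entity recognition) with a fine-tuned BERT.
ner = pipeline("token-classification", model="dslim/bert-base-NER",
               aggregation_strategy="simple")
print(ner("NVIDIA is headquartered in Santa Clara."))

# Zero-shot classification: apply labels the model never saw in training.
zsc = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
print(zsc("The GPU kernel crashed on launch.",
          candidate_labels=["hardware", "software", "billing"]))
```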
Encoder-Decoder Models for Seq2Seq
- Learn how encoder-decoder LLMs forecast unbounded output sequences:
- Introduce a decoder component for autoregressive text generation (sketched in the code after this list).
- Discuss cross-attention for sequence-as-context formulations.
- Discuss general approaches for multi-task, zero-shot reasoning.
- Introduce multimodal formulations for sequences, and explore some examples.
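A minimal sketch of encoder-decoder generation, assuming the Hugging Face `transformers` library; `google/flan-t5-small` is a small public checkpoint chosen for illustration. The encoder consumes the whole input once, and the decoder generates autoregressively, attending to the encoded sequence via cross-attention.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")

# The decoder emits one token at a time, each step cross-attending
# to the encoder's representation of the input sequence.
inputs = tokenizer("Translate English to German: The weather is nice.",
                   return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```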
Decoder Models for Text Generation
- Learn about decoder-only GPT-style models and how they can be specified and used:
- Explore when a decoder-only architecture is a good fit, and discuss issues with the formulation.
- Discuss model size, special deployment techniques, and considerations.
- Pull in some large text-generation models, and see how they work (a minimal sampling sketch follows this list).
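A minimal sampling sketch for a decoder-only model, again assuming Hugging Face `transformers`; `gpt2` stands in for the larger GPT-style checkpoints explored in class.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Large language models are", return_tensors="pt")
# Causal attention: each new token is predicted from the tokens before it.
output_ids = model.generate(**inputs, max_new_tokens=30,
                            do_sample=True, temperature=0.8,
                            pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```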
Stateful LLMs
- Learn how to elevate language models above stochastic parrots via context injection:
- Show off modern LLM composition techniques for history and state management.
- Discuss retrieval-augmented generation (RAG) for external environment access (see the sketch below).
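A minimal RAG sketch: embed the query and a document store, retrieve the closest documents, and inject them into the prompt as context. `embed` and `llm_generate` are hypothetical stand-ins for whatever embedding model and LLM the course actually uses.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical sentence-embedding function (stand-in)."""
    raise NotImplementedError

def llm_generate(prompt: str) -> str:
    """Hypothetical call to a text-generation LLM (stand-in)."""
    raise NotImplementedError

def rag_answer(query: str, documents: list[str], k: int = 2) -> str:
    # Rank documents by cosine similarity to the query embedding.
    doc_vecs = np.stack([embed(d) for d in documents])
    q_vec = embed(query)
    sims = doc_vecs @ q_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    top = [documents[i] for i in np.argsort(sims)[::-1][:k]]
    # Inject retrieved context into the prompt; the model itself is unchanged.
    context = "\n".join(top)
    return llm_generate(f"Context:\n{context}\n\nQuestion: {query}\nAnswer:")
```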
Assessment and Q&A
- Review key learnings.
- Take a code-based assessment to earn a certificate.