Efficient Large Language Model (LLM) Customization (ELLMC)


Course Overview

Enterprises perform language-related tasks daily—text classification, content generation, sentiment analysis, customer chat support—and want to do so as cost-effectively as possible. Large language models can automate these tasks, and efficient LLM customization techniques can both expand a model's capabilities and reduce the size of the models needed in enterprise applications. In this course, you'll go beyond prompt engineering and learn a variety of techniques for efficiently customizing pretrained LLMs for your specific use cases—without the computationally intensive and expensive process of pretraining your own model or fine-tuning a model's internal weights. Using the NVIDIA NeMo™ service, you'll learn various parameter-efficient fine-tuning methods to customize LLM behavior for your organization.
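To make the idea of parameter-efficient fine-tuning concrete: one widely used technique is low-rank adaptation (LoRA), in which a frozen pretrained weight matrix W is augmented with a trainable low-rank update BA. The sketch below is illustrative only and uses plain NumPy; it is not the course's own code, and the specific method(s) covered in the workshop may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # hidden size d, low rank r (with r << d)

W = rng.normal(size=(d, d))        # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d)) * 0.01  # trainable low-rank factor
B = np.zeros((d, r))                # trainable; starts at zero, so BA = 0

def adapted_forward(x):
    # The effective weight is W + BA; only A and B receive gradient updates.
    return x @ (W + B @ A).T

x = rng.normal(size=(1, d))
# At initialization (B = 0) the adapted model matches the pretrained model.
assert np.allclose(adapted_forward(x), x @ W.T)

# Trainable parameters: 2*d*r for LoRA vs. d*d for full fine-tuning.
print(2 * d * r, "vs", d * d)
```

At realistic model sizes (d in the thousands, r of 8–64) the parameter savings are several orders of magnitude, which is what makes customization with limited data and compute feasible.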

Certifications

Prerequisites

  • Professional experience with the Python programming language.
  • Familiarity with fundamental deep learning topics such as model architecture, training, and inference.
  • Familiarity with a modern Python-based deep learning framework (PyTorch preferred).
  • Familiarity working with out-of-the-box pretrained LLMs.

Course Objectives

By the time you complete this course, you will be able to:

  • Apply parameter-efficient fine-tuning techniques with limited data to accomplish tasks specific to your use cases.
  • Use LLMs to create synthetic data for fine-tuning smaller LLMs to perform a desired task.
  • Drive down model size requirements through a virtuous cycle of synthetic data generation and model customization.
  • Build a generative application composed of multiple customized models, generating the data for and creating each of them throughout the workshop.
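The synthetic-data objectives above follow a common pattern: a large "teacher" LLM produces labeled examples that are then used to fine-tune a smaller "student" model. The sketch below shows only the shape of that loop; `teacher_generate` is a hypothetical stand-in for a real model call (e.g., a request to a hosted LLM service), not an actual API from the course.

```python
import json
import random

def teacher_generate(prompt, seed):
    # Hypothetical placeholder for a large-LLM call; a real implementation
    # would send `prompt` to a hosted model and parse its response.
    random.seed(seed)
    label = random.choice(["positive", "negative"])
    return {"text": f"synthetic review #{seed} with a {label} tone", "label": label}

# Generate a small labeled dataset for fine-tuning a smaller student model.
dataset = [
    teacher_generate("Write a short product review and label its sentiment.", i)
    for i in range(100)
]

# Fine-tuning pipelines commonly consume JSONL: one example per line.
with open("synthetic_sentiment.jsonl", "w") as f:
    for example in dataset:
        f.write(json.dumps(example) + "\n")
```

In the full cycle the course describes, the fine-tuned student model can in turn be evaluated and its failure cases used to prompt the teacher for more targeted synthetic examples.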

Pricing & Delivery Methods

Online Training

Duration
1 day

Price
  • Inquire about price and availability
Classroom Training

Duration
1 day

Price
  • Inquire about price and availability

Schedule

There are currently no scheduled dates for this course.