Rapid Application Development Using Large Language Models (RADLLM)
Starting dates and places
Frankfurt: 10 Mar 2025
Berlin: 1 Sep 2025
Description
Prerequisites
- Introductory deep learning experience; comfort with PyTorch and transfer learning is preferred. Content covered in DLI's Getting Started with Deep Learning or Fundamentals of Deep Learning courses, or similar experience, is sufficient.
- Intermediate Python experience, including object-oriented programming and familiarity with common libraries. Content covered in the Python Tutorial (w3schools.com), or similar experience, is sufficient.
Detailed Course Outline
Introduction
- Meet the instructor.
- Create an account at courses.nvidia.com/join.
From Deep Learning to Large Language Models
- Learn how large language models are structured and how to use them:
- Review deep learning- and class-based reasoning, and see how language modeling falls out of it.
- Discuss transformer architectures, interfaces, and intuitions, as well as how they are scaled up and adapted to build state-of-the-art LLM solutions.
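To make the bridge from deep learning to language modeling concrete, here is a minimal sketch (not part of the course materials), assuming the HuggingFace transformers library; the checkpoint bert-base-uncased is an illustrative choice:

```python
from transformers import pipeline

# Masked language modeling: a pretrained encoder predicts the hidden token
# from bidirectional context, the training task BERT-style models share.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("Language models are trained on large [MASK] corpora."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```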
Specialized Encoder Models
- Learn how encoder-only models address different task specifications:
- Explore cutting-edge HuggingFace encoder models.
- Use already-tuned models for interesting tasks such as token classification, sequence classification, range prediction, and zero-shot classification.
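For illustration, already-tuned encoder models can be used through HuggingFace task pipelines; this is a sketch, and the checkpoint names dslim/bert-base-NER and facebook/bart-large-mnli are assumptions rather than course picks:

```python
from transformers import pipeline

# Token classification (NER) with an already-fine-tuned encoder checkpoint.
ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)
print(ner("NVIDIA is headquartered in Santa Clara."))

# Zero-shot classification: an NLI-tuned model scores arbitrary labels.
zero_shot = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
print(zero_shot(
    "The GPU ran out of memory during training.",
    candidate_labels=["hardware", "cooking", "sports"],
))
```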
Encoder-Decoder Models for Seq2Seq
- Learn about forecasting LLMs for predicting unbounded sequences:
- Introduce a decoder component for autoregressive text generation.
- Discuss cross-attention for sequence-as-context formulations.
- Discuss general approaches for multi-task, zero-shot reasoning.
- Introduce multimodal formulation for sequences, and explore some examples.
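A minimal encoder-decoder sketch, again assuming HuggingFace transformers with the illustrative t5-small checkpoint; it shows the decoder generating autoregressively while cross-attending to the encoded input:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# The encoder reads the whole input; the decoder then generates token by
# token, attending to the encoded sequence through cross-attention.
inputs = tokenizer(
    "translate English to German: The weather is nice today.",
    return_tensors="pt",
)
output_ids = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```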
Decoder Models for Text Generation
- Learn about decoder-only GPT-style models and how they can be specified and used:
- Explore when decoder-only models are a good fit, and discuss issues with the formulation.
- Discuss model size, special deployment techniques, and considerations.
- Pull in some large text-generation models, and see how they work.
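As a small taste of decoder-only generation, here is a sketch assuming HuggingFace transformers; gpt2 is an illustrative checkpoint, far smaller than the models the module covers:

```python
from transformers import pipeline, set_seed

set_seed(0)  # make the sampled continuation reproducible
generator = pipeline("text-generation", model="gpt2")

# A decoder-only model extends the prompt one token at a time, each step
# conditioning only on the tokens to its left (causal attention).
result = generator(
    "Large language models are",
    max_new_tokens=30,
    do_sample=True,
    top_p=0.9,
)
print(result[0]["generated_text"])
```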
Stateful LLMs
- Learn how to elevate language models above stochastic parrots via context injection:
- Show off modern LLM composition techniques for history and state management.
- Discuss retrieval-augmented generation (RAG) for external environment access.
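The gist of context injection and RAG can be sketched without any model at all; the word-overlap retrieve and the build_prompt helper below are hypothetical stand-ins for a real vector store and an actual LLM call:

```python
# Toy retrieval-augmented generation: pick the passage most relevant to the
# question and inject it, plus conversation history, into the prompt.
documents = [
    "The course covers encoder, encoder-decoder, and decoder-only models.",
    "Retrieval-augmented generation injects external documents as context.",
    "Attendees take a code-based assessment to earn a certificate.",
]

def retrieve(question: str) -> str:
    """Return the document sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(documents, key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(question: str, history: list[str]) -> str:
    """Compose retrieved context and prior turns into a single LLM prompt."""
    context = retrieve(question)
    turns = "\n".join(history)
    return f"Context: {context}\n{turns}\nUser: {question}\nAssistant:"

print(build_prompt("How do I earn a certificate?", history=[]))
```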
Assessment and Q&A
- Review key learnings.
- Take a code-based assessment to earn a certificate.