Course Overview
In this course, you will learn about the challenges that arise when productionizing generative AI-powered applications compared with traditional ML. You will learn how to manage experimentation and tuning of your LLMs, and how to deploy, test, and maintain your LLM-powered applications. Finally, you will explore best practices for logging and monitoring LLM-powered applications in production.
Assessment Methods
- Pre-training knowledge check quiz (if applicable)
- Formative assessments during the training, through hands-on lab work at the end of each module, multiple-choice quizzes, scenario-based exercises, etc.
- Completion by each participant of a questionnaire and/or a skills-positioning questionnaire before and after the training to validate the acquisition of competencies
Who should attend
Developers and machine learning engineers who want to operationalize generative AI-based applications.
Prerequisites
Completion of Introduction to Developer Efficiency with Gemini on Google Cloud (IDEGC) or equivalent knowledge.
Course Objectives
- Describe the challenges in productionizing applications using generative AI.
- Manage experimentation and evaluation for LLM-powered applications.
- Productionize LLM-powered applications.
- Implement logging and monitoring for LLM-powered applications.
Teaching Methods