From Prompt Engineering to Prompt Optimization in Production LLM Systems
Author: deepsense
Uploaded: 2026-02-03
Views: 17
Description:
In this AI Tech Experts Webinar, Julia May (ML Engineer) explains why prompts should be treated as hyperparameters and how automated prompt optimization can improve LLM performance in production systems.
The talk compares manual vs automatic prompt engineering, then walks through three popular data-driven optimization approaches inspired by recent research:
📌 using LLMs as prompt engineers,
📌 iterative prompt editing and search,
📌 meta-prompting, where LLMs act as general optimizers.
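The iterative editing-and-search idea above can be sketched as a simple greedy loop: propose a variant of the current prompt, score it on a small dataset, and keep it only if the score improves. This is a minimal illustration, not code from the talk; `propose_variant` and `score` are hypothetical stand-ins (in practice the proposer would be an LLM call and the score a real task metric).

```python
import random

def propose_variant(prompt: str, rng: random.Random) -> str:
    # Hypothetical proposer: in a real system an LLM would rewrite the prompt.
    # Here we append one of a few fixed instruction edits.
    edits = [" Be concise.", " Answer step by step.", " Use only the given context."]
    return prompt + rng.choice(edits)

def score(prompt: str, dataset) -> float:
    # Hypothetical metric: fraction of examples whose expected keyword
    # appears in the prompt. A real metric would run the model and
    # compare its outputs against gold answers.
    return sum(kw in prompt for _, kw in dataset) / len(dataset)

def optimize(seed_prompt: str, dataset, steps: int = 10, rng=None):
    """Greedy hill climbing over prompt edits: keep a candidate only if it scores higher."""
    rng = rng or random.Random(0)
    best, best_score = seed_prompt, score(seed_prompt, dataset)
    for _ in range(steps):
        candidate = propose_variant(best, rng)
        s = score(candidate, dataset)
        if s > best_score:
            best, best_score = candidate, s
    return best, best_score
```

The meta-prompting variant replaces `propose_variant` with a model that sees the history of (prompt, score) pairs and proposes the next candidate itself.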
Julia then moves to practical prompt optimization, showing why prompts should be optimized like ML models — with datasets, metrics, versioning and evaluation. She discusses common failure modes, explains why DSPy may not fit every use case, and demonstrates how coding tools like Cursor can be used for fast, practical prompt optimization with measurable gains.
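Treating prompts like ML models, as described above, means pairing each prompt version with a dataset and a metric so changes can be compared rather than eyeballed. A minimal sketch of such an evaluation harness (the names `PromptVersion`, `evaluate`, and the `model_fn` callable are illustrative assumptions, not an API from the webinar or from DSPy):

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass(frozen=True)
class PromptVersion:
    version: str    # e.g. "v3" — versioning lets regressions be traced
    template: str   # prompt template with a {question} placeholder

def exact_match(pred: str, gold: str) -> float:
    # Simple illustrative metric; production systems often use task-specific scores.
    return float(pred.strip().lower() == gold.strip().lower())

def evaluate(prompt: PromptVersion,
             dataset: List[Tuple[str, str]],
             model_fn: Callable[[str], str],
             metric: Callable[[str, str], float] = exact_match) -> float:
    """Score one prompt version over a labeled dataset using the given model and metric."""
    scores = [metric(model_fn(prompt.template.format(question=q)), gold)
              for q, gold in dataset]
    return sum(scores) / len(scores)
```

With this in place, comparing two prompt versions is just two `evaluate` calls on the same dataset, which is what makes "measurable gains" possible.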
If you have questions for Julia, feel free to ask them in the comments and continue the discussion there 💬
01:40 Manual vs automatic prompt engineering
04:44 Prompt optimization techniques
10:53 Practical prompt optimization workflow
13:38 Failure modes and how to fix them
17:40 DSPy tradeoffs + Cursor demo and results
Check our website: https://deepsense.ai/
Linkedin: / applied-ai-insider
#PromptEngineering #PromptOptimization #LLMs #AIEngineering
#MachineLearning #MLOps #NLP #ProductionAI
#ScalableAI #LLMSystems