Run AI Simulated People Locally with Ollama! (Qwen2.5 & TinyTroupe)
Author: Bijan Bowen
Uploaded: 2024-11-20
Views: 2632
Description:
Timestamps:
00:00 Intro
00:30 Demo
02:55 Technical Demo
07:59 Changing Personality
10:52 Technical Speak
12:38 Functioning Demo
15:30 Outro
🎥 TinyTroupe Fork: Running Local LLMs with Ollama Integration! 🌟
Welcome to this walkthrough of the TinyTroupe repo fork, revamped to integrate Ollama and leverage powerful local LLMs! 🚀
🔍 What You'll Learn in This Video:
How to adapt the TinyTroupe library to work with Ollama's API.
Key modifications made to support local LLMs alongside traditional cloud-based solutions.
Step-by-step guide to updating scripts like utils.py and openai_utils.py for seamless compatibility.
Testing conversational simulations with custom prompts and multi-shot chaining.
Tips and tricks to maximize performance with local AI models for cost-efficient, privacy-focused solutions.
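To give a flavor of the kind of change involved, here is a minimal sketch of pointing chat calls at a local Ollama server instead of the OpenAI cloud API. This is not the fork's actual code; the endpoint URL, model name, and helper function are illustrative assumptions based on Ollama's standard local API.

```python
# Hedged sketch: building a request for Ollama's local /api/chat endpoint
# from OpenAI-style messages. Endpoint, model, and function names are
# assumptions for illustration, not the fork's actual implementation.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local address

def build_ollama_request(messages, model="qwen2.5", temperature=0.7):
    """Translate OpenAI-style messages (list of {"role", "content"} dicts)
    into the JSON body Ollama's /api/chat endpoint expects."""
    return {
        "model": model,
        "messages": messages,
        "stream": False,  # ask for one complete response instead of a stream
        "options": {"temperature": temperature},
    }

if __name__ == "__main__":
    # Actually sending the request requires a running Ollama server,
    # so the network call is left commented out here.
    body = build_ollama_request(
        [{"role": "user", "content": "Introduce yourself in one sentence."}]
    )
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    # with urllib.request.urlopen(req) as resp:
    #     print(json.loads(resp.read())["message"]["content"])
```

Because Ollama listens on localhost, the same message format used for cloud calls can be reused locally, which keeps conversations private and avoids per-token costs.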
💬 Questions or Ideas? Drop a comment below! Let us know how you’re using TinyTroupe and local LLMs in your projects.