Run Magistral Locally on Ollama - Agentic Tool for code generation and specific tasks
Author: roseindiatutorials
Uploaded: 2025-06-11
Views: 100
Description:
In this video we are going to learn how to run Magistral, Mistral AI's reasoning model, on Ollama and then use it to generate Python code. I have used the following prompt (a minimal Python call for sending it to the model is sketched right after it):
Write a Python program that displays multiple balls bouncing inside multiple spinning polygons, with collisions against the borders. Each ball should have a different color, and each polygon should have a different number of sides. The balls should be affected by gravity and friction.
Generate me pip install commands to install all required Python libraries to run the program.
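To reproduce this from Python rather than the Ollama CLI, a minimal sketch using the ollama Python client is shown below. It assumes the ollama package is installed (pip install ollama), the Ollama server is running locally, and the model has been pulled under the tag "magistral"; adjust the tag to whatever name you actually pulled.

import ollama

PROMPT = (
    "Write a Python program that displays multiple balls bouncing inside multiple "
    "spinning polygons, with collisions against the borders. Each ball should have a "
    "different color, and each polygon should have a different number of sides. The "
    "balls should be affected by gravity and friction.\n"
    "Generate me pip install commands to install all required Python libraries to run the program."
)

# The model tag "magistral" is an assumption; use the tag you pulled with `ollama pull`.
response = ollama.chat(
    model="magistral",
    messages=[{"role": "user", "content": PROMPT}],
)
print(response["message"]["content"])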
Share your results with us in the comments section.
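For comparison while you test, here is a rough hand-written sketch (not Magistral's output) of the kind of program the prompt describes. It assumes pygame is installed (pip install pygame) and simplifies the physics: the spinning walls reflect the balls with a restitution factor but do not impart any tangential velocity.

import math
import random
import pygame

WIDTH, HEIGHT = 960, 540
GRAVITY = 0.25       # constant downward acceleration per frame
FRICTION = 0.999     # simple velocity damping per frame
RESTITUTION = 0.9    # fraction of normal velocity kept after a bounce

class Polygon:
    def __init__(self, center, radius, sides, spin):
        self.center, self.radius, self.sides, self.spin = center, radius, sides, spin
        self.angle = 0.0

    def vertices(self):
        cx, cy = self.center
        return [(cx + self.radius * math.cos(self.angle + 2 * math.pi * i / self.sides),
                 cy + self.radius * math.sin(self.angle + 2 * math.pi * i / self.sides))
                for i in range(self.sides)]

class Ball:
    def __init__(self, poly):
        self.poly = poly
        self.color = (random.randint(60, 255), random.randint(60, 255), random.randint(60, 255))
        self.r = 10
        cx, cy = poly.center
        self.x, self.y = cx + random.uniform(-30, 30), cy + random.uniform(-30, 30)
        self.vx, self.vy = random.uniform(-3, 3), random.uniform(-3, 3)

    def update(self):
        self.vy += GRAVITY
        self.vx *= FRICTION
        self.vy *= FRICTION
        self.x += self.vx
        self.y += self.vy
        verts = self.poly.vertices()
        cx, cy = self.poly.center
        for i in range(len(verts)):
            ax, ay = verts[i]
            bx, by = verts[(i + 1) % len(verts)]
            # Inward normal of this edge (for a convex polygon it points toward the centre).
            nx, ny = -(by - ay), bx - ax
            length = math.hypot(nx, ny)
            nx, ny = nx / length, ny / length
            if (cx - ax) * nx + (cy - ay) * ny < 0:
                nx, ny = -nx, -ny
            dist = (self.x - ax) * nx + (self.y - ay) * ny
            if dist < self.r:
                # Push the ball back inside the edge and reflect its velocity.
                self.x += (self.r - dist) * nx
                self.y += (self.r - dist) * ny
                vn = self.vx * nx + self.vy * ny
                if vn < 0:
                    self.vx -= (1 + RESTITUTION) * vn * nx
                    self.vy -= (1 + RESTITUTION) * vn * ny

def main():
    pygame.init()
    screen = pygame.display.set_mode((WIDTH, HEIGHT))
    clock = pygame.time.Clock()
    polygons = [Polygon((250, 270), 180, 5, 0.01), Polygon((700, 270), 180, 7, -0.008)]
    balls = [Ball(p) for p in polygons for _ in range(4)]
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        for p in polygons:
            p.angle += p.spin
        for b in balls:
            b.update()
        screen.fill((20, 20, 30))
        for p in polygons:
            pygame.draw.polygon(screen, (200, 200, 200), p.vertices(), 2)
        for b in balls:
            pygame.draw.circle(screen, b.color, (int(b.x), int(b.y)), b.r)
        pygame.display.flip()
        clock.tick(60)
    pygame.quit()

if __name__ == "__main__":
    main()

Save it to a file, run it with Python, and compare the behaviour with what Magistral generates for you.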
Check more tutorials at https://www.roseindia.net
#Magistral #agenticmodel #agenticai #ollama