How to Run Any Open-Source LLM Locally Using Ollama + Docker | Ollama Local API (TinyLlama) | Easy
Author: Kode Karbon
Uploaded: 2024-07-30
Views: 3234
Description:
#ollama #dockertutorial #stepbystep
Welcome to Kode Karbon! In this tutorial, we'll dive into the world of running Large Language Models (LLMs) locally using #Docker and #Ollama. Perfect for those looking to develop, prototype, or test LLMs efficiently on their local machine.
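Once the Ollama container is up (the image on Docker Hub is `ollama/ollama`, typically started with `docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama`, after which a model is pulled with `docker exec -it ollama ollama pull tinyllama`), it exposes a REST API on port 11434. The sketch below shows one hedged way to query that local API from Python; the helper names (`build_payload`, `generate`) are illustrative, not part of Ollama itself.

```python
# Sketch: querying a local Ollama server's REST API from Python.
# Assumes Ollama is already running in Docker with the tinyllama model pulled,
# e.g.:
#   docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
#   docker exec -it ollama ollama pull tinyllama
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port


def build_payload(prompt, model="tinyllama"):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt, model="tinyllama", url=OLLAMA_URL):
    """POST a prompt to the local Ollama API and return the response text."""
    body = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{url}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama container):
#   print(generate("Why is the sky blue?"))
```

Using the plain HTTP API keeps the example dependency-free; any HTTP client (curl, requests) works the same way against `http://localhost:11434/api/generate`.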
This is a very basic tutorial.
Part 2: Run Ollama Models in Google Colab
GitHub Link:
Time Stamps:
00:00 Intro Part 1
----------------------------------------------------------------------------------------------------------------------------------------------------
If you enjoyed this video, be sure to press the 👍 button so that I know what content you guys like to see.
----------------------------------------------------------------------------------------------------------------------------------------------------
Do you want to learn 📚 Machine Learning technology from me? Then Subscribe 🔴 and keep an eye on the channel for more in-depth tutorial videos.
----------------------------------------------------------------------------------------------------------------------------------------------------
#️⃣ Social Media #️⃣
📞 Connect with Me:
🌍 My Website: https://thoufeekx.vercel.app/
🤖 GitHub: https://github.com/thoufeekx/
👉 LinkedIn: / thoufeekx
📷 Instagram: @kodekarbon
✖️ Twitter: https://
Tags:
#stepbystep #docker #ollama #llms #localdevelopment #dockercontainer #dockerhub #languagemodels #machinelearning #kodekarbon #tutorial