How to run DeepSeek locally with Ollama
Author: House of Logic blog
Uploaded: 2025-02-19
Views: 144
Description:
How to Set Up DeepSeek Locally with Ollama
This video shows how to set up DeepSeek-R1, specifically the 1.5b model, in a local self-hosted environment, using Ubuntu running on a virtual machine.
The same principles apply to the full DeepSeek-R1 671b model, but at over 400 GB it was too large for the VM being used.
Note: The performance seen is not ideal, but this was an experiment to see whether DeepSeek can run in a local virtualisation environment and what the setup process looks like, as it will be the same on better-specified hardware.
Part 2 will cover the Open-WebUI that can be used to query the AI.
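The install and model-pull steps demonstrated in the video can be sketched roughly as below. This is a sketch assuming the official Ollama install script and the `deepseek-r1:1.5b` model tag; as always, review a piped install script before running it.

```shell
# Install Ollama on Ubuntu via the official install script
# (skipped if ollama is already on the PATH).
if ! command -v ollama >/dev/null 2>&1; then
  curl -fsSL https://ollama.com/install.sh | sh
fi

# Pull the small distilled DeepSeek-R1 model used in the video.
ollama pull deepseek-r1:1.5b
```

Once pulled, the model can be tried interactively with `ollama run deepseek-r1:1.5b`, or queried over the local HTTP API as shown later in the video.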
Chapters in this video:
00:00 Intro
00:49 Start of Setup
01:27 Ollama installation complete
02:10 Testing Ollama with curl
02:32 Configuring Ollama to run as a service
03:43 Testing Ollama is running with a browser
04:06 Pulling down the DeepSeek model
05:03 Sending a prompt to the DeepSeek model in Ollama with Postman
06:05 Performance of VM during execution
06:36 Response from DeepSeek AI
06:54 Outro
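The curl test and the Postman prompt in the chapters above boil down to a single HTTP POST against Ollama's local API. The sketch below assumes Ollama's default port 11434 and the `deepseek-r1:1.5b` tag; the network calls are commented out so the snippet is safe to run without a live server.

```shell
# JSON body for Ollama's /api/generate endpoint; "stream": false asks
# for one complete response instead of a token-by-token stream.
PAYLOAD='{"model": "deepseek-r1:1.5b", "prompt": "Why is the sky blue?", "stream": false}'
echo "$PAYLOAD"

# Quick liveness check, then the actual prompt (uncomment with Ollama running):
# curl -s http://localhost:11434/                        # expect "Ollama is running"
# curl -s http://localhost:11434/api/generate -d "$PAYLOAD"
```

The same payload can be pasted into Postman as a raw JSON body on a POST request, which is what the video does at 05:03.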
Useful Links:
https://github.com/HouseOfLogicGH/Oll...
Follow @HouseofLogicBlog or visit https://www.houseoflogic.co.uk for more tutorials and articles.