Apple's Xcode 26.3: Run Local Models in Claude Agent
Author: Rudrank Riyam
Uploaded: 2026-02-04
Views: 416
Description:
In this video I show you how to point the Claude Code instance in Xcode to a local model running via LM Studio.
I walk through the full setup: configuring LM Studio's Anthropic-compatible endpoint, setting the minimum required context length, and editing the settings.json file under Library/Developer/Xcode/CodingAssistant/ClaudeAgentConfig.
I test it offline with OpenAI's GPT OSS 20B model. Spoiler: it works, but it's slow on my M5. If you have a maxed-out M4 Max with 128GB of RAM, you'll have a much better experience. Still, it's a proof of concept for token maxing on a plane with no WiFi.
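For reference, the settings.json edit might look roughly like this. This is a hedged sketch, not the exact file from the video: the port assumes LM Studio's default local server (1234), the token value is a placeholder (local servers typically accept any string), and the model identifier is an assumption that must match the name LM Studio reports for your loaded model. The file lives at ~/Library/Developer/Xcode/CodingAssistant/ClaudeAgentConfig/settings.json.

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://localhost:1234",
    "ANTHROPIC_AUTH_TOKEN": "lm-studio",
    "ANTHROPIC_MODEL": "openai/gpt-oss-20b"
  }
}
```

With this in place and LM Studio's server running, Xcode's Claude Agent sends its requests to the local endpoint instead of Anthropic's API, which is why the setup keeps working with WiFi turned off.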
---
Want to build AI-powered iOS apps? Check out my books:
Exploring AI for iOS Development: https://academy.rudrank.com/product/ai
Exploring AI-Assisted Coding for iOS Development: https://academy.rudrank.com/product/a...
---
Support my work by checking out these great apps (Affiliate Links):
Astro ASO Tool: https://tryastro.app/?aff=6NqZ6
Screen Studio: https://screen.studio/?aff=6NqZ6
---
Follow my journey: https://x.com/rudrankriyam
---
*Timestamps:*
00:00 - Using local models in Xcode 26.3's Claude Code
00:22 - Why LM Studio (Anthropic-compatible endpoint)
00:41 - Use case: coding on a plane with no internet
01:19 - Model options: GPT OSS 20B vs GLM 4.7 Flash
02:20 - Setting context length to 131k tokens
02:44 - Configuring settings.json (base URL, token, model)
03:22 - Verifying Xcode is not signed into Anthropic
03:46 - Testing offline with internet turned off
04:04 - First prompt: watching MCP tools load
05:01 - Response: "Hello, how can I help you"
05:28 - Second prompt: audit current file
06:22 - Reality check: slow on current hardware
07:25 - Future potential with better Macs and models
08:03 - Wrap up
#Xcode #LocalLLM #LMStudio #ClaudeCode #iOSDev #OfflineCoding
---