Hugging Face Releases SmolLM3: A 3B Long-Context, Multilingual Reasoning Model
Author: Marktechpost AI
Uploaded: 2025-07-08
Views: 347
Description:
Hugging Face has introduced SmolLM3, a compact yet powerful 3B-parameter decoder-only language model trained on 11.2 trillion tokens. Designed for efficiency, it supports dual-mode reasoning ("think" and "no_think"), multilingual generation across six languages, and long-context processing up to 128K tokens via YaRN scaling. It offers strong tool-calling capabilities, openly documented engineering, and compatibility with vLLM, ONNX, and GGUF formats; on reasoning and real-world task benchmarks it outperforms comparable 3B models and rivals some 4B models. It is optimized for RAG systems, agent frameworks, and on-device deployment, and is released under an Apache 2.0 license.
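For readers who want to try the dual-mode reasoning described above, here is a minimal sketch using the Hugging Face transformers library. The model id and the "/no_think" system tag are assumptions for illustration and should be verified against the official model cards linked below.

```python
# Minimal sketch: load SmolLM3 and disable its "think" mode via the system prompt.
# NOTE: the model id and the "/no_think" tag are assumptions; check the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM3-3B"  # assumed instruct checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "/no_think"},  # assumed tag to skip the reasoning trace
    {"role": "user", "content": "Summarize the benefits of long-context models."},
]

# Build the chat-formatted prompt and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Switching the system tag (or omitting it) would re-enable the extended "think" trace, which is the mode the benchmarks above highlight for harder reasoning tasks.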
Full Analysis: https://www.marktechpost.com/2025/07/...
SmolLM3-3B-Base: https://huggingface.co/HuggingFaceTB/...
SmolLM3-3B-Instruct: https://huggingface.co/HuggingFaceTB/...
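Since the model is advertised as vLLM-compatible, the following is a short hypothetical serving sketch with vLLM's offline API; the model id and sampling settings are illustrative assumptions, not values from the release.

```python
# Hypothetical sketch: batch generation with vLLM (model id assumed).
from vllm import LLM, SamplingParams

llm = LLM(model="HuggingFaceTB/SmolLM3-3B")
params = SamplingParams(temperature=0.6, max_tokens=256)

outputs = llm.generate(["Explain YaRN context scaling in one paragraph."], params)
print(outputs[0].outputs[0].text)
```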
If you like this video, please subscribe to our AI Newsletter: https://www.airesearchinsights.com/
@HuggingFace #artificialintelligence #opensource #ai #languagemodels