Why AI Needs More Inference Compute: Introducing The Cerebras Scaling Law by Sean Lie, CTO Cerebras
Author: Cerebras
Uploaded: 2025-05-20
Views: 35,965
Description:
Cerebras co-founder and CTO Sean Lie explains the next frontier in AI: inference-time compute. As pretraining scaling hits its limits due to data scarcity and unsustainable compute costs, the focus is shifting to inference. Learn how models like Qwen3 show that more tokens at inference means more intelligence, and why this marks a new era in AI development.
Watch for GPQA benchmarks, real-world implications, and a fresh perspective on the future of AI scaling.