The Post-Transformer Era: AI's Next Frontier | NYU x Pathway
Author: Pathway (pathway.com)
Uploaded: 2026-02-07
Views: 1351
Description:
What comes after Transformers?
The Transformer architecture behind GPT has dominated AI for nearly a decade. But cracks are showing. Transformer-based models cannot learn continuously (they are frozen in time, like Groundhog Day), have limited context windows, and incur compute costs that spiral as reasoning chains grow longer.
What would it take for the next generation of frontier models to do long-horizon reasoning? To learn continuously? To generalize from experience? Join leading researchers from NYU Tandon and Pathway as they explore where AI is headed, from the algorithmic foundations to emerging architectures challenging the Transformer's dominance.
🔒 This video link is shared with registered attendees. Questions can be submitted in advance or asked live during the session.
👤 SPEAKERS
Martín Farach-Colton — Chair, CSE, NYU Tandon | ACM/IEEE/SIAM Fellow
Julian Togelius — Professor, NYU Tandon | Head of AI, Nof1 | Author of "Artificial General Intelligence" (MIT Press)
Adrian Kosowski — CSO & Co-Founder, Pathway | PhD at 20, 100+ papers
Zuzanna Stamirowska (Moderator) — CEO & Co-Founder, Pathway | PhD in Complexity Science
📌 TOPICS COVERED
– Why Transformer-based models hit a wall: memory limitations that prevent long-context reasoning, rapid loss of coherence, and scaling limits that persist despite massive investment
– Emerging paradigms: sparse activations, Hebbian plasticity, and brain-inspired architectures
– The path to continual learning and interpretable AI
– Live Q&A with frontier researchers from New York University and Pathway
📅 WHEN
February 6, 2026, 11:00 AM ET / 9:30 PM IST (after class hours)
Organized by lead researchers from NYU Tandon and Pathway, in collaboration with Technical Councils at select IITs.
🏷️ TAGS
#PostTransformer #AI #MachineLearning #NYU #Pathway #FutureOfAI #ContinualLearning #AIResearch