Session 6 – Visualizing Memory in RNNs, GRUs, and LSTMs
Author: Hamid Sadeghi
Uploaded: 2026-01-02
Views: 12
Description:
In this session, we open up the “black box” of RNNs and finally see how they think.
You'll watch hidden states evolve word by word, understand why RNNs forget, and see how GRU and LSTM mitigate the vanishing-gradient problem with gates that remember what matters and erase what doesn't.
We break down (short code sketches follow the list):
How RNN hidden states change at each timestep
Why vanishing gradients make RNNs fail on long sequences
How GRU update/reset gates decide word importance
How LSTM cell state + hidden state solve long‑term memory
Real numeric examples showing exactly what the model keeps, forgets, and updates
A simple sequence (“Hey I need help”) analyzed unit by unit
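To make the first and last bullets concrete, here is a minimal sketch of a vanilla RNN stepping through "Hey I need help" one token at a time. This is not the video's actual code; the embeddings and weights are random placeholders, chosen only so you can watch h_t change at each step.

import numpy as np

rng = np.random.default_rng(0)
tokens = ["Hey", "I", "need", "help"]
d_in, d_h = 4, 3                                  # toy embedding and hidden sizes

emb = {t: rng.normal(size=d_in) for t in tokens}  # stand-in word vectors
W_xh = rng.normal(scale=0.5, size=(d_h, d_in))    # input-to-hidden weights
W_hh = rng.normal(scale=0.5, size=(d_h, d_h))     # hidden-to-hidden weights
b = np.zeros(d_h)

h = np.zeros(d_h)                                 # h_0: empty memory
for step, tok in enumerate(tokens, 1):
    # Core recurrence: the new state mixes the current word with the old state.
    h = np.tanh(W_xh @ emb[tok] + W_hh @ h + b)
    print(f"t={step} token={tok!r} h_t={np.round(h, 3)}")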
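The vanishing-gradient bullet can also be shown numerically. The sketch below (random weights, purely illustrative) backpropagates a unit gradient through many tanh steps. Each step multiplies the gradient by a Jacobian whose norm stays below one when the recurrent weights are small, so the signal reaching early timesteps shrinks geometrically; large weights produce the opposite failure, exploding gradients.

import numpy as np

rng = np.random.default_rng(1)
d_h, T = 3, 20
W_hh = rng.normal(scale=0.2, size=(d_h, d_h))     # small weights: vanishing regime

# Forward pass: collect states h_1..h_T driven by random inputs.
hs = [np.zeros(d_h)]
for _ in range(T):
    hs.append(np.tanh(W_hh @ hs[-1] + rng.normal(scale=0.5, size=d_h)))

# Backward pass: push a unit gradient from h_T back toward h_1.
grad = np.ones(d_h)
for t in range(T, 0, -1):
    # Jacobian of h_t w.r.t. h_{t-1} is diag(1 - h_t**2) @ W_hh.
    grad = W_hh.T @ ((1 - hs[t] ** 2) * grad)
    if t in (20, 15, 10, 5, 1):
        print(f"gradient norm flowing into step {t}: {np.linalg.norm(grad):.6f}")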
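For the GRU bullet, here is a minimal cell sketch using the standard update/reset gate equations. The function name gru_step and all weights are illustrative placeholders, not the video's numbers: the update gate z decides how much of the old state to keep versus rewrite, and the reset gate r decides how much of the past feeds the candidate state.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    z = sigmoid(p["W_z"] @ x + p["U_z"] @ h)              # update gate: keep vs. rewrite
    r = sigmoid(p["W_r"] @ x + p["U_r"] @ h)              # reset gate: how much past to use
    h_tilde = np.tanh(p["W_h"] @ x + p["U_h"] @ (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde                      # gated blend of old and new

rng = np.random.default_rng(2)
d_in, d_h = 4, 3
p = {k: rng.normal(scale=0.5, size=(d_h, d_in if k.startswith("W") else d_h))
     for k in ["W_z", "U_z", "W_r", "U_r", "W_h", "U_h"]}

h = np.zeros(d_h)
for tok in ["Hey", "I", "need", "help"]:
    h = gru_step(rng.normal(size=d_in), h, p)             # random stand-in embeddings
    print(tok, np.round(h, 3))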
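Finally, for the LSTM bullet, a minimal cell sketch showing the two separate memories: the cell state c (long-term storage, edited by the forget and input gates) and the hidden state h (a filtered view exposed by the output gate). Again, the equations are the standard ones and the weights are random placeholders.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, p):
    f = sigmoid(p["W_f"] @ x + p["U_f"] @ h)   # forget gate: what to erase from c
    i = sigmoid(p["W_i"] @ x + p["U_i"] @ h)   # input gate: what to write to c
    o = sigmoid(p["W_o"] @ x + p["U_o"] @ h)   # output gate: what to expose
    g = np.tanh(p["W_g"] @ x + p["U_g"] @ h)   # candidate memory content
    c = f * c + i * g                          # cell state: keep some, add some
    h = o * np.tanh(c)                         # hidden state: filtered view of c
    return h, c

rng = np.random.default_rng(3)
d_in, d_h = 4, 3
p = {k: rng.normal(scale=0.5, size=(d_h, d_in if k.startswith("W") else d_h))
     for k in ["W_f", "U_f", "W_i", "U_i", "W_o", "U_o", "W_g", "U_g"]}

h, c = np.zeros(d_h), np.zeros(d_h)
for tok in ["Hey", "I", "need", "help"]:
    h, c = lstm_step(rng.normal(size=d_in), h, c, p)
    print(tok, "h =", np.round(h, 3), "c =", np.round(c, 3))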
By the end, you'll understand not just what GRU and LSTM do, but why they were invented and how they address the core limitations of vanilla RNNs.
Perfect for beginners, teens, and anyone who wants to truly understand sequence models instead of memorizing diagrams.