Epochs, Iterations and Batch Size | Deep Learning Basics
Author: Galaxy Inferno Codes
Uploaded: 2021-08-26
Views: 58,170
Description:
Epoch, Iteration, Batch Size?? What do these terms mean, and how do they affect the training of neural networks?
I explain all of this in the video, and I also go into how Gradient Descent differs from Stochastic Gradient Descent when training a neural network.
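As a rough illustration of how these terms relate (a minimal sketch, not from the video; the dataset size, batch size, and epoch count are hypothetical values):

```python
import math

# Hypothetical example values (not from the video)
dataset_size = 1000   # number of training samples
batch_size = 32       # samples processed in one iteration

# One iteration = one forward/backward pass over a single batch.
# One epoch = one full pass over the entire dataset.
iterations_per_epoch = math.ceil(dataset_size / batch_size)
print(iterations_per_epoch)   # 32 (the last batch is a partial one)

epochs = 5
total_iterations = epochs * iterations_per_epoch
print(total_iterations)       # 160
```

So the batch size determines how many iterations make up one epoch: larger batches mean fewer (but heavier) iterations per pass over the data.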
TIMESTAMPS:
0:00 Intro & Training Cycle
0:58 Iteration
2:04 Epoch
3:06 Full batch GD
4:27 Mini Batch SGD pros & cons
6:41 Conclusion
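The full-batch GD vs. mini-batch SGD distinction from the timestamps above can be sketched as follows (an illustrative toy example, not code from the video; the 1-D least-squares problem, learning rate, and batch size are all assumptions):

```python
import random

# Toy 1-D least-squares problem: loss = mean((w * x - y)^2),
# with a true weight of 3.0. All values here are illustrative.
random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(100)]
ys = [3.0 * x for x in xs]

def grad(w, batch):
    # d/dw mean((w*x - y)^2) = mean(2 * x * (w*x - y))
    return sum(2 * x * (w * x - y) for x, y in batch) / len(batch)

def full_batch_gd(w=0.0, lr=0.5, epochs=50):
    # Full-batch GD: one gradient step per epoch, computed on ALL samples.
    data = list(zip(xs, ys))
    for _ in range(epochs):
        w -= lr * grad(w, data)
    return w

def minibatch_sgd(w=0.0, lr=0.5, epochs=50, batch_size=10):
    # Mini-batch SGD: many noisy gradient steps per epoch,
    # each computed on a small shuffled slice of the data.
    data = list(zip(xs, ys))
    for _ in range(epochs):
        random.shuffle(data)
        for i in range(0, len(data), batch_size):
            w -= lr * grad(w, data[i:i + batch_size])
    return w

print(round(full_batch_gd(), 3), round(minibatch_sgd(), 3))
```

Both variants recover the true weight on this toy problem, but mini-batch SGD takes many cheap, noisy steps per epoch while full-batch GD takes one exact step, which is the trade-off the video discusses.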
Subscribe for more Deep Learning and Machine Learning content from a Data Science Consultant, and learn along with me :))
---------
You can also find me on Instagram, where I post almost daily:
/ galaxyinferno.codes
And on my blog:
https://galaxyinferno.com/