Python Machine Learning From Scratch Model Parameter Improvements
Author: Stephen Blum
Uploaded: 2025-12-18
Views: 36
Description:
Yesterday, we made a batching algorithm for our deep learning model, and honestly, it worked better than I expected with just a few quick code changes. We used a random shuffle method, similar to stochastic gradient descent, to help the model improve its weights and biases using small batches of inputs and targets. But while the algorithm works pretty well, it is not perfect: the randomness means some data points can get left out or re-used unevenly during training, so it does not always match a true epoch, where every data point is used exactly once per training cycle.
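To make the coverage problem concrete, here is a minimal sketch (the array names and sizes are illustrative, not the project's actual code) showing why independently sampled random batches can miss data points: drawing indices with replacement means nine draws can never cover more than nine distinct samples, and usually cover fewer.

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = np.arange(10).reshape(10, 1)  # toy dataset: 10 samples
targets = np.arange(10)

batch_size = 3
seen = set()
# Random-index batching: each batch samples indices independently,
# so across batches some points repeat and others never appear.
for _ in range(len(inputs) // batch_size):
    idx = rng.integers(0, len(inputs), size=batch_size)
    seen.update(idx.tolist())

# 3 batches of 3 make only 9 draws, so at most 9 of the
# 10 points are visited -- this is not a true epoch.
print(len(seen))
```

Running this shows fewer than ten distinct samples visited, which is exactly the uneven coverage described above.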
An epoch is just one full pass over the whole dataset, and repeating epochs helps the model learn, like practicing something over and over makes you remember it better. Our current method does not always do that because it could miss some points, so it is not a true epoch. To fix this, we want to shuffle the dataset first, then break it into batches, so each data point is used once per epoch, and the batches are different each time.
Also, I want to clean up our code so we stop passing the learning rate into every function, and instead set it once on the model itself as self.learn. We are building all this from scratch with just NumPy, not Keras or any other handy libraries, because doing it by hand really helps you understand how deep learning works. We have made good progress, but we still want to improve the batching to make training smarter and fix how we set the learning rate; with these updates, our hand-built neural network will work even better.
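The self.learn cleanup might look something like this sketch (the class and method names here are assumptions for illustration, not the video's actual code): the learning rate is stored once in the constructor, and the update step reads it from the instance instead of taking it as a parameter.

```python
import numpy as np

class Model:
    def __init__(self, learn=0.01):
        # Store the learning rate once on the model instead of
        # threading it through every training call.
        self.learn = learn
        self.weights = np.zeros(3)

    def step(self, grad):
        # Gradient-descent update using the stored rate.
        self.weights -= self.learn * grad

m = Model(learn=0.1)
m.step(np.ones(3))
print(m.weights)  # each weight moved by -0.1
```

This keeps every call site simpler, and changing the learning rate later means touching one line in the constructor call rather than every function signature.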
Thanks for all the likes and for following on this project.