You are doing full batch gradient descent using the entire training set (not stochastic gradient descent). Is it necessary to shuffle the training data? Justify your answer.
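A quick numerical sketch of the key fact behind this question: the full-batch gradient is a sum (or mean) over all training examples, and a sum is invariant to the order of its terms. The setup below (linear regression with mean squared error, random data, variable names `X`, `y`, `w`) is a hypothetical example chosen for illustration, not part of the original question.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # hypothetical training inputs
y = rng.normal(size=100)        # hypothetical targets
w = rng.normal(size=3)          # current parameter vector

def full_batch_grad(X, y, w):
    # Gradient of (1/n) * ||Xw - y||^2 with respect to w:
    # (2/n) * X^T (Xw - y), a mean over ALL training examples.
    n = X.shape[0]
    return (2.0 / n) * X.T @ (X @ w - y)

g = full_batch_grad(X, y, w)

# Shuffle the training set and recompute: the rows enter the sum in a
# different order, but the sum itself is unchanged.
perm = rng.permutation(len(y))
g_shuffled = full_batch_grad(X[perm], y[perm], w)

print(np.allclose(g, g_shuffled))  # True: same gradient, shuffled or not
```

Because every full-batch update uses the entire training set at once, each step computes this same permutation-invariant quantity, which is the heart of the justification the question asks for.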