
You are doing full batch gradient descent using the entire training set (not stochastic gradient descent). Is it necessary to shuffle the training data?

Posted: Fri Jul 01, 2022 5:39 am
by answerhappygod
You are doing full batch gradient descent using the entire training set (not stochastic gradient descent). Is it necessary to shuffle the training data? Justify your answer.
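
One way to justify the answer: the full-batch gradient is a sum (or mean) over every training example, and a sum does not depend on the order of its terms, so shuffling cannot change any update step. Below is a minimal NumPy sketch of this point, assuming synthetic data and a linear-regression MSE loss purely for illustration:

```python
import numpy as np

# Synthetic data and a linear-regression MSE loss, chosen only for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # 100 examples, 3 features
y = rng.normal(size=100)
w = rng.normal(size=3)          # current parameter vector

def full_batch_gradient(X, y, w):
    """Gradient of mean squared error over the ENTIRE training set.
    It is a mean (a sum scaled by 1/n) over all examples."""
    residuals = X @ w - y
    return X.T @ residuals / len(y)

# Shuffle the training examples, keeping each (x, y) pair aligned.
perm = rng.permutation(len(y))
g_original = full_batch_gradient(X, y, w)
g_shuffled = full_batch_gradient(X[perm], y[perm], w)

# Because summation is commutative, the two gradients agree
# (up to floating-point rounding), so every update step is identical.
print(np.allclose(g_original, g_shuffled))  # True
```

Shuffling matters for stochastic or mini-batch gradient descent, where each step sees only a subset of the data and ordering determines which examples land in which batch; with full-batch descent every step already uses all of the data, so the order is irrelevant.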