Suppose you use Batch Gradient Descent to train a neural network and you plot the training error at every epoch. If you notice that the training error consistently goes up, what is likely going on? How can you fix this?
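Because Batch Gradient Descent computes the gradient over the entire training set, the epoch-to-epoch error curve is not noisy; a *consistent* rise almost always means the learning rate is too high, so each step overshoots the minimum and the parameters diverge. The fix is to reduce the learning rate (and, if the error still rises, check for bugs in the gradient computation or corrupted training data). Below is a minimal NumPy sketch on a synthetic linear-regression problem; the data, the helper name `batch_gd_losses`, and the two learning rates are illustrative choices, not part of the original question.

```python
import numpy as np

def batch_gd_losses(eta, n_epochs=30, seed=42):
    """Run Batch Gradient Descent on a toy linear-regression task
    and return the training MSE recorded after each epoch.
    (Synthetic data and parameter values are illustrative.)"""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(100, 1))
    y = 3.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=100)
    Xb = np.c_[np.ones(len(X)), X]          # prepend a bias column
    theta = np.zeros(2)
    losses = []
    for _ in range(n_epochs):
        err = Xb @ theta - y
        grad = (2.0 / len(Xb)) * Xb.T @ err  # full-batch MSE gradient
        theta -= eta * grad
        losses.append(np.mean((Xb @ theta - y) ** 2))
    return losses

# Too-large step size: the error overshoots and climbs epoch after epoch.
high = batch_gd_losses(eta=1.5)
# Reduced learning rate: the same problem now converges.
low = batch_gd_losses(eta=0.1)
print(high[-1] > high[0], low[-1] < low[0])
```

With `eta=1.5` the update overshoots the optimum by more than it corrects, so the error grows geometrically; shrinking the rate to `0.1` makes the same loop converge, which is exactly the diagnosis-and-fix the question is after.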