
Suppose you use Batch Gradient Descent to train a neural network and you plot the training error at every epoch. If you notice that the training error consistently goes up, what is likely going on? How can you fix this?

Answers (1)
  1. 30 January, 18:01
    The answer: the learning rate is most likely too high, so reduce it.

    Explanation:

    With Batch Gradient Descent the whole training set is used at every step, so the training error should decrease at every epoch as long as the learning rate is appropriate. If the training error consistently goes up instead, the algorithm is diverging because the learning rate is too high, and the fix is to reduce it.

    Plotting the validation error alongside the training error helps confirm the diagnosis. If both errors go up at every epoch, the learning rate is clearly the problem. If instead the validation error rises while the training error keeps falling, the model is overfitting the training set and you should stop training (early stopping).
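    A minimal sketch of what this looks like, using plain NumPy on a hypothetical linear-regression task (the data, the batch_gd helper, and the eta values here are made up for illustration): with a learning rate that is too high the training MSE climbs at every epoch, and simply reducing eta makes it fall again.

        import numpy as np

        rng = np.random.default_rng(42)
        X = np.c_[np.ones((100, 1)), rng.random((100, 1))]     # 100 samples: bias column + 1 feature
        y = 4 + 3 * X[:, 1:] + rng.standard_normal((100, 1))   # noisy linear target

        def batch_gd(eta, n_epochs=10):
            """Full-batch gradient descent; returns the training MSE at each epoch."""
            theta = rng.standard_normal((2, 1))
            errors = []
            for _ in range(n_epochs):
                residual = X @ theta - y
                errors.append(float(np.mean(residual ** 2)))  # training error this epoch
                gradients = (2 / len(X)) * X.T @ residual     # gradient of the MSE
                theta -= eta * gradients                      # one full-batch update
            return errors

        print(batch_gd(eta=5.0))   # too high: the error grows every epoch (divergence)
        print(batch_gd(eta=0.1))   # reduced: the error shrinks every epoch (convergence)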