Back Propagation
An algorithm that looks for the minimum value of the error function in weight space.
Written by Alejandro Bonilla
Updated over a week ago

The backpropagation algorithm searches for the minimum of the error function in weight space using gradient descent, applying the weight updates known as the delta rule. The weights that minimize the error function are then considered a solution to the learning problem.
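The gradient-descent step described above can be sketched on a one-weight model. This is a minimal illustration, not the full backpropagation algorithm for multi-layer networks; the training data (target function y = 2x) and the learning rate are assumptions chosen for the example:

```python
# Illustrative training set: inputs x with targets y = 2x (an assumed example).
data = [(x, 2.0 * x) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]

def error(w):
    # Squared-error function E(w) = 1/2 * sum over the data of (w*x - y)^2.
    return 0.5 * sum((w * x - y) ** 2 for x, y in data)

def train(w=0.0, lr=0.05, epochs=100):
    # Delta rule / gradient descent: w <- w - lr * dE/dw,
    # where dE/dw = sum over the data of (w*x - y) * x.
    for _ in range(epochs):
        grad = sum((w * x - y) * x for x, y in data)
        w -= lr * grad
    return w

w = train()
print(round(w, 3))  # converges toward the minimizing weight, 2.0
```

Each update moves the weight downhill on the error surface, so the error shrinks toward its minimum; in a multi-layer network, backpropagation computes the same kind of gradient for every weight by propagating errors backward through the layers.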