1. What Is Backprop?

Answer

"Backprop" is short for "backpropagation of ERROR". The term backpropagation causes much CONFUSION. Strictly speaking, backpropagation refers to the METHOD for computing the gradient of the case-wise error function with respect to the weights for a feedforward network, a straightforward but elegant application of the chain rule of elementary calculus (Werbos 1974/1994). By extension, backpropagation or backprop refers to a training method that USES backpropagation to compute the gradient. By further extension, a backprop network is a feedforward network trained by backpropagation.

"Backprop" is short for "backpropagation of error". The term backpropagation causes much confusion. Strictly speaking, backpropagation refers to the method for computing the gradient of the case-wise error function with respect to the weights for a feedforward network, a straightforward but elegant application of the chain rule of elementary calculus (Werbos 1974/1994). By extension, backpropagation or backprop refers to a training method that uses backpropagation to compute the gradient. By further extension, a backprop network is a feedforward network trained by backpropagation.


