1.

What is backpropagation and how does it work?

Answer»

Backpropagation, short for "backward propagation of errors," is an algorithm for the supervised learning of artificial neural networks using gradient descent. Given a neural network and an error function, the method computes the gradient of the error function with respect to the network's weights.
Each neuron in a layer has its own set of weights, so although every neuron in a layer sees the same inputs, their outputs all differ. When a neural network is used to approximate a function, the data is forwarded through the network layer by layer until it reaches the final layer. The error at the output is then propagated backward: using the chain rule, the gradient of the error is computed for each layer's weights in turn, from the output layer back to the input layer, and the weights are updated in the direction that reduces the error.
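The forward pass, backward pass, and weight update described above can be sketched in a few lines of NumPy. This is an illustrative example I've added (it is not part of the original answer): a two-layer sigmoid network trained with plain gradient descent on a tiny XOR-style dataset; the architecture, learning rate, and data are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dataset (assumed for illustration): the XOR pattern.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Each neuron in a layer has its own weights: one column of W per neuron.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))

lr = 1.0
losses = []
for step in range(2000):
    # Forward pass: data flows layer by layer to the final layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: the chain rule gives the gradient of the error
    # with respect to each layer's weights, output layer first.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0, keepdims=True)

    # Gradient descent update: step against the gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])  # the error shrinks as training proceeds
```

Running this shows the mean squared error falling from its initial value toward zero, which is exactly the behavior backpropagation plus gradient descent is designed to produce.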



