1.

Explain the gradient descent algorithm and outline the steps involved in using it to train a network.

Answer»

Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient (or approximate gradient) of the function at the current point. If, instead, one takes steps proportional to the positive of the gradient, one approaches a local maximum of that function; the procedure is then known as gradient ascent.
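The update rule described above can be sketched in a few lines. This is a minimal, illustrative example (the function, learning rate, and iteration count are assumptions, not part of the original answer): it minimizes f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3), by repeatedly stepping against the gradient.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function given its gradient by iterative descent."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # step proportional to the negative gradient
    return x

# Gradient of f(x) = (x - 3)^2 is 2 * (x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # converges toward 3.0
```

Flipping the sign of the update (adding `lr * grad(x)` instead of subtracting) would give gradient ascent toward a local maximum.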

The steps to use the gradient descent algorithm are as follows:

  1. Initialize random weights and biases.
  2. Pass an input through the network and get values from the output layer.
  3. Calculate the error between the actual value and the predicted value.
  4. Go to each neuron that contributes to the error and change its respective values to reduce the error.
  5. Reiterate until we find the best weights for the network.

[Illustrative diagram of gradient descent on a series of level sets omitted.]