1.

What do you mean by hyperparameters in the context of deep learning?

Answer»

Hyperparameters are variables that determine the network topology (for example, the number of hidden units) and how the network is trained (for example, the learning rate). They are set before training the model, that is, before optimizing the weights and biases.
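A minimal sketch of this distinction, using a toy one-weight model fit by gradient descent (the hyperparameter names and values here are illustrative assumptions, not from any particular library):

```python
import numpy as np

# Hyperparameters: fixed *before* training begins.
hyperparams = {"learning_rate": 0.05, "n_epochs": 100}

# Toy data generated from y = 2x, so the optimal weight is 2.0.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

# Parameter: the weight w is what training optimizes.
w = 0.0
for _ in range(hyperparams["n_epochs"]):
    grad = np.mean(2 * (w * x - y) * x)       # gradient of the MSE loss w.r.t. w
    w -= hyperparams["learning_rate"] * grad  # gradient-descent update

print(round(w, 3))  # → 2.0
```

The values in `hyperparams` are never touched by the training loop itself; choosing good values for them is the (separate) problem of hyperparameter tuning.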

The following are some examples of hyperparameters:

  • Number of hidden layers: combined with regularisation techniques, adding more hidden units per layer can boost accuracy, while reducing the number of units may lead to underfitting.
  • Learning rate: the learning rate controls how much the network's parameters are updated at each step. A low learning rate slows down the learning process but eventually converges; a higher learning rate speeds up learning but may fail to converge. A learning rate that decays over the course of training is usually desirable.
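The learning-rate trade-off above can be demonstrated on a simple quadratic loss. The sketch below (the schedule values are illustrative assumptions) runs gradient descent on f(w) = (w − 2)² with a low constant rate, a too-high constant rate, and a decaying rate:

```python
def train(lr_schedule, n_epochs=50):
    """Gradient descent on f(w) = (w - 2)^2 under a given learning-rate schedule."""
    w = 0.0
    for t in range(n_epochs):
        grad = 2 * (w - 2.0)      # f'(w)
        w -= lr_schedule(t) * grad
    return w

# Low constant rate: makes steady but slow progress toward the minimum at w = 2.
slow = train(lambda t: 0.01)

# Too-high constant rate: each step overshoots by a growing amount, so it diverges.
diverged = train(lambda t: 1.1)

# Decaying rate: starts with large steps, then shrinks them each epoch.
decayed = train(lambda t: 0.2 / (1 + 0.1 * t))

print(abs(slow - 2.0))      # still far from the minimum after 50 epochs
print(abs(diverged - 2.0))  # huge: the iterates blew up
print(abs(decayed - 2.0))   # very close to the minimum
```

Running this shows the low rate converging slowly, the high rate diverging, and the decaying schedule ending up closest to the optimum, which is why decaying schedules are a common default.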

