1. Weight Initialization In Neural Networks?
|
Answer» Weight initialization is a very important step. Bad weight initialization can prevent a network from learning, while good initialization leads to quicker convergence and lower overall error. Biases can generally be initialized to zero. The general rule for the weights is to set them to small random values close to zero, but not so small that the signal vanishes; random (rather than identical) values are needed to break symmetry between units.
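The rule above can be sketched in code. This is a minimal illustration, assuming a NumPy-based setup and He initialization (scaling by `sqrt(2 / fan_in)`, commonly used with ReLU layers) as one concrete scheme for "small random weights close to zero"; the function name `init_layer` is hypothetical.

```python
import numpy as np

def init_layer(fan_in, fan_out, rng=None):
    """Initialize one dense layer: small random weights, zero biases.

    Uses He initialization (std = sqrt(2 / fan_in)), a common choice
    for ReLU networks; other schemes (e.g. Xavier) differ only in scale.
    """
    rng = rng or np.random.default_rng(0)
    # Random values break symmetry; the small std keeps them near zero
    # without being so tiny that activations and gradients vanish.
    weights = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))
    biases = np.zeros(fan_out)  # biases can safely start at zero
    return weights, biases

w, b = init_layer(784, 256)
print(w.shape, round(float(w.std()), 3))  # std is close to sqrt(2/784)
```

Initializing all weights to the same value (e.g. zero) would make every unit in a layer compute identical gradients, so they would never differentiate; that is why the weights are random while the biases need not be.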
|