
Can we prevent divergence by normalizing the weights at every stage?
(a) yes
(b) no

This question came up in a unit test. It is from the Feedback Layer topic in the Competitive Learning Neural Networks section of Neural Networks.

Answer: The correct choice is (a) yes.

Explanation: Normalizing the weight vector at every stage keeps ||w|| = 1, so the weights stay on the unit sphere and cannot grow without bound as updates accumulate, which prevents divergence.
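A minimal sketch of this idea, assuming a simple winner-take-all competitive learning rule with an illustrative learning rate and layer size (these specifics are not from the question itself):

```python
import numpy as np

# Sketch: winner-take-all competitive learning with weight
# normalization after every update, so each ||w|| stays at 1.
rng = np.random.default_rng(0)
n_inputs, n_units = 4, 3
W = rng.normal(size=(n_units, n_inputs))
W /= np.linalg.norm(W, axis=1, keepdims=True)  # start with ||w|| = 1

def competitive_step(W, x, lr=0.1):
    # Winner = unit whose weight vector is most aligned with the input.
    winner = np.argmax(W @ x)
    # Move the winner's weights toward the input.
    W[winner] += lr * (x - W[winner])
    # Renormalize: without this step the weights could grow without
    # bound over many updates, i.e. the process could diverge.
    W[winner] /= np.linalg.norm(W[winner])
    return W

for _ in range(100):
    x = rng.normal(size=n_inputs)
    x /= np.linalg.norm(x)
    W = competitive_step(W, x)

print(np.linalg.norm(W, axis=1))  # all ~1.0: the weights remain bounded
```

The renormalization line is the key step: it is what keeps every weight vector on the unit sphere regardless of how many updates are applied.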

