
In Hebbian learning, initial weights are set?
(a) random
(b) near to zero
(c) near to target value
(d) near to target value

This question was posed to me at a job interview. The query is from the Learning topic in the chapter Basics of Artificial Neural Networks of Neural Networks.

Answer:

The correct option is (b) near to zero.

To explain, I would say: the Hebb rule leads to a sum of correlations between input and output; in order for the learned weights to reflect those correlations, the starting weight values must be small (near zero).
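As a small illustrative sketch (not part of the original answer), the snippet below assumes the plain Hebbian update dw = eta * x * y with made-up inputs and targets. It shows why weights are started near zero: the final weights are then just the accumulated input-output correlations, with no bias from an arbitrary starting point.

```python
import numpy as np

# Minimal sketch of plain Hebbian learning (assumed update: dw = eta * x * y).
# The data, learning rate, and variable names here are illustrative only.
rng = np.random.default_rng(0)

eta = 0.1                                               # learning rate
x_patterns = rng.choice([-1.0, 1.0], size=(100, 4))     # bipolar input patterns
targets = x_patterns @ np.array([1.0, -1.0, 0.5, 0.0])  # outputs correlated with inputs

w = np.zeros(4)            # initial weights near zero (here exactly zero)

for x, y in zip(x_patterns, targets):
    w += eta * x * y       # Hebb rule: strengthen weights by the input-output product

# Because w started near zero, the learned weights equal the summed correlation
# eta * sum_n x_n * y_n, which is exactly what the Hebb rule is meant to capture.
print(w)
print(eta * x_patterns.T @ targets)   # identical to the learned weights
```

Starting from large or arbitrary weights would leave that initial offset mixed into the result, which is why small initial values are preferred.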


