1.

Is it necessary to set the initial weights to zero in the perceptron convergence theorem?
(a) yes
(b) no

This question was asked in an exam. My doubt stems from the Pattern Classification topic in the Feedforward Neural Networks division of Neural Networks.

Answer:

The right choice is (b) no.

Easy explanation: The initial setting of the weights does not affect the perceptron convergence theorem. For linearly separable data, the theorem guarantees convergence in a finite number of updates from any initial weight vector; the starting weights influence only how many updates are needed, not whether convergence occurs. Zero initialization is merely a common convention for simplicity.
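For intuition, here is a minimal NumPy sketch (the toy dataset, random seed, and learning rate are illustrative assumptions, not part of the question): a perceptron started from random nonzero weights still converges on linearly separable data.

```python
# Perceptron learning rule on a linearly separable toy dataset,
# starting from arbitrary (nonzero) random weights.
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data: label +1 if x0 + x1 > 1, else -1.
X = rng.uniform(-1, 2, size=(50, 2))
y = np.where(X.sum(axis=1) > 1, 1, -1)

# Augment inputs with a constant 1 so the bias is just another weight.
Xa = np.hstack([X, np.ones((len(X), 1))])

w = rng.normal(size=3)   # arbitrary nonzero initial weights (not zero)
lr = 1.0                 # learning rate; rescales w but not convergence

for epoch in range(100):
    errors = 0
    for xi, yi in zip(Xa, y):
        if yi * (w @ xi) <= 0:    # sample is misclassified
            w += lr * yi * xi     # perceptron update rule
            errors += 1
    if errors == 0:               # a full pass with no mistakes: converged
        print(f"Converged after {epoch + 1} epochs, w = {w}")
        break
```

Rerunning with a different seed (and hence different initial weights) changes only the epoch count printed, consistent with the answer above.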


