1. What Are The Benefits Of Mini-batch Gradient Descent?

Answer
- Computationally more efficient than stochastic gradient descent.
- Improves generalization by tending to find flat minima.
- Improves convergence: by using mini-batches we approximate the gradient of the entire training set, which may help avoid local minima. A minimal sketch follows below.
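To make the idea concrete, here is a minimal sketch of mini-batch gradient descent on a synthetic least-squares problem. All names here (X, y, batch_size, lr, epochs) are illustrative assumptions, not part of the original answer; the key point is that each update uses the gradient computed on a small shuffled batch, an unbiased estimate of the full-batch gradient.

```python
import numpy as np

# Synthetic regression data (illustrative assumption, not from the answer).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=1000)  # noisy linear targets

w = np.zeros(5)
lr, batch_size, epochs = 0.1, 32, 20  # hypothetical hyperparameters

for epoch in range(epochs):
    perm = rng.permutation(len(X))            # reshuffle once per epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Gradient of mean squared error on the mini-batch only:
        # cheaper than the full batch, less noisy than a single sample.
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad

print("recovered weights close to truth:", np.allclose(w, true_w, atol=0.05))
```

With a batch size of 32, each step touches only 32 of the 1000 examples, which is the computational advantage over full-batch gradient descent, while the gradient noise is far lower than in pure stochastic (batch size 1) updates.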
Related Interview Questions
What Is Weight Initialization In Neural Networks?
What Is An Auto-encoder?
Is It Ok To Connect From A Layer 4 Output Back To A Layer 2 Input?
What Is A Boltzmann Machine?
What Is A Dropout?
What Is An Autoencoder?
What Is A Model Capacity?
What Are Hyperparameters, Provide Some Examples?
What Is The Role Of The Activation Function?
Why Is Zero Initialization Not A Recommended Weight Initialization Technique?