SGD (Stochastic Gradient Descent)

A common optimiser that updates model parameters by stepping opposite the gradient of the loss, computed on small, randomly selected subsets (mini-batches) of the training data rather than the full dataset. This makes each update cheap and typically speeds up convergence.
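A minimal sketch of the mini-batch SGD update loop, fitting a one-parameter linear model with NumPy. The learning rate, batch size, and synthetic data here are illustrative assumptions, not part of any particular library's API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + small noise, so the true weight is ~3.0
X = rng.normal(size=(200,))
y = 3.0 * X + rng.normal(scale=0.1, size=200)

w = 0.0           # the single model parameter
lr = 0.1          # learning rate (assumed value)
batch_size = 16   # mini-batch size (assumed value)

for epoch in range(50):
    idx = rng.permutation(len(X))          # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch], y[batch]
        # Gradient of mean squared error w.r.t. w on this mini-batch only
        grad = 2.0 * np.mean((w * xb - yb) * xb)
        w -= lr * grad                     # the SGD update step

print(w)
```

Each update looks at only `batch_size` examples, so the gradient is a noisy estimate of the full-data gradient; averaged over many steps, `w` still converges close to the true value of 3.0.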
