
⟩ Do gradient descent methods always converge to the same point?

No, they do not. On a non-convex loss surface, gradient descent can settle in a local minimum (or other local optimum) rather than the global optimum. Which point it reaches depends on the shape of the loss surface, the data, and the starting conditions, such as the initialization and the learning rate.
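A minimal sketch of this effect: the toy loss f(x) = (x² − 1)² + 0.3x below (a hypothetical function chosen for illustration) has two local minima, and the same gradient descent loop lands in a different one depending only on the starting point.

```python
def grad(x):
    # Derivative of the toy loss f(x) = (x**2 - 1)**2 + 0.3*x,
    # which has two local minima (roughly near x = 1 and x = -1).
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x0, lr=0.01, steps=2000):
    # Plain gradient descent: repeatedly step against the gradient.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Same algorithm and hyperparameters, different initializations:
left = gradient_descent(-1.5)   # converges to the minimum on the negative side
right = gradient_descent(1.5)   # converges to the minimum on the positive side
print(left, right)              # two different stationary points
```

Both runs stop at points where the gradient is essentially zero, yet the points differ; only one of them is the global minimum. This is why initialization (and, in practice, restarts or stochastic noise) matters on non-convex problems.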
