
⟩ Do gradient descent methods always converge to the same point?

No, they do not. On a non-convex loss surface, gradient descent can settle in a local minimum (or stall near a saddle point) instead of reaching the global optimum. Which point it converges to depends on the shape of the loss surface, the starting parameters, and the learning rate; only for convex problems is convergence to the global minimum guaranteed.
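A minimal sketch of this behavior, assuming a toy one-dimensional function chosen for illustration (it is not from the original answer): f(x) = x⁴ − 3x² + x has two local minima, and gradient descent started on different sides of the hump ends up in different ones.

```python
def grad(x):
    # Derivative of f(x) = x**4 - 3*x**2 + x
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=1000):
    # Plain gradient descent: repeatedly step against the gradient.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Two different starting points converge to two different minima:
left = descend(-2.0)   # ends near the minimum around x ≈ -1.30
right = descend(2.0)   # ends near the minimum around x ≈ 1.13
print(left, right)
```

Both runs use the same algorithm and learning rate; only the initialization differs, which is exactly why the final point is governed by the data and starting conditions.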

