Gradient Descent Method

Gradient descent is a first-order optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient (or of the approximate gradient) of the function at the current point. If instead one takes steps proportional to the positive of the gradient, one approaches a local maximum of that function; the procedure is then known as gradient ascent. Gradient descent is also known as steepest descent, or the method of steepest descent. When known as the latter, gradient descent should not be confused with the method of steepest descent for approximating integrals.
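Concretely, each step updates the current point x_k by x_{k+1} = x_k - γ ∇f(x_k), where γ > 0 is the step size. The Python sketch below illustrates this iteration on a simple quadratic; the names and parameters used here (gradient_descent, learning_rate, tol) are illustrative choices, not part of the original entry.

import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, tol=1e-6, max_iter=1000):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = learning_rate * grad(x)
        x = x - step                      # move opposite to the gradient
        if np.linalg.norm(step) < tol:    # stop once updates become negligible
            break
    return x

# Example: minimize f(x, y) = (x - 3)^2 + (y + 1)^2,
# whose gradient is (2(x - 3), 2(y + 1)).
minimum = gradient_descent(
    lambda p: np.array([2 * (p[0] - 3), 2 * (p[1] + 1)]),
    x0=[0.0, 0.0],
)
print(minimum)  # approaches (3, -1)

With a fixed step size the iterates converge to the local minimum (3, -1) for this convex example; in general the step size must be chosen small enough for the updates to decrease the function value.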
Related concepts
Local Minima
Fuzzy Rules