
The gradient descent algorithm

The gradient descent algorithm is an optimization algorithm that finds the minimum of a function using first-order derivatives, that is, we differentiate the function with respect to its parameters to first order only. Here, the objective of the gradient descent algorithm is to minimize the cost function J with respect to the model parameters, θ0 and θ1.

This approach repeats the following update steps for numerous iterations to minimize J(θ0, θ1):

θ0 := θ0 − α ∂J/∂θ0
θ1 := θ1 − α ∂J/∂θ1

The α used in the preceding equations refers to the learning rate. The learning rate is the speed at which the learning agent adapts to new knowledge. Thus, α, that is, the learning rate, is a hyperparameter that needs to be assigned as a scalar value or as a function of time. In this way, in every iteration, the values of θ0 and θ1 are updated as shown in the preceding formulas until the value of the cost function reaches an acceptable minimum.
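The iterative update rule above can be sketched in code. The following is a minimal illustration, not the book's implementation: it fits a two-parameter linear model y ≈ θ0 + θ1·x by gradient descent on the mean squared error cost, with illustrative data and learning rate chosen for this sketch.

```python
def gradient_descent(xs, ys, alpha=0.05, iterations=500):
    """Minimize the MSE cost J(theta0, theta1) by first-order updates."""
    theta0, theta1 = 0.0, 0.0
    n = len(xs)
    for _ in range(iterations):
        # First-order partial derivatives of J with respect to each parameter
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        grad0 = sum(errors) / n
        grad1 = sum(e * x for e, x in zip(errors, xs)) / n
        # Move opposite to the gradient, scaled by the learning rate alpha
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Illustrative data generated by y = 1 + 2x, so the descent should
# drive theta0 toward 1 and theta1 toward 2
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
theta0, theta1 = gradient_descent(xs, ys)
print(theta0, theta1)
```

After 500 iterations the parameters settle close to the values that generated the data, which is exactly the "update until the cost is acceptably small" loop described above.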

The name gradient descent means moving down the slope. The slope of the curve is given by the derivative of the cost function with respect to the parameters. The gradient, that is, the slope, points in the direction of increasing cost if it's positive, and decreasing cost if it's negative. Thus, we multiply the slope by a negative sign, since we have to move opposite to the direction of increasing slope and toward the direction of decreasing slope.

Using an optimum learning rate, α, the descent is controlled and we don't overshoot the local minimum. If the learning rate, α, is very small, then convergence will take a large number of iterations, while if it's very high, then it might overshoot and miss the minimum, and can even diverge.
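The effect of the learning rate can be demonstrated on the simplest possible cost, f(w) = w², whose gradient is 2w and whose minimum is at w = 0. This is an illustrative sketch with assumed values of α, not a prescription:

```python
def descend(alpha, steps=20, w=10.0):
    """Run gradient descent on f(w) = w**2 with a fixed learning rate."""
    for _ in range(steps):
        w -= alpha * 2 * w   # w moves opposite to the gradient 2w
    return w

small = descend(alpha=0.01)   # tiny steps: still far from 0 after 20 steps
good  = descend(alpha=0.3)    # well-chosen rate: essentially at the minimum
large = descend(alpha=1.1)    # each step overshoots 0: |w| grows, i.e. diverges
print(small, good, large)
```

With α = 0.01 the parameter barely moves toward the minimum, with α = 0.3 it converges quickly, and with α = 1.1 each update overshoots zero by more than the previous step, so the iterates grow without bound.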
