
The gradient descent algorithm

The gradient descent algorithm is an optimization algorithm that finds the minimum of a function using first-order derivatives; that is, we differentiate the function with respect to its parameters to first order only. Here, the objective of the gradient descent algorithm is to minimize the cost function J(θ₀, θ₁) with respect to the parameters θ₀ and θ₁.

This approach repeats the following update steps for numerous iterations to minimize J(θ₀, θ₁):

θ₀ := θ₀ − α ∂J(θ₀, θ₁)/∂θ₀
θ₁ := θ₁ − α ∂J(θ₀, θ₁)/∂θ₁

The α used in the above equations refers to the learning rate, that is, the speed at which the learning agent adapts to new knowledge. Thus, α, the learning rate, is a hyperparameter that needs to be assigned as a scalar value or as a function of time. In this way, in every iteration, the values of θ₀ and θ₁ are updated as mentioned in the preceding formula until the value of the cost function reaches an acceptable minimum.
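The iterative update steps above can be sketched in plain Python. This is a minimal illustration, assuming a linear model h(x) = θ₀ + θ₁x and a mean squared error cost; the sample data and the value of α below are assumptions for demonstration, not taken from the text:

```python
# Minimal gradient descent sketch for a linear model h(x) = theta0 + theta1 * x,
# minimizing the mean squared error cost J(theta0, theta1).
# The data and learning rate alpha are illustrative assumptions.

def gradient_descent(xs, ys, alpha=0.1, iterations=1000):
    theta0, theta1 = 0.0, 0.0
    n = len(xs)
    for _ in range(iterations):
        # Prediction errors h(x) - y for the current parameters
        errors = [(theta0 + theta1 * x) - y for x, y in zip(xs, ys)]
        # First-order partial derivatives of J with respect to theta0 and theta1
        grad0 = sum(errors) / n
        grad1 = sum(e * x for e, x in zip(errors, xs)) / n
        # Simultaneous update: step against the gradient, scaled by alpha
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Fit y = 2x + 1 from noiseless samples; the parameters converge
# toward theta0 = 1 and theta1 = 2
t0, t1 = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
```

Note that both parameters are updated simultaneously from the same gradient evaluation, matching the update formulas above.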

The gradient descent algorithm means moving down the slope of the curve traced by the cost function with respect to the parameters. The gradient, that is, the slope, points in the direction of increasing cost when positive, and decreasing cost when negative. Thus, we multiply our slope by a negative sign, since we have to move opposite to the direction of increasing cost and toward the direction of decreasing cost.

Using the optimum learning rate, α, the descent is controlled and we don't overshoot the local minimum. If the learning rate, α, is very small, then convergence will take a large number of iterations, while if it's very high, then the updates might overshoot, miss the minimum, and diverge.
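The effect of the learning rate can be demonstrated on the simplest possible case. This sketch assumes the one-dimensional cost J(θ) = θ², whose gradient is 2θ and whose minimum is at θ = 0; the specific α values are illustrative assumptions:

```python
# Illustrative sketch (assumed example): the effect of the learning rate alpha
# on gradient descent for the cost J(theta) = theta**2, minimized at theta = 0.

def descend(alpha, steps=20, theta=1.0):
    for _ in range(steps):
        theta -= alpha * 2 * theta   # theta := theta - alpha * dJ/dtheta
    return theta

small = descend(alpha=0.01)   # tiny steps: still far from 0 after 20 steps
good = descend(alpha=0.3)     # well-chosen rate: converges quickly toward 0
large = descend(alpha=1.1)    # too large: overshoots 0 and |theta| grows (diverges)
```

Each update multiplies θ by (1 − 2α), so any α above 1.0 flips the sign and increases the magnitude on every step, which is exactly the divergence described above.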
