
Gradient descent 

The gradient descent algorithm is another popular method for estimating the parameters of a linear regression model. Gradient descent minimizes a function: starting from an initial set of parameter values, it iteratively moves the parameters in the direction that reduces the error. The vector of partial derivatives of the error function with respect to the parameters is called the gradient; since the gradient points in the direction of steepest increase, each step moves the parameters in the opposite direction, descending toward the lowest point on the error surface. Variants include batch gradient descent, which computes the gradient over all observed examples in each iteration, and stochastic gradient descent, which updates the parameters using only one observation at a time. Batch gradient descent therefore takes more accurate steps than stochastic gradient descent, but each iteration is much slower, making it less suitable for large datasets.
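The batch and stochastic variants described above can be sketched for simple linear regression (fitting y ≈ w·x + b by minimizing mean squared error). This is a minimal illustration, not a production implementation; the function names, learning rates, and epoch counts are chosen for the example:

```python
import random

def batch_gradient_descent(x, y, lr=0.02, epochs=5000):
    """Fit y ~ w*x + b by descending the gradient of mean squared error."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        # Batch variant: the gradient uses every observation per iteration.
        grad_w = sum(2 * (w * xi + b - yi) * xi for xi, yi in zip(x, y)) / n
        grad_b = sum(2 * (w * xi + b - yi) for xi, yi in zip(x, y)) / n
        # Step opposite to the gradient (direction of steepest descent).
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def stochastic_gradient_descent(x, y, lr=0.01, epochs=100, seed=0):
    """Same objective, but parameters are updated after each single observation."""
    w, b = 0.0, 0.0
    rng = random.Random(seed)
    idx = list(range(len(x)))
    for _ in range(epochs):
        rng.shuffle(idx)  # visit observations in random order each epoch
        for i in idx:
            err = w * x[i] + b - y[i]
            w -= lr * 2 * err * x[i]
            b -= lr * 2 * err
    return w, b

# Toy data generated from y = 3x + 1 with no noise, so both variants
# should recover parameters close to w = 3, b = 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * xi + 1 for xi in xs]
w, b = batch_gradient_descent(xs, ys)
```

Note that the batch version performs one parameter update per pass over the data, while the stochastic version performs one update per observation; on large datasets the stochastic updates are far cheaper per step, which is why it scales better.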

A vast amount of research continues on regression algorithms because they are well suited to predicting continuous variables. We encourage you to explore linear regression libraries and try the different variants they provide, evaluating their efficiency and effectiveness on your test datasets.
