
Gradient descent 

The gradient descent algorithm is also popular for estimating parameters for linear regression. Gradient descent minimizes a function: we start with a set of initial values for the parameters and iteratively update them to reduce the error in the function. The gradient is the vector of partial derivatives of the error function with respect to the parameters; it points in the direction of steepest increase. The idea is to descend the gradient, taking steps in the opposite direction toward the lowest point on the error surface. Different variants of the algorithm include batch gradient descent, which looks at all observed examples in each iteration, and stochastic gradient descent, which updates the parameters using only one observation at a time. Because each step uses the exact gradient over the full dataset, batch gradient descent takes more accurate steps than stochastic gradient descent, but each step is much more expensive, making it less suitable for large datasets.
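The contrast between the two variants can be sketched as follows. This is a minimal illustration, not a library implementation: it fits a simple linear model y = w·x + b by minimizing mean squared error, with hypothetical learning rates and epoch counts chosen for this synthetic data.

```python
import numpy as np

def batch_gradient_descent(x, y, lr=0.1, epochs=1000):
    """Batch variant: every update uses the gradient over ALL observations."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        error = w * x + b - y
        # Gradients of mean squared error, averaged over the full batch.
        grad_w = (2.0 / n) * np.dot(error, x)
        grad_b = (2.0 / n) * error.sum()
        # Step against the gradient, toward the minimum.
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def stochastic_gradient_descent(x, y, lr=0.05, epochs=200, seed=0):
    """Stochastic variant: each update uses a single observation."""
    rng = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(x)):
            error = w * x[i] + b - y[i]
            w -= lr * 2.0 * error * x[i]
            b -= lr * 2.0 * error
    return w, b

# Synthetic data generated from y = 3x + 2; both variants should
# recover parameters close to (3, 2).
x = np.linspace(0, 1, 50)
y = 3 * x + 2
w_batch, b_batch = batch_gradient_descent(x, y)
w_sgd, b_sgd = stochastic_gradient_descent(x, y)
print(w_batch, b_batch)
print(w_sgd, b_sgd)
```

Note that the batch version performs one precise update per pass over the data, while the stochastic version performs one noisy update per observation, which is why it scales better when the dataset is large.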

There is a vast amount of research being done on regression algorithms, as they are very well suited for predicting continuous variables. We encourage you to learn more about linear regression libraries and to try the different variants they provide, evaluating their efficiency and effectiveness on test datasets.
