- Mastering Machine Learning on AWS
- Dr. Saket S.R. Mengle, Maximo Gurmendez
Gradient descent
The gradient descent algorithm is another popular method for estimating the parameters of a linear regression model. Gradient descent minimizes a function iteratively: we start with a set of initial values for the parameters and repeatedly adjust them in the direction that reduces the error. The gradient is the vector of partial derivatives of the error function with respect to the parameters, and it points in the direction of steepest increase; the idea is to descend against the gradient toward the lowest point on the error surface. Different variants of gradient descent include batch gradient descent, which uses every observed example in each iteration, and stochastic gradient descent, which updates the parameters using only one observation at a time. For this reason, batch gradient descent produces more stable and accurate updates than stochastic gradient descent, but it is much slower and hence less suitable for large datasets.
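To make the difference between the two variants concrete, here is a minimal sketch of both batch and stochastic gradient descent for a simple linear regression of the form y ≈ w·x + b. The synthetic data, learning rates, and iteration counts are illustrative assumptions, not values from the book:

```python
import numpy as np

# Illustrative synthetic data: true slope 3, true intercept 2, plus Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 5, size=200)
y = 3.0 * x + 2.0 + rng.normal(0, 1, size=200)

def batch_gradient_descent(x, y, lr=0.01, epochs=2000):
    """Each update uses the gradient of the mean squared error over ALL observations."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        error = (w * x + b) - y              # residuals over the whole dataset
        grad_w = (2.0 / n) * np.dot(error, x)
        grad_b = (2.0 / n) * error.sum()
        w -= lr * grad_w                     # step against the gradient
        b -= lr * grad_b
    return w, b

def stochastic_gradient_descent(x, y, lr=0.01, epochs=20):
    """Each update uses the gradient from a SINGLE observation, visited in random order."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        for i in rng.permutation(n):
            error = (w * x[i] + b) - y[i]
            w -= lr * 2.0 * error * x[i]
            b -= lr * 2.0 * error
    return w, b

# Both variants should roughly recover the true parameters (3, 2);
# the stochastic estimates are noisier but need far fewer passes over the data.
print(batch_gradient_descent(x, y))
print(stochastic_gradient_descent(x, y))
```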
There is a vast amount of research on regression algorithms because they are well suited to predicting continuous variables. We encourage you to explore linear regression libraries and try the different variants they provide, evaluating their efficiency and effectiveness on held-out test datasets, as sketched below.
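As one hypothetical example, assuming scikit-learn is available, the snippet below compares a few linear regression variants (ordinary least squares, L2-regularized ridge, and stochastic gradient descent) on the same train/test split; the synthetic dataset stands in for whatever data you are working with:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, SGDRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic regression data; a held-out test split measures effectiveness.
X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "ordinary least squares": LinearRegression(),
    "ridge (L2-regularized)": Ridge(alpha=1.0),
    "stochastic gradient descent": make_pipeline(StandardScaler(),
                                                 SGDRegressor(max_iter=1000)),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: test MSE = {mse:.2f}")
```

Comparing test-set error across variants in this way is a simple, reproducible way to judge which flavor of linear regression fits your data and scale requirements.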