- Mastering Machine Learning on AWS
- Dr. Saket S.R. Mengle, Maximo Gurmendez
Gradient descent
The gradient descent algorithm is also a popular way to estimate the parameters of a linear regression model. Gradient descent is a general technique for minimizing a function: we start with a set of initial values for the parameters and iteratively move them toward values that reduce the error. At each step, we compute the gradient, the vector of partial derivatives of the error function with respect to the parameters, and take a step in the opposite direction. The idea is to descend along the gradient toward the lowest point of the error surface. Different variants of gradient descent include batch gradient descent, which uses all observed examples in each iteration, and stochastic gradient descent, which iterates over one observation at a time. Because it uses all the data for every update, batch gradient descent computes more accurate gradients than stochastic gradient descent, but it is much slower and therefore less suitable for large datasets.
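The following is a minimal sketch, not code from the book or a library, contrasting the two variants on simple linear regression with synthetic data; the learning rates, epoch counts, and data-generating parameters are illustrative assumptions.

```python
import numpy as np

# Synthetic data for y = w*x + b with true slope 3 and intercept 5 (assumed for illustration).
rng = np.random.default_rng(42)
X = rng.uniform(0, 2, size=100)
y = 3.0 * X + 5.0 + rng.normal(0, 0.5, size=100)

def batch_gradient_descent(X, y, lr=0.1, epochs=500):
    """Batch GD: each update uses the gradient of the mean squared error over ALL observations."""
    w, b = 0.0, 0.0
    n = len(X)
    for _ in range(epochs):
        error = (w * X + b) - y
        grad_w = (2.0 / n) * np.dot(error, X)   # dMSE/dw
        grad_b = (2.0 / n) * error.sum()        # dMSE/db
        w -= lr * grad_w                        # step against the gradient
        b -= lr * grad_b
    return w, b

def stochastic_gradient_descent(X, y, lr=0.05, epochs=100):
    """Stochastic GD: each update uses a single, randomly chosen observation."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            error = (w * X[i] + b) - y[i]
            w -= lr * 2.0 * error * X[i]
            b -= lr * 2.0 * error
    return w, b

print("batch GD:     ", batch_gradient_descent(X, y))
print("stochastic GD:", stochastic_gradient_descent(X, y))
```

Both functions should recover parameters close to the true slope and intercept; the stochastic version trades some accuracy per update for cheaper iterations, which is why it scales better to large datasets.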
There is a vast amount of research on regression algorithms, as they are very well suited for predicting continuous variables. We encourage you to learn more about linear regression libraries and to try the different variants they provide, evaluating their efficiency and effectiveness on test datasets.