
Summary

In this chapter, we covered quite a lot of ground, didn't we?

In short, we learned a lot about different supervised learning algorithms, how to apply them to real datasets, and how to implement everything in OpenCV. We introduced classification algorithms such as k-NN and logistic regression and discussed how they could be used to predict labels as two or more discrete categories. We introduced various variants of linear regression (such as Lasso regression and ridge regression) and discussed how they could be used to predict continuous variables. Last but not least, we got acquainted with the Iris and Boston datasets, two classics in the history of machine learning.
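As a quick recap of the OpenCV workflow we used throughout the chapter, the following is a minimal sketch (with made-up synthetic data, not the Iris or Boston datasets) of training and querying a k-NN classifier via OpenCV's `ml` module:

```python
import numpy as np
import cv2

# Tiny synthetic two-class dataset: 10 samples around the origin (class 0)
# and 10 samples shifted by +3 in both features (class 1).
X = np.vstack([np.random.randn(10, 2).astype(np.float32),
               np.random.randn(10, 2).astype(np.float32) + 3])
y = np.array([0] * 10 + [1] * 10, dtype=np.int32)

# Create and train the k-NN classifier; ROW_SAMPLE means one sample per row
knn = cv2.ml.KNearest_create()
knn.train(X, cv2.ml.ROW_SAMPLE, y)

# Predict the label of a new point from its k=3 nearest neighbors
_, results, neighbors, dist = knn.findNearest(
    np.array([[2.5, 2.5]], dtype=np.float32), k=3)
print(results)  # predicted class label for the query point
```

The other classifiers covered in the chapter (such as logistic regression) follow the same create-train-predict pattern in OpenCV's `ml` module.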

In the following chapters, we will go into much greater depth on these topics and see some more interesting examples of where these concepts can be useful.

But first, we need to talk about another essential topic in machine learning: feature engineering. Often, data does not come in nicely formatted datasets, and it is our responsibility to represent the data in a meaningful way. Therefore, the next chapter will talk all about representing features and engineering data.
