
Summary

In this chapter, we covered quite a lot of ground, didn't we?

In short, we learned a lot about different supervised learning algorithms, how to apply them to real datasets, and how to implement everything in OpenCV. We introduced classification algorithms such as k-NN and logistic regression and discussed how they can be used to predict labels belonging to two or more discrete categories. We also introduced several variants of linear regression (such as Lasso regression and ridge regression) and discussed how they can be used to predict continuous variables. Last but not least, we got acquainted with the Iris and Boston datasets, two classics in the history of machine learning.
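As a quick reminder of what the classification workflow looked like, here is a minimal sketch of training a k-NN classifier with OpenCV's ml module. The toy random data and the choice of k=3 are illustrative assumptions, not taken from the chapter's Iris example:

```python
import numpy as np
import cv2

# Toy two-class dataset: 20 random 2D points with binary labels.
train_data = np.random.rand(20, 2).astype(np.float32)
labels = np.random.randint(0, 2, (20, 1)).astype(np.float32)

# Create and train OpenCV's k-NN classifier.
knn = cv2.ml.KNearest_create()
knn.train(train_data, cv2.ml.ROW_SAMPLE, labels)

# Predict the label of a new sample from its 3 nearest neighbors.
sample = np.array([[0.5, 0.5]], dtype=np.float32)
ret, results, neighbors, dist = knn.findNearest(sample, k=3)
print(results)  # predicted label
```

The same train/predict pattern carried over to the regression models in this chapter, only with continuous target values instead of discrete labels.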

In the following chapters, we will go into much greater depth on these topics and see some more interesting examples of where these concepts can be useful.

But first, we need to talk about another essential topic in machine learning: feature engineering. Often, data does not come in nicely formatted datasets, and it is our responsibility to represent the data in a meaningful way. Therefore, the next chapter will talk all about representing features and engineering data.
