
Summary 

In this chapter, we discussed many data manipulation techniques that we will return to throughout the book. It is better to invest time in them now rather than later, as doing so will make our modeling of deep learning architectures easier.

After reading this chapter, you are now able to manipulate and produce binary data for classification or for feature representation. You also know how to deal with categorical data and labels and prepare them for classification or regression. When you have real-valued data, you now know how to identify its statistical properties and how to normalize it. If you ever face data with non-normal or non-uniform distributions, you now know how to fix that. And if you ever encounter the problem of not having enough data, you have learned a few data augmentation techniques. Toward the end of this chapter, you learned some of the most popular dimensionality reduction techniques. You will learn more of these along the road, for example, when we talk about autoencoders, which can be used for dimensionality reduction as well. But sit tight; we will get there in due time.
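As a quick recap of two of those techniques, here is a minimal sketch of normalizing real-valued data and reshaping a non-normal distribution. It assumes scikit-learn, which this kind of preprocessing typically uses; the variable names and the synthetic exponential data are illustrative, not the chapter's exact code:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler, QuantileTransformer

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=(1000, 1))  # skewed, non-normal sample data

# Standardization: rescale to zero mean and unit variance
x_std = StandardScaler().fit_transform(x)

# Quantile transformation: map a non-normal distribution
# to an approximately normal one
x_norm = QuantileTransformer(
    output_distribution='normal', n_quantiles=100
).fit_transform(x)

print(x_std.mean(), x_std.std())  # approximately 0 and 1
```

Standardization only shifts and rescales the data, so a skewed distribution stays skewed; the quantile transform actually changes the shape of the distribution, which is what you want when a model assumes normality.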

For now, we will continue our journey toward the next introductory topic about basic machine learning. Chapter 4, Learning from Data, introduces the most elementary concepts around the theory of deep learning, including measuring performance on regression and classification, as well as the identification of overfitting. However, before we go there, please try to quiz yourself with the following questions.
