Data transformation

Data transformation techniques tame the dataset into a format that a machine learning algorithm expects as input, and they may even help the algorithm learn faster and achieve better performance. This step is also known as data munging or data wrangling. Standardization, for instance, assumes that the data follows a Gaussian distribution and transforms the values in such a way that the mean is 0 and the standard deviation is 1, as follows:

x' = (x - μ) / σ

Here, μ is the attribute's mean value and σ is its standard deviation.
Normalization, on the other hand, scales attribute values to a small, specified range, usually between 0 and 1:

x' = (x - min) / (max - min)

Here, min and max are the smallest and largest values of the attribute.
Many machine learning toolboxes automatically normalize and standardize the data for you.
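As a minimal sketch of the two transforms in pure Python (the function names are illustrative, not taken from any particular toolbox):

```python
import math

def standardize(values):
    """Rescale values to zero mean and unit standard deviation."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / std for v in values]

def normalize(values):
    """Rescale values linearly into the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

data = [2.0, 4.0, 6.0, 8.0]
print(standardize(data))  # zero mean, unit standard deviation
print(normalize(data))    # smallest value maps to 0, largest to 1
```

In practice, a library implementation would also guard against a zero standard deviation or a constant attribute (where max equals min), which this sketch omits.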

The last transformation technique is discretization, which divides the range of a continuous attribute into intervals. Why should we care? Some algorithms, such as decision trees and Naive Bayes, prefer discrete attributes. The most common ways to select the intervals are as follows:

  • Equal width: The range of the continuous variable is divided into k intervals of equal width
  • Equal frequency: Supposing there are N instances, each of the k intervals contains approximately N / k instances
  • Min entropy: This approach recursively splits the intervals at the cut point that minimizes class entropy, which measures disorder, and stops splitting when the entropy reduction no longer justifies the extra interval introduced by the split (Fayyad and Irani, 1993)

The first two methods require us to specify the number of intervals, while the last method sets the number of intervals automatically; however, it requires the class variable, which means it won't work for unsupervised machine learning tasks.
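The first two strategies can be sketched in a few lines of plain Python (a hypothetical illustration; the function names and bin-index convention are my own):

```python
def equal_width_bins(values, k):
    """Assign each value to one of k equal-width intervals (0 .. k-1)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / k
    # The maximum value would land in bin k, so clamp it into the last bin.
    return [min(int((v - lo) / width), k - 1) for v in values]

def equal_frequency_bins(values, k):
    """Assign bins so that each of the k bins holds roughly N / k instances."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    for rank, i in enumerate(order):
        bins[i] = rank * k // len(values)
    return bins

data = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
print(equal_width_bins(data, 2))      # split at the midpoint of the range
print(equal_frequency_bins(data, 2))  # three instances per bin
```

Note how the two strategies can disagree on skewed data: equal width may leave some bins nearly empty, while equal frequency always balances the counts at the cost of unevenly sized intervals.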
