
  • Machine Learning in Java
  • AshishSingh Bhatia, Bostjan Kaluza

Data transformation

Data transformation techniques tame the dataset into a format that a machine learning algorithm expects as input, and may even help the algorithm learn faster and achieve better performance. This process is also known as data munging or data wrangling. Standardization, for instance, assumes that the data follows a Gaussian distribution and transforms the values in such a way that the mean is 0 and the standard deviation is 1, as follows:

x' = (x - μ) / σ
Normalization, on the other hand, scales the values of attributes to a small, specified range, usually between 0 and 1:

x' = (x - min) / (max - min)
Many machine learning toolboxes automatically normalize and standardize the data for you.
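To make the two formulas concrete, here is a minimal plain-Java sketch of both transformations (the class and method names are illustrative, not part of any toolbox the book uses):

```java
import java.util.Arrays;

public class ScalingDemo {

    // Standardization: subtract the mean, then divide by the standard deviation,
    // so the result has mean 0 and standard deviation 1.
    static double[] standardize(double[] x) {
        double mean = Arrays.stream(x).average().orElse(0);
        double var = Arrays.stream(x).map(v -> (v - mean) * (v - mean)).average().orElse(0);
        double std = Math.sqrt(var);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) out[i] = (x[i] - mean) / std;
        return out;
    }

    // Normalization: rescale linearly so the minimum maps to 0 and the maximum to 1.
    static double[] normalize(double[] x) {
        double min = Arrays.stream(x).min().orElse(0);
        double max = Arrays.stream(x).max().orElse(1);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) out[i] = (x[i] - min) / (max - min);
        return out;
    }

    public static void main(String[] args) {
        double[] values = {2, 4, 4, 4, 5, 5, 7, 9}; // mean 5, standard deviation 2
        System.out.println(Arrays.toString(standardize(values)));
        // [-1.5, -0.5, -0.5, -0.5, 0.0, 0.0, 1.0, 2.0]
        System.out.println(Arrays.toString(normalize(values)));
        // first value maps to 0.0, last to 1.0
    }
}
```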

The last transformation technique is discretization, which divides the range of a continuous attribute into intervals. Why should we care? Some algorithms, such as decision trees and Naive Bayes, prefer discrete attributes. The most common ways to select the intervals are as follows:

  • Equal width: The interval of continuous variables is divided into k equal width intervals
  • Equal frequency: Supposing there are N instances, each of the k intervals contains approximately N / k instances
  • Min entropy: This approach recursively splits the intervals as long as the reduction in entropy, which measures disorder, exceeds the entropy increase introduced by the interval split (Fayyad and Irani, 1993)

The first two methods require us to specify the number of intervals, while the last method sets the number of intervals automatically; however, it requires the class variable, which means it won't work for unsupervised machine learning tasks.
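The first two strategies can be sketched in a few lines of plain Java. This is an illustrative implementation, not code from any particular toolbox; the class and method names are invented for the example:

```java
import java.util.Arrays;

public class DiscretizeDemo {

    // Equal width: split [min, max] into k bins of identical width.
    static int[] equalWidth(double[] x, int k) {
        double min = Arrays.stream(x).min().orElse(0);
        double max = Arrays.stream(x).max().orElse(0);
        double width = (max - min) / k;
        int[] bins = new int[x.length];
        for (int i = 0; i < x.length; i++) {
            int b = (int) ((x[i] - min) / width);
            bins[i] = Math.min(b, k - 1); // the maximum value lands in the last bin
        }
        return bins;
    }

    // Equal frequency: sort the values and put roughly N / k instances in each bin.
    static int[] equalFrequency(double[] x, int k) {
        Integer[] order = new Integer[x.length];
        for (int i = 0; i < x.length; i++) order[i] = i;
        Arrays.sort(order, (a, b) -> Double.compare(x[a], x[b]));
        int[] bins = new int[x.length];
        for (int rank = 0; rank < x.length; rank++) {
            bins[order[rank]] = rank * k / x.length; // bin index grows with rank
        }
        return bins;
    }

    public static void main(String[] args) {
        double[] values = {1, 2, 3, 4, 10, 20, 30, 40};
        System.out.println(Arrays.toString(equalWidth(values, 2)));
        // [0, 0, 0, 0, 0, 0, 1, 1] -- width-based bins are skewed by the outliers
        System.out.println(Arrays.toString(equalFrequency(values, 2)));
        // [0, 0, 0, 0, 1, 1, 1, 1] -- four instances per bin
    }
}
```

Note how the two methods disagree on skewed data: equal width puts six of the eight values into the first bin, while equal frequency balances the bins by construction.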
