
Reasoning in high-dimensional spaces

Working with high-dimensional feature spaces requires special mental precautions, since the intuition we have developed for three-dimensional space starts to fail. For example, let's look at one peculiar property of n-dimensional spaces, known as the n-ball volume problem. An n-ball is simply a ball in n-dimensional Euclidean space. If we plot the volume of such an n-ball with unit radius (y axis) as a function of the number of dimensions (x axis), we see the following graph:

Figure 3.9: Volume of n-ball in n-dimensional space
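
The curve in Figure 3.9 can be reproduced from the closed-form expression for the volume of a unit n-ball, V_n = π^(n/2) / Γ(n/2 + 1). The following is a minimal sketch (not taken from the book's code) that prints these volumes:

```python
# Volume of a unit ball in n-dimensional Euclidean space:
# V_n = pi^(n/2) / Gamma(n/2 + 1)
from math import pi, gamma

def unit_ball_volume(n):
    """Volume of a ball of radius 1 in n-dimensional space."""
    return pi ** (n / 2) / gamma(n / 2 + 1)

for n in range(1, 16):
    print(f"n = {n:2d}  volume = {unit_ball_volume(n):.4f}")

# The printed volumes rise until n = 5 (about 5.26) and then shrink toward zero.
```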

Note that the volume rises at first, reaches its peak at five dimensions, and then starts decreasing. What does this mean for our models? For KNN specifically, it means that beyond five features, the more features you have, the larger the radius of the sphere centered on the point you are trying to classify must be to cover its k nearest neighbors.
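
We can see this effect with a small experiment (an illustrative sketch with made-up parameters, not the book's code): with a fixed number of points scattered uniformly in the unit hypercube, the distance from a query point to its k-th nearest neighbor grows as we add dimensions.

```python
# How far do we have to reach to enclose the k nearest neighbors
# as the number of dimensions grows, with the sample size held fixed?
import math
import random

def kth_neighbor_distance(n_dims, n_points=1000, k=5, seed=42):
    """Distance from the hypercube center to its k-th nearest neighbor
    among n_points uniform samples in [0, 1]^n_dims."""
    rng = random.Random(seed)
    query = [0.5] * n_dims
    points = [[rng.random() for _ in range(n_dims)] for _ in range(n_points)]
    dists = sorted(math.dist(query, p) for p in points)
    return dists[k - 1]

for d in (2, 5, 10, 50, 100):
    print(f"dims = {d:3d}  radius to 5th neighbor = {kth_neighbor_distance(d):.3f}")
```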

The counter-intuitive phenomena that arise in high-dimensional spaces are colloquially known as the curse of dimensionality. This term covers a wide range of effects that cannot be observed in the three-dimensional space we are used to dealing with. Pedro Domingos, in his paper A Few Useful Things to Know about Machine Learning, provides some examples:

"In high dimensions, most of the mass of a multivariate Gaussian distribution is not near the mean, but in an increasingly distant shell around it; and most of the volume of a high-dimensional orange is in the skin, not the pulp. If a constant number of examples is distributed uniformly in a high-dimensional hypercube, beyond some dimensionality most examples are closer to a face of the hypercube than to their nearest neighbor. And if we approximate a hypersphere by inscribing it in a hypercube, in high dimensions almost all the volume of the hypercube is outside the hypersphere. This is bad news for machine learning, where shapes of one type are often approximated by shapes of another."

Speaking specifically of KNN, it treats all dimensions as equally important. This creates problems when some of the features are irrelevant, especially in high dimensions, because the noise introduced by these irrelevant features drowns out the signal contained in the useful ones. In our example, we sidestepped these multidimensional problems by taking into account only the magnitude of each three-dimensional vector in our motion signals.
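
The trick is simple: each accelerometer sample (x, y, z) is collapsed into a single scalar, its magnitude, so the distance computation in KNN works on one value per sample instead of three. A minimal sketch of this idea (with hypothetical variable names, not the book's actual pipeline):

```python
# Reduce each three-dimensional motion sample to its magnitude,
# an orientation-independent scalar feature.
import math

def magnitudes(samples):
    """Convert a list of (x, y, z) readings into a list of vector magnitudes."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]

motion_signal = [(0.1, -0.2, 9.8), (0.3, 0.1, 9.7), (1.2, -0.5, 9.1)]
print(magnitudes(motion_signal))  # one scalar per sample
```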
