Reasoning in high-dimensional spaces

Working with high-dimensional feature spaces requires special mental precautions, since intuition honed on three-dimensional space starts to fail. For example, let's look at one peculiar property of n-dimensional spaces, known as the n-ball volume problem. An n-ball is simply a ball in n-dimensional Euclidean space. If we plot the volume of the unit n-ball (y axis) as a function of the number of dimensions (x axis), we'll see the following graph:

Figure 3.9: Volume of n-ball in n-dimensional space

Note that the volume rises at first, reaches its peak at five dimensions, and then starts decreasing. What does this mean for our models? For KNN specifically, it means that beyond five features, the more features you have, the larger the radius of the sphere centered on the point you're trying to classify must be to cover its k nearest neighbors.
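The curve in the figure can be reproduced from the closed-form volume of an n-ball, V_n(r) = π^(n/2) / Γ(n/2 + 1) · r^n. A minimal sketch in Python (the function name is illustrative, not from the book's code):

```python
import math

def n_ball_volume(n, r=1.0):
    """Volume of an n-ball of radius r: V_n = pi^(n/2) / Gamma(n/2 + 1) * r^n."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n

# Volume of the unit n-ball for dimensions 1..20.
volumes = {n: n_ball_volume(n) for n in range(1, 21)}
# The dimension at which the unit-ball volume peaks.
peak = max(volumes, key=volumes.get)
```

Evaluating this confirms the shape of the graph: the unit-ball volume grows up to n = 5 (about 5.26) and then shrinks toward zero as n increases.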

The counter-intuitive phenomena that arise in high-dimensional spaces are colloquially known as the curse of dimensionality. The term covers a wide range of effects that can't be observed in the three-dimensional space we are used to dealing with. Pedro Domingos, in his paper A Few Useful Things to Know about Machine Learning, provides some examples:

"In high dimensions, most of the mass of a multivariate Gaussian distribution is not near the mean, but in an increasingly distant shell around it; and most of the volume of a high-dimensional orange is in the skin, not the pulp. If a constant number of examples is distributed uniformly in a high-dimensional hypercube, beyond some dimensionality most examples are closer to a face of the hypercube than to their nearest neighbor. And if we approximate a hypersphere by inscribing it in a hypercube, in high dimensions almost all the volume of the hypercube is outside the hypersphere. This is bad news for machine learning, where shapes of one type are often approximated by shapes of another."
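The hypersphere-in-hypercube claim from the quote is easy to check numerically: the inscribed n-ball of the unit hypercube has radius 0.5, so its share of the cube's volume is V_n(0.5), which collapses rapidly with n. A small sketch (function name is illustrative):

```python
import math

def inscribed_sphere_fraction(n):
    """Fraction of the unit hypercube [0, 1]^n occupied by its inscribed
    n-ball of radius 0.5: pi^(n/2) / Gamma(n/2 + 1) * 0.5^n."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1) * 0.5 ** n

# In 2-D the inscribed disk covers pi/4 (about 78.5%) of the square;
# by 10-D the inscribed ball covers less than 0.3% of the cube,
# so almost all of the hypercube's volume lies outside the hypersphere.
```

This is exactly Domingos's point: approximating a hypercube by its inscribed hypersphere, which is harmless in two or three dimensions, misses nearly all of the volume in high dimensions.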

Speaking specifically of KNN, it treats all dimensions as equally important. This creates problems when some of the features are irrelevant, especially in high dimensions, because the noise introduced by the irrelevant features drowns out the signal contained in the good ones. In our example, we sidestepped these multidimensional problems by taking into account only the magnitude of each three-dimensional vector in our motion signals.
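The magnitude trick mentioned above can be sketched as follows; this is an illustrative reduction, not the book's exact code, assuming motion samples arrive as (x, y, z) triples:

```python
import math

def magnitudes(samples):
    """Collapse each 3-D motion vector (x, y, z) to its Euclidean norm,
    replacing three correlated features with one rotation-invariant feature."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]

# Two accelerometer readings: gravity along z, then a 3-4-0 burst.
accel = [(0.0, 0.0, 1.0), (3.0, 4.0, 0.0)]
mags = magnitudes(accel)  # [1.0, 5.0]
```

Besides shrinking the feature space, the norm is invariant to how the device is oriented, which is often exactly the irrelevant variation we want KNN to ignore.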
