
Kernel

A non-separable dataset like the one used previously is always a tough thing to deal with; however, there are ways to handle it. One way is to project the vectors into a higher-dimensional space through a transformation. But can we really do that when we have millions of data points or vectors in play? It would take a lot of computation and a lot of time. That's where the kernel saves the day.

We have already seen the equation in which only the dot products of the training examples are responsible for making the model learn.
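As a reminder, that equation is presumably the standard SVM dual objective (restated here in the usual notation, with Lagrange multipliers αi and labels yi):

W(α) = Σi αi − (1/2) Σi Σj αi αj yi yj (xi · xj),  subject to αi ≥ 0 and Σi αi yi = 0

The training examples xi enter this objective only through the dot products xi · xj, which is exactly what makes the trick below possible. Let's try to do a small exercise here: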

Let's take two vectors here:

x1 = [4, 8]
x2 = [20, 30]

Now, let's build a transformation function that will map these 2D vectors into 3D.

The transformation function to be used is the following:

t(x1, x2) = (x1², √2·x1·x2, x2²)

import numpy as np

# transformation from a 2D vector to a 3D vector
def t(x):
    return [x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2]

Now let's use this function:

x1_3D = t(x1)
x2_3D = t(x2)

print(np.dot(x1_3D, x2_3D))  # the result is 102400 (up to floating-point rounding)
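To see where this number comes from, write out the transformed vectors and their dot product by hand:

t(x1) = (16, 32√2, 64)
t(x2) = (400, 600√2, 900)
t(x1) · t(x2) = 16·400 + (32√2)·(600√2) + 64·900 = 6400 + 38400 + 57600 = 102400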

But can't we get the same result without transforming the values at all? A kernel can help us do exactly that:

def kernel(a, b):
    return a[0]**2 * b[0]**2 + 2*a[0]*b[0]*a[1]*b[1] + a[1]**2 * b[1]**2
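Note that this expanded expression is simply (a·b)² written out term by term, so the same kernel can be written more compactly (a small sketch, assuming numpy is imported as np; kernel_compact is a name introduced here just for illustration):

# the squared dot product, equivalent to the expanded form above
def kernel_compact(a, b):
    return np.dot(a, b) ** 2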

It's time to use this kernel now:

kernel(x1, x2)  # the result is 102400
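That value is easy to verify by hand: kernel(x1, x2) expands to (4·20 + 8·30)² = (80 + 240)² = 320² = 102400.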

Isn't it quite thrilling to get exactly the same result as before, without performing the transformation at all? So, a kernel is a function that gives us the dot product in another (higher-dimensional) space without ever computing the transformation explicitly.
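As a final sanity check, we can confirm programmatically that the two routes agree (a minimal sketch reusing the t, kernel, x1, and x2 defined above):

# both sides evaluate to 102400 (up to floating-point rounding)
assert np.isclose(np.dot(t(x1), t(x2)), kernel(x1, x2))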
