
Kernel

A non-separable dataset like the one used previously is a tough thing to deal with, but there are ways to handle it. One way is to map the vectors into a higher-dimensional space through a transformation. But can we really do that when there are millions of data points (vectors) to reckon with? It would take a lot of computation, and a lot of time. That's where the kernel saves the day.

We have seen the following equation, in which only the dot products of the training examples are responsible for making the model learn:
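In standard notation, that dual objective of the SVM looks roughly like this (a sketch in the usual symbols; the exact form we saw earlier may be written slightly differently):

maximize over α:   Σᵢ αᵢ − ½ ΣᵢΣⱼ αᵢ αⱼ yᵢ yⱼ (xᵢ · xⱼ)
subject to:        αᵢ ≥ 0   and   Σᵢ αᵢ yᵢ = 0

The training vectors xᵢ and xⱼ enter only through the dot product xᵢ · xⱼ, and that dot product is exactly what a kernel will replace. Let's try a small exercise here: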

Let's take two vectors here:

x1 = [4, 8]
x2 = [20, 30]

Now, let's build a transformation function that maps these 2D vectors into 3D.

The transformation function, for a vector x with components x₁ and x₂, is the following:

t(x₁, x₂) = (x₁², √2·x₁·x₂, x₂²)

# transformation from a 2-D vector to a 3-D vector
import numpy as np

def t(x):
    return [x[0]**2, np.sqrt(2)*x[0]*x[1], x[1]**2]

Now let's use this function:

x1_3D = t(x1)
x2_3D = t(x2)

print(np.dot(x1_3D, x2_3D))  # the result is 102400 (up to floating-point rounding)

But can't we get this result without transforming the vectors at all? A kernel can help us do exactly that:

def kernel(a, b):
    # this expands to (a[0]*b[0] + a[1]*b[1])**2, i.e. the squared dot product of a and b
    return a[0]**2 * b[0]**2 + 2*a[0]*b[0]*a[1]*b[1] + a[1]**2 * b[1]**2

It's time to use this kernel now:

kernel(x1, x2)  # the result is 102400
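It's easy to see why the two results agree. For two vectors a = (a₁, a₂) and b = (b₁, b₂), the kernel expands as:

kernel(a, b) = a₁²b₁² + 2·a₁b₁·a₂b₂ + a₂²b₂² = (a₁b₁ + a₂b₂)² = (a · b)²

and the 3D dot product gives exactly the same terms: t(a) · t(b) = a₁²b₁² + (√2·a₁a₂)(√2·b₁b₂) + a₂²b₂². For our vectors, a · b = 4·20 + 8·30 = 320, and 320² = 102400.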

Isn't it quite thrilling to get the same result as before, without carrying out the transformation at all? So a kernel is a function that gives us the dot product of the vectors in another (higher-dimensional) space, without ever computing their coordinates in that space.
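To see how this plugs into an actual SVM, here is a minimal sketch (assuming scikit-learn is available; the toy data X and y are made up purely for illustration) that trains an SVC with the built-in degree-2 polynomial kernel, which computes the same (x · y)² as our hand-written function, and then again with our own kernel passed in as a callable:

# a minimal sketch: the degree-2 polynomial kernel with scikit-learn's SVC
# (X and y are made-up toy data, purely for illustration)
import numpy as np
from sklearn.svm import SVC

X = np.array([[4, 8], [20, 30], [1, 1], [2, 1]])
y = np.array([0, 1, 0, 1])

# built-in polynomial kernel: (gamma * x.y + coef0)**degree, which is (x.y)**2 here
clf_poly = SVC(kernel='poly', degree=2, gamma=1.0, coef0=0.0)
clf_poly.fit(X, y)

# the same idea with our own kernel, written as a callable that returns
# the Gram matrix between two sets of samples
def squared_dot_kernel(A, B):
    return np.dot(A, B.T) ** 2

clf_custom = SVC(kernel=squared_dot_kernel)
clf_custom.fit(X, y)

print(clf_poly.predict([[10, 10]]), clf_custom.predict([[10, 10]]))

Either way, the SVM only ever needs kernel values, never the explicit 3D coordinates.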
