- Machine Learning Quick Reference
- Rahul Kumar
Back to the kernel trick
So now we have a fair understanding of the kernel and its importance. As discussed in the last section, the kernel function is:
$$K(x_i, x_j) = x_i \cdot x_j$$
So the margin problem (the dual optimization problem) becomes the following:

$$\max_{\alpha} \; \sum_{i=1}^{m} \alpha_i - \frac{1}{2} \sum_{i=1}^{m} \sum_{j=1}^{m} \alpha_i \alpha_j y_i y_j K(x_i, x_j)$$
This is subject to $0 \le \alpha_i \le C$ for any $i = 1, \ldots, m$, and:

$$\sum_{i=1}^{m} \alpha_i y_i = 0$$
Applying the kernel trick simply means replacing the dot product of two examples with a kernel function.
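To make this concrete, here is a minimal sketch (assuming NumPy only; the `degree`, `c`, and `gamma` values are illustrative choices, not taken from the text) of kernel functions that can take the place of the plain dot product between two examples:

```python
import numpy as np

def linear_kernel(xi, xj):
    # K(xi, xj) = xi . xj  -- the plain dot product
    return np.dot(xi, xj)

def polynomial_kernel(xi, xj, degree=3, c=1.0):
    # K(xi, xj) = (xi . xj + c)^degree
    return (np.dot(xi, xj) + c) ** degree

def rbf_kernel(xi, xj, gamma=0.5):
    # K(xi, xj) = exp(-gamma * ||xi - xj||^2)
    return np.exp(-gamma * np.sum((xi - xj) ** 2))

xi = np.array([1.0, 2.0])
xj = np.array([0.5, -1.0])
print(linear_kernel(xi, xj), polynomial_kernel(xi, xj), rbf_kernel(xi, xj))
```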
The hypothesis function changes as well:

$$f(x) = \operatorname{sign}\!\left(\sum_{i \in S} \alpha_i y_i K(x_i, x) + b\right)$$
This function decides which category a new example belongs to. Since $S$ denotes the set of support vectors, the kernel function only has to be computed on the support vectors.
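The following is a minimal end-to-end sketch, assuming scikit-learn and an RBF kernel (the dataset, `C`, and `gamma` values are only illustrative): it fits an `SVC` and then reproduces its decision values by summing the kernel over the support vectors alone, matching the hypothesis above.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

gamma = 0.5  # fixed explicitly so the kernel can be evaluated by hand
clf = SVC(kernel='rbf', C=1.0, gamma=gamma).fit(X, y)

def rbf_kernel(a, b, gamma):
    # K(a_i, b_j) = exp(-gamma * ||a_i - b_j||^2) for every pair of rows
    sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Decision value uses only the support vectors S:
#   f(x) = sum_{i in S} alpha_i * y_i * K(x_i, x) + b
K = rbf_kernel(clf.support_vectors_, X, gamma)      # |S| x n kernel matrix
manual = clf.dual_coef_ @ K + clf.intercept_        # dual_coef_ stores alpha_i * y_i
print(np.allclose(manual.ravel(), clf.decision_function(X)))  # expected: True
```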