- Machine Learning Quick Reference
- Rahul Kumar
Kernel trick
We have already seen that SVM works smoothly with linearly separable data. Have a look at the following figure: it shows vectors that are not linearly separable, and the noticeable part is that they cannot be separated in 2D space:

With a few adjustments, we can still make use of SVM here.
Transforming a two-dimensional vector into a 3D vector, or into a vector of any higher dimension, can set things right for us. The next step would be to train the SVM on the higher-dimensional vectors. But the question arises: how high a dimension should we transform the vector into? That is, should the transformation produce a 2D vector, or a 3D, 4D, or higher one? It actually depends on which transformation brings separability into the dataset.
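The idea above can be sketched in a few lines, assuming scikit-learn (the book does not name a library here, and the dataset is illustrative): concentric circles are not linearly separable in 2D, but lifting each point (x1, x2) to (x1, x2, x1² + x2²) makes them separable by a plane in 3D. The kernel trick performs such a mapping implicitly, without ever building the higher-dimensional vectors.

```python
# A minimal sketch, assuming scikit-learn and a synthetic circles dataset.
import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric circles: no straight line in 2D separates the classes.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# Linear SVM on the raw 2D data: poor fit, since the data is not separable.
acc_2d = SVC(kernel='linear').fit(X, y).score(X, y)

# Explicit 2D -> 3D lift: the third coordinate is the squared radius,
# which separates the inner circle from the outer one with a flat plane.
X3 = np.c_[X, (X ** 2).sum(axis=1)]
acc_3d = SVC(kernel='linear').fit(X3, y).score(X3, y)

# RBF kernel on the raw 2D data: the same effect, achieved implicitly.
acc_rbf = SVC(kernel='rbf').fit(X, y).score(X, y)

print(acc_2d, acc_3d, acc_rbf)
```

The explicit lift works here because we happened to know the right transformation (the squared radius); in general we do not, which is exactly why kernels that search over such mappings implicitly are attractive.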