- Machine Learning with Swift
- Alexander Sosnovshchenko
Tuning hyperparameters
The simplest way to simplify a decision tree is to limit its depth. How deep is it now? You can count 20 splits, or 21 layers, in Figure 2.5. At the same time, we have only three features (six, if we count each category of the one-hot encoded color feature separately). Let's limit the maximum depth of the tree aggressively, so that it is comparable with the number of features. The tree_model object has a max_depth property, so we set it to a value less than the number of features:
In []: tree_model.max_depth = 4
After these manipulations, we can retrain our model and reevaluate its accuracy:
In []:
tree_model = tree_model.fit(X_train, y_train)
tree_model.score(X_train, y_train)
Out[]:
0.90571428571428569
Note that the accuracy on the training set is now about 6% lower. What about the test set?
In []:
tree_model.score(X_test, y_test)
Out[]:
0.92000000000000004
Accuracy on previously unseen data is now higher by about 4%. This may not look like a great achievement until you realize that it means an additional 40 correctly classified creatures out of our initial set of 1,000. In modern machine learning contests, the final difference between 1st and 100th place can easily be about 1%.
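The whole experiment above can be sketched end to end. The book's creature dataset is not reproduced here, so this minimal example uses a synthetic classification problem with six features to show the same effect: an unconstrained tree memorizes the training set, while a depth-limited tree trades a little training accuracy for better generalization.

```python
# A sketch of the depth-limiting experiment on synthetic stand-in data;
# the book's actual dataset and exact scores will differ.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=6, n_informative=4,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=42)

# Unconstrained tree: grows until every leaf is pure, memorizing the training set.
deep = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# Depth-limited tree: at most 4 splits on any root-to-leaf path.
shallow = DecisionTreeClassifier(max_depth=4, random_state=42).fit(X_train, y_train)

print("deep:   train", deep.score(X_train, y_train),
      "test", deep.score(X_test, y_test))
print("shallow: train", shallow.score(X_train, y_train),
      "test", shallow.score(X_test, y_test))
```

The training score of the unconstrained tree is a perfect 1.0, which is exactly the overfitting symptom the depth limit is meant to cure.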
Let's draw the tree structure after pruning. The code for this visualization is the same as before:
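The chapter's earlier visualization code is not reproduced in this excerpt; as a stand-in, here is a minimal sketch using scikit-learn's built-in plot_tree, fitted on the Iris dataset purely for illustration (the book's own plotting code and dataset differ).

```python
# Self-contained visualization sketch, assuming matplotlib is installed;
# Iris is a stand-in for the book's creature dataset.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, plot_tree

iris = load_iris()
tree_model = DecisionTreeClassifier(max_depth=4, random_state=42)
tree_model.fit(iris.data, iris.target)

fig, ax = plt.subplots(figsize=(14, 8))
plot_tree(tree_model, feature_names=iris.feature_names,
          class_names=list(iris.target_names), filled=True, ax=ax)
fig.savefig("pruned_tree.png")
```

Because max_depth=4 was set before fitting, the rendered tree is visibly shallower and easier to read than the fully grown one.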
