- Mastering Predictive Analytics with scikit-learn and TensorFlow
- Alan Fontaine
Bagging
Bagging, also known as bootstrap aggregation, is a general-purpose procedure for reducing the variance of a machine learning model. It is based on the bootstrap sampling technique and is generally used with regression or classification trees, but in principle it can be applied to any model.
The following steps are involved in the bagging process:
- We choose the number of estimators, or individual models, to use. Let's call this parameter B.
- We draw B sample datasets from the training set with replacement, using bootstrap sampling.
- We fit the machine learning model to each of these bootstrap samples. This way, we get B individual predictors.
- We get the ensemble prediction by aggregating all of the individual predictions.
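The steps above can be sketched with scikit-learn decision trees; the synthetic dataset and the choice of B = 100 estimators are illustrative assumptions, not values from the text.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

# Illustrative synthetic regression dataset
X, y = make_regression(n_samples=500, n_features=10, noise=10.0,
                       random_state=0)

B = 100  # number of individual estimators
rng = np.random.default_rng(0)
predictors = []

for _ in range(B):
    # Bootstrap sample: draw n rows from the training set with replacement
    idx = rng.integers(0, len(X), size=len(X))
    tree = DecisionTreeRegressor(random_state=0)
    tree.fit(X[idx], y[idx])
    predictors.append(tree)

# Ensemble prediction: aggregate the B individual predictions
# (for regression, by averaging them)
ensemble_pred = np.mean([tree.predict(X) for tree in predictors], axis=0)
```

In practice, scikit-learn packages this procedure in `sklearn.ensemble.BaggingRegressor` and `sklearn.ensemble.BaggingClassifier`, so the loop above rarely needs to be written by hand.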
In a regression problem, the most common way to get the ensemble prediction is to average all of the individual predictions.
In a classification problem, the most common way to aggregate the predictions is a majority vote. For example, say we have 100 individual predictors and 80 of them vote for one particular category. We then choose that category as our aggregated prediction. This is what a majority vote means.
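The majority-vote example can be made concrete with a few lines of Python; the class labels here are made up for illustration.

```python
from collections import Counter

# 100 individual predictions: 80 predictors vote "cat", 20 vote "dog"
votes = ["cat"] * 80 + ["dog"] * 20

# Majority vote: pick the most common predicted label
aggregated, count = Counter(votes).most_common(1)[0]
print(aggregated)  # cat
```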