
Hyperparameter tuning

ML and deep learning algorithms take hyperparameters as input before the model is trained. Each algorithm comes with its own set of hyperparameters, and some algorithms may have none at all.

Hyperparameter tuning is an important step in model building. Every ML algorithm ships with default hyperparameter values that are generally used to build an initial model, unless the practitioner manually overrides them. Choosing the right combination of hyperparameters and the right value for each one greatly improves model performance in most cases, so hyperparameter tuning is strongly recommended as part of ML model building. However, searching through the universe of possible hyperparameter values is a very time-consuming task.
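As a minimal sketch of this idea (scikit-learn is assumed here purely for illustration; the estimator and the values are not from the text), a model can be trained with its library defaults or with manually overridden hyperparameters:

```python
# Illustrative sketch: default vs. manually overridden hyperparameters.
# Assumes scikit-learn; the dataset and values are arbitrary examples.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Initial model built with the library's default hyperparameters
# (for example, n_estimators defaults to 100 in recent scikit-learn versions).
default_model = RandomForestClassifier(random_state=0).fit(X, y)

# The same algorithm with a few hyperparameters manually overridden.
tuned_model = RandomForestClassifier(
    n_estimators=300, max_depth=5, random_state=0
).fit(X, y)
```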

The k in k-means clustering and k-nearest neighbors classification, the number of trees and the depth of trees in random forest, and eta in XGBoost are all examples of hyperparameters.
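The following sketch shows where each of these hyperparameters is passed in code (scikit-learn and the xgboost Python package are assumed for illustration; the specific values are arbitrary):

```python
# Where the hyperparameters named above appear in common APIs.
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
import xgboost as xgb

kmeans = KMeans(n_clusters=3)               # k in k-means clustering
knn = KNeighborsClassifier(n_neighbors=5)   # k in k-nearest neighbors

# Number of trees and depth of trees in a random forest.
rf = RandomForestClassifier(n_estimators=200, max_depth=8)

# eta in XGBoost is exposed as learning_rate in the scikit-learn-style API.
booster = xgb.XGBClassifier(learning_rate=0.1)
```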

Grid search and Bayesian optimization are two hyperparameter tuning methods that are popular among practitioners.
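As an example of the first method, a grid search exhaustively evaluates every combination in a user-defined grid of hyperparameter values, typically with cross-validation. The sketch below uses scikit-learn's GridSearchCV; the estimator, grid, and scoring choice are illustrative assumptions rather than prescriptions from the text:

```python
# Minimal grid search sketch with cross-validation (scikit-learn assumed).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Candidate values for two hyperparameters; the grid has 2 x 3 = 6 combinations.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [3, 5, None],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid=param_grid,
    cv=5,                  # 5-fold cross-validation per combination
    scoring="accuracy",
)
search.fit(X, y)

print(search.best_params_)  # best combination found in the grid
print(search.best_score_)   # its mean cross-validated accuracy
```

Bayesian optimization, by contrast, builds a probabilistic model of the objective and proposes promising hyperparameter values sequentially; in Python it is typically done with a dedicated library such as scikit-optimize or Hyperopt rather than scikit-learn alone.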
