
Hyperparameter tuning

The preceding experiments gave a sense of the opportunities for fine-tuning a net. However, what works for this example does not necessarily work for other examples. For a given net, there are indeed multiple parameters that can be optimized (such as the number of hidden neurons, the BATCH_SIZE, the number of epochs, and many more, depending on the complexity of the net itself).

Hyperparameter tuning is the process of finding the combination of those parameters that minimizes the cost function. The key idea is that if we have n parameters, then they define a space with n dimensions, and the goal is to find the point in this space that corresponds to the optimal value of the cost function. One way to achieve this goal is to create a grid in this space and systematically check the value assumed by the cost function at each grid vertex. In other words, the parameters are divided into buckets, and different combinations of values are checked via a brute-force approach, as sketched below.
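The following is a minimal brute-force grid search sketch over three of the hyperparameters mentioned above. It assumes MNIST as the dataset and the tf.keras API; the grid values, the two-layer model, and the validation split are illustrative assumptions, not the exact setup used in the preceding experiments.

```python
import itertools
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.utils import to_categorical

# Load and flatten MNIST; reserve the last 10,000 training
# examples as a validation set for scoring grid points.
(X_train, y_train), _ = mnist.load_data()
X_train = X_train.reshape(-1, 784).astype("float32") / 255.0
y_train = to_categorical(y_train)
X_tr, X_val = X_train[:-10000], X_train[-10000:]
y_tr, y_val = y_train[:-10000], y_train[-10000:]

# Each hyperparameter defines one axis of the search space;
# the grid is the Cartesian product of these buckets.
# These particular values are illustrative assumptions.
grid = {
    "hidden_neurons": [64, 128],
    "batch_size": [64, 128],
    "epochs": [5, 10],
}

best_score, best_params = 0.0, None
for hidden, batch, epochs in itertools.product(*grid.values()):
    # Build and train a small net for this grid vertex.
    model = Sequential([
        Input(shape=(784,)),
        Dense(hidden, activation="relu"),
        Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X_tr, y_tr, batch_size=batch, epochs=epochs, verbose=0)

    # Score the vertex on the held-out validation set.
    _, acc = model.evaluate(X_val, y_val, verbose=0)
    if acc > best_score:
        best_score, best_params = acc, (hidden, batch, epochs)

print("Best validation accuracy: %.4f with (hidden, batch, epochs) = %s"
      % (best_score, best_params))
```

Note that the grid points are scored on a validation set carved out of the training data rather than on the test set, so that the test set remains untouched for a final, unbiased evaluation of whichever combination wins.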
