
Hyperparameter tuning

The preceding experiments gave a sense of what the opportunities for fine-tuning a net are. However, what works for this example does not necessarily work for other examples. For a given net, there are indeed multiple parameters that can be optimized (such as the number of hidden neurons, the BATCH_SIZE, the number of epochs, and many more, depending on the complexity of the net itself).

Hyperparameter tuning is the process of finding the optimal combination of those parameters that minimizes the cost function. The key idea is that if we have n parameters, then we can imagine that they define a space with n dimensions, and the goal is to find the point in this space that corresponds to an optimal value of the cost function. One way to achieve this goal is to create a grid in this space and systematically check the value assumed by the cost function at each grid vertex. In other words, the parameters are divided into buckets, and different combinations of values are checked via a brute-force approach.
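The following is a minimal sketch of such a brute-force grid search in plain Python. The parameter names, the value ranges, and the train_and_evaluate() helper are illustrative assumptions, not code from the preceding examples; in practice, the helper body would build and train the net with the chosen hyperparameters and return the validation loss:

```python
import itertools

# Hypothetical search space: each parameter is divided into buckets.
param_grid = {
    'hidden_neurons': [64, 128, 256],
    'batch_size': [32, 64, 128],
    'epochs': [10, 20],
}

def train_and_evaluate(hidden_neurons, batch_size, epochs):
    # Placeholder for "build the net, train it, return the validation
    # loss". Substitute the actual training loop here; this synthetic
    # formula only keeps the sketch runnable end to end.
    return 1.0 / hidden_neurons + 0.001 * batch_size / epochs

best_params, best_loss = None, float('inf')
# Visit every vertex of the grid and keep the combination that
# yields the lowest cost.
for values in itertools.product(*param_grid.values()):
    params = dict(zip(param_grid.keys(), values))
    loss = train_and_evaluate(**params)
    if loss < best_loss:
        best_params, best_loss = params, loss

print('Best hyperparameters:', best_params, 'with loss:', best_loss)
```

Note that the number of vertices grows multiplicatively with each added parameter (here 3 x 3 x 2 = 18 training runs), which is why a brute-force grid search quickly becomes expensive as the net grows in complexity.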
