
Cross-entropy

Cross-entropy is the loss function used during training for classification tasks. At a high level, cross-entropy measures how much the predicted probabilities (for example, the outputs of a softmax layer) differ from the true classes. The following is the expression for cross-entropy for binary classification, with the predicted probability denoted by $\hat{y}$ and the true label by $y$:

$$L(y, \hat{y}) = -\left[\, y \log \hat{y} + (1 - y) \log(1 - \hat{y}) \,\right]$$

As we can see from the preceding expression, the cross-entropy loss grows large, penalizing the model, when the predicted probability is close to 1 while the true label is 0, and vice versa. The same expression extends to $K$ classes as $L(y, \hat{y}) = -\sum_{k=1}^{K} y_k \log \hat{y}_k$, where $y$ is the one-hot encoding of the true class and $\hat{y}$ is the vector of softmax probabilities.
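To make this concrete, the following is a minimal NumPy sketch of both the binary and the K-class versions of the loss; the function names and the eps clipping constant are illustrative choices of ours, not taken from any particular library:

import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Binary cross-entropy averaged over samples; y_pred holds
    # predicted probabilities in (0, 1). eps guards against log(0).
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred)
                    + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # K-class cross-entropy with one-hot y_true and softmax
    # probabilities y_pred, both of shape (n_samples, K).
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# A confident but wrong prediction is penalized far more heavily
# than a confident correct one:
print(binary_cross_entropy(np.array([0.0]), np.array([0.95])))  # ~3.00
print(binary_cross_entropy(np.array([1.0]), np.array([0.95])))  # ~0.05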
