
Cross-entropy

Cross-entropy is the loss function used during training for classification tasks. At a high level, cross-entropy measures how much the softmax probabilities, or predictions, differ from the true classes. For binary classification, with the prediction represented by the probability $\hat{y}$ and the true label by $y$, the cross-entropy is given by:

$L = -\left[\, y \log \hat{y} + (1 - y) \log (1 - \hat{y}) \,\right]$
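To make the behavior of this expression concrete, here is a minimal NumPy sketch, not taken from the original text; the function name binary_cross_entropy and the eps clipping constant are assumptions added for numerical stability:

import numpy as np

def binary_cross_entropy(y_true, y_prob, eps=1e-12):
    # Clip probabilities away from 0 and 1 so log() never receives 0
    y_prob = np.clip(y_prob, eps, 1 - eps)
    # -[y*log(p) + (1-y)*log(1-p)], averaged over the samples
    return -np.mean(y_true * np.log(y_prob)
                    + (1 - y_true) * np.log(1 - y_prob))

y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_prob = np.array([0.9, 0.1, 0.8, 0.3])
print(binary_cross_entropy(y_true, y_prob))  # small loss; predictions agree with labels

A single confident but wrong prediction drives the loss up sharply: binary_cross_entropy(np.array([0.0]), np.array([0.99])) evaluates to $-\log(0.01) \approx 4.6$, which is the penalization behavior discussed below.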

As we can see from the preceding expression, the cross-entropy grows, penalizing the model, when the predicted probability is close to 1 while the true output is 0, and vice versa. The same expression extends to $K$ classes as $L = -\sum_{k=1}^{K} y_k \log \hat{y}_k$, where $y_k$ is the one-hot encoded true label and $\hat{y}_k$ is the predicted probability for class $k$; a sketch of this multi-class form follows.
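The sketch below of the $K$-class form is likewise an illustration rather than the book's own code; the name categorical_cross_entropy and the one-hot label layout are assumptions:

import numpy as np

def categorical_cross_entropy(y_true, y_prob, eps=1e-12):
    # y_true: one-hot labels of shape (n_samples, K)
    # y_prob: softmax outputs of shape (n_samples, K)
    y_prob = np.clip(y_prob, eps, 1.0)
    # -sum_k y_k * log(p_k), averaged over the samples
    return -np.mean(np.sum(y_true * np.log(y_prob), axis=1))

y_true = np.array([[1, 0, 0], [0, 0, 1]])
y_prob = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
print(categorical_cross_entropy(y_true, y_prob))  # (-log 0.7 - log 0.6) / 2, about 0.43

Because the labels are one-hot, only the log-probability assigned to the correct class contributes to each sample's loss, which is why the result above depends only on the 0.7 and 0.6 entries.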
