
Classification loss function

The loss function is an objective function to minimize during training to get the best model. Many different loss functions exist.

In a classification problem, where the target is to predict the correct class among k classes, cross-entropy is commonly used as it measures the difference between the real probability distribution, q, and the predicted one, p, for each class:

$$ H(q, p) = -\frac{1}{n} \sum_{i=1}^{n} \sum_{c=1}^{k} q_{i,c} \, \log p_{i,c} $$

Here, i is the index of the sample in the dataset, n is the number of samples in the dataset, c is the class index, and k is the number of classes.
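As a minimal sketch in plain NumPy (not tied to any particular framework), the cross-entropy between a one-hot real distribution q and a predicted distribution p can be computed directly from this definition:

```python
import numpy as np

def cross_entropy(q, p):
    """Cross-entropy between q and p, both of shape (n, k).

    q holds the real distribution (one row per sample, one-hot here),
    p the predicted probabilities (each row sums to 1).
    """
    eps = 1e-12  # guard against log(0)
    # Sum over the k classes, then average over the n samples.
    return -np.mean(np.sum(q * np.log(p + eps), axis=1))

# Two samples, three classes; q is one-hot on the correct class.
q = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
p = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])
print(cross_entropy(q, p))  # ≈ 0.2899
```

Because q is one-hot, only the predicted probability of each sample's correct class contributes to the sum, which is why the empirical form below reduces to a single log term per sample.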

While the real probability $q_{i,c}$ of each class is unknown, it can simply be approximated in practice by the empirical distribution observed on the dataset, which places all the probability mass of each sample on its true class. In the same way, the cross-entropy of any predicted probability, p, can be approximated by the empirical cross-entropy:

$$ \hat{H}(p) = -\frac{1}{n} \sum_{i=1}^{n} \log p_{i, y_i} $$

Here, $p_{i, y_i}$ is the probability estimated by the model for the correct class $y_i$ of example $i$.
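A sketch of the empirical cross-entropy, again in plain NumPy, assuming the labels are given as integer class indices y rather than one-hot vectors:

```python
import numpy as np

def empirical_cross_entropy(p, y):
    """Empirical cross-entropy.

    p: (n, k) predicted probabilities; y: (n,) integer class labels.
    Only the predicted probability of each example's correct class
    contributes to the loss.
    """
    eps = 1e-12  # guard against log(0)
    n = p.shape[0]
    return -np.mean(np.log(p[np.arange(n), y] + eps))

p = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])
y = np.array([0, 1])  # correct class of each example
print(empirical_cross_entropy(p, y))  # ≈ 0.2899
```

This is the quantity deep-learning frameworks typically minimize for classification, often computed from raw scores with a fused log-softmax for numerical stability.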

Accuracy and cross-entropy both evolve in the same direction, but they measure different things. Accuracy measures how often the predicted class is correct, while cross-entropy measures the distance between the probability distributions. A decrease in cross-entropy means that the probability of predicting the correct class improves, yet the accuracy may remain constant or even drop.
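A small illustration of this decoupling, using hypothetical numbers: two sets of predictions with identical accuracy but different cross-entropy.

```python
import numpy as np

y = np.array([0, 1])  # true classes of two samples

# Predictions before and after some training; in both cases the
# argmax already picks the correct class for every sample.
p_before = np.array([[0.60, 0.30, 0.10],
                     [0.40, 0.50, 0.10]])
p_after  = np.array([[0.90, 0.05, 0.05],
                     [0.45, 0.50, 0.05]])

def accuracy(p, y):
    """Fraction of samples whose most probable class is the true one."""
    return np.mean(np.argmax(p, axis=1) == y)

def loss(p, y):
    """Empirical cross-entropy over the correct-class probabilities."""
    return -np.mean(np.log(p[np.arange(len(y)), y]))

print(accuracy(p_before, y), accuracy(p_after, y))  # 1.0 1.0
print(loss(p_before, y) > loss(p_after, y))         # True: loss decreased
```

Both prediction sets are 100% accurate, yet the second assigns more probability to the correct classes, so its cross-entropy is lower; the converse situation, where the loss improves while a borderline argmax flips and accuracy drops, is equally possible.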

While accuracy is discrete and not differentiable, the cross-entropy loss is a differentiable function that can be easily used for training a model.
