- Hands-On Natural Language Processing with Python
- Rajesh Arumugam, Rajalingappaa Shanmugamani
Cross-entropy
Cross-entropy is the loss function used during training for classification tasks. At a high level, cross-entropy measures how much the softmax probabilities, or predictions, differ from the true classes. The following is the expression for cross-entropy for binary classification, with the predicted probability denoted by $\hat{y}$ and the true value by $y$:

$$L = -\left(y \log(\hat{y}) + (1 - y)\log(1 - \hat{y})\right)$$
As we can see from the preceding expression, the cross-entropy increases, penalizing the model, when the predicted probability is close to 1 while the true output is 0, and vice versa. The same expression extends to K classes by summing $-y_k \log(\hat{y}_k)$ over the classes, with the true labels one-hot encoded.
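The following is a minimal NumPy sketch, not taken from the book, that illustrates both the binary and the K-class forms of the loss. The helper names `binary_cross_entropy` and `categorical_cross_entropy` are hypothetical and chosen only for illustration:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy, averaged over examples."""
    # Clip predictions to avoid log(0).
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))

def categorical_cross_entropy(y_true_onehot, probs, eps=1e-12):
    """K-class cross-entropy: -sum_k y_k * log(p_k), averaged over examples."""
    probs = np.clip(probs, eps, 1.0)
    return -np.mean(np.sum(y_true_onehot * np.log(probs), axis=1))

# A confident, correct prediction gives a small loss;
# a confident, wrong prediction is heavily penalized.
print(binary_cross_entropy(np.array([1.0]), np.array([0.9])))  # ~0.105
print(binary_cross_entropy(np.array([0.0]), np.array([0.9])))  # ~2.303
```

The clipping step is only a numerical safeguard; frameworks such as TensorFlow and PyTorch apply an equivalent stabilization inside their built-in cross-entropy losses.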