- Hands-On Machine Learning with Microsoft Excel 2019
- Julio Cesar Rodriguez Martino
Building the confusion matrix
Let's now think about a binary classification problem. We have a set of samples belonging to one of two classes: YES or NO. We can build a machine learning model that outputs a predicted class for each input sample. Testing our model on 200 samples gives the following results:

There are four elements to the confusion matrix:
- True positives (TP): The number of times that the model predicts YES and the actual value is YES. In our example, this is 100 times.
- True negatives (TN): The number of times that the model predicts NO and the actual value is NO. In our example, this is 60 times.
- False positives (FP): The number of times that the model predicts YES and the actual value is NO. In our example, this is 15 times.
- False negatives (FN): The number of times that the model predicts NO and the actual value is YES. In this example, this is 25 times.
These four counts are arranged into the confusion matrix, with actual classes as rows and predicted classes as columns:

|            | Predicted YES | Predicted NO |
|------------|---------------|--------------|
| Actual YES | TP = 100      | FN = 25      |
| Actual NO  | FP = 15       | TN = 60      |
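The counting described above can be sketched in a few lines of code. This is a minimal illustration, not part of the book's Excel workflow; the function name and the tiny sample lists are invented for the example:

```python
def confusion_matrix(actual, predicted, positive="YES"):
    """Count TP, TN, FP, and FN for a binary classifier."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == positive and p == positive)
    tn = sum(1 for a, p in zip(actual, predicted) if a != positive and p != positive)
    fp = sum(1 for a, p in zip(actual, predicted) if a != positive and p == positive)
    fn = sum(1 for a, p in zip(actual, predicted) if a == positive and p != positive)
    return tp, tn, fp, fn

# Tiny illustrative sample (not the chapter's 200-sample dataset)
actual    = ["YES", "YES", "NO", "NO", "YES"]
predicted = ["YES", "NO", "NO", "YES", "YES"]
print(confusion_matrix(actual, predicted))  # (2, 1, 1, 1)
```

Each prediction falls into exactly one of the four cells, so the four counts always sum to the total number of test samples (100 + 60 + 15 + 25 = 200 in the book's example).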