- Healthcare Analytics Made Simple
- Vikas (Vik) Kumar
Corresponding machine learning algorithm – the Naive Bayes Classifier
In the preceding example, we showed you how to calculate a post-test probability given a pretest probability, a likelihood, and a test result. The machine learning algorithm known as the Naive Bayes Classifier does this for every feature sequentially for a given observation. In that example, the post-test probability was 14.3%. Suppose the patient now has a troponin drawn and it is elevated. The 14.3% becomes the new pretest probability, and a new post-test probability is calculated from the contingency table for troponin and MI, where the contingency tables are obtained from the training data. This continues until all the features are exhausted. Again, the key assumption is that each feature is independent of all the others. For the classifier, the category (outcome) with the highest post-test probability is assigned to the observation.
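The following is a minimal sketch of this sequential update in Python. The troponin sensitivity and specificity used here are hypothetical placeholders rather than figures from the chapter's contingency tables; only the starting 14.3% probability is carried over from the text.

```python
# A minimal sketch of the sequential Bayesian update described above.
# The contingency-table values below are hypothetical, not taken from the book.

def post_test_probability(pretest_prob, sensitivity, specificity, test_positive=True):
    """Apply Bayes' rule once: turn a pretest probability into a post-test probability."""
    if test_positive:
        p_test_given_disease = sensitivity            # true-positive rate
        p_test_given_no_disease = 1.0 - specificity   # false-positive rate
    else:
        p_test_given_disease = 1.0 - sensitivity      # false-negative rate
        p_test_given_no_disease = specificity         # true-negative rate
    numerator = p_test_given_disease * pretest_prob
    denominator = numerator + p_test_given_no_disease * (1.0 - pretest_prob)
    return numerator / denominator

# Start from the 14.3% post-test probability of MI from the preceding example.
prob = 0.143

# Hypothetical sensitivity/specificity for elevated troponin vs. MI,
# as would be derived from the training data's contingency table.
troponin_sensitivity = 0.90
troponin_specificity = 0.80

# The old post-test probability becomes the new pretest probability.
prob = post_test_probability(prob, troponin_sensitivity, troponin_specificity, test_positive=True)
print(f"Post-test probability of MI after elevated troponin: {prob:.1%}")
```

Running the sketch shows how a single additional positive finding can raise the probability substantially; the Naive Bayes Classifier simply repeats this update for each remaining feature and then assigns the outcome with the highest final probability.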
The Naive Bayes Classifier is popular for a select group of applications. Its advantages include high interpretability, robustness to missing data, and ease and speed of training and prediction. However, its independence assumption rarely holds in practice, so the model typically cannot compete with more state-of-the-art algorithms.
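To illustrate how little code training and prediction take, here is a hypothetical example using scikit-learn's BernoulliNB, one common Naive Bayes implementation for binary features; the tiny feature matrix of clinical findings is invented for demonstration and is not the book's dataset.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Hypothetical binary features per patient:
# columns are chest pain, elevated troponin, abnormal ECG (1 = present, 0 = absent).
X_train = np.array([
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 1],
    [0, 1, 0],
])
y_train = np.array([1, 1, 0, 1, 0, 0])  # 1 = MI, 0 = no MI

clf = BernoulliNB()
clf.fit(X_train, y_train)               # training is a single fast pass over the data

new_patient = np.array([[1, 1, 0]])     # chest pain and elevated troponin, normal ECG
print(clf.predict_proba(new_patient))   # posterior probability for each outcome
print(clf.predict(new_patient))         # the class with the highest posterior is assigned
```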