- Mastering Machine Learning with R (Second Edition)
- Cory Lesmeister
Classification methods and linear regression
So, why can't we just use the least squares regression method that we learned in the previous chapter for a qualitative outcome? Well, as it turns out, you can, but at your own risk. Let's assume for a second that you have an outcome that you are trying to predict and it has three different classes: mild, moderate, and severe. You and your colleagues also assume that the difference between mild and moderate is equivalent to the difference between moderate and severe, and that the relationship is linear. You can create a dummy variable where 0 is equal to mild, 1 is equal to moderate, and 2 is equal to severe. If you have reason to believe this, then linear regression might be an acceptable solution. However, qualitative assessments such as these can introduce a high level of measurement error that biases the OLS estimates, and in most business problems there is no scientifically acceptable way to convert a qualitative response into a quantitative one. What if you have a response with two outcomes, say fail and pass? Again, using the dummy variable approach, we could code the fail outcome as 0 and the pass outcome as 1. Using linear regression, we could build a model where the predicted value is the probability that an observation is a pass or a fail. However, the estimates of Y from that model will most likely exceed the probability constraints of [0,1] and thus be difficult to interpret.
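The last point is easy to see in a few lines of R. The following is a minimal sketch on simulated data (the hours predictor and pass outcome are hypothetical, not from the book's data sets): fitting OLS to a 0/1 coded response typically yields fitted values that dip below 0 and climb above 1.

```r
# Hypothetical example: OLS on a binary (0/1) outcome
set.seed(123)
hours <- runif(100, 0, 10)                        # simulated predictor
pass  <- rbinom(100, 1, plogis(-4 + 0.9 * hours)) # 1 = pass, 0 = fail
ols   <- lm(pass ~ hours)                         # linear probability model via OLS
range(fitted(ols))                                # often below 0 and/or above 1
```

Because the fitted values are not constrained to [0,1], they cannot be read directly as probabilities, which motivates the classification methods covered in this chapter.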