- Mastering Machine Learning with R
- Cory Lesmeister
Logistic Regression
In the previous chapter, we looked at using Ordinary Least Squares (OLS) to predict a quantitative outcome or, in other words, linear regression. It's now time to shift gears and examine how we can develop algorithms to predict qualitative outcomes. Such outcome variables could be binary (male versus female, purchases versus doesn't purchase, or a tumor being benign versus malignant) or multinomial categories (education level or eye color). Regardless of whether the outcome of interest is binary or multinomial, our task is to predict the probability of an observation belonging to a particular category of the outcome variable. In other words, we develop an algorithm to classify the observations.
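To make the idea of predicting a category's probability concrete, here is the standard logistic (sigmoid) formulation for a binary outcome with a single feature, shown for orientation only; the chapter develops the model and its interpretation in full:

$$
P(Y = 1 \mid X = x) = \frac{1}{1 + e^{-(\beta_0 + \beta_1 x)}}
$$

Because the right-hand side is always bounded between 0 and 1, the fitted values can be read directly as probabilities, which is exactly what a straight OLS fit cannot guarantee.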
To begin exploring classification problems, we'll discuss why applying OLS linear regression isn't the correct technique and how the algorithms introduced in this chapter can solve these issues. We'll then look at the problem of predicting whether or not a banking customer is satisfied. To tackle this problem, we'll begin by building and interpreting a logistic regression model. We'll also start examining a univariate method to select features. Next, we'll turn to multivariate adaptive regression splines (MARS) and discover ways to choose the best overall algorithm. This chapter will set the stage for more advanced machine learning methods in subsequent chapters; a short preview of the OLS-versus-logistic contrast follows the topic list below.
We'll be covering the following topics in this chapter:
- Classification methods and linear regression
- Logistic regression
- Model training and evaluation
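As a quick preview of the first two topics, the following minimal R sketch uses simulated data (not the banking customer data mentioned above, which is assumed to be introduced later in the chapter) to contrast an OLS fit with a logistic regression fit using base R's lm() and glm() functions; all variable names here are illustrative:

```r
# A minimal sketch on simulated data: contrast OLS and logistic regression
# for a binary outcome (names are illustrative, not from the chapter's data)
set.seed(123)
x <- rnorm(200)                               # a single simulated feature
p <- 1 / (1 + exp(-(0.5 + 2 * x)))            # true probability of class 1
y <- rbinom(200, size = 1, prob = p)          # simulated binary outcome

ols_fit   <- lm(y ~ x)                        # OLS applied to a 0/1 outcome
logit_fit <- glm(y ~ x, family = binomial)    # logistic regression

# OLS fitted values can fall outside [0, 1]; logistic fitted values cannot
range(fitted(ols_fit))
range(predict(logit_fit, type = "response"))

summary(logit_fit)        # coefficients are on the log-odds scale
exp(coef(logit_fit))      # exponentiate to interpret them as odds ratios
```

The key design point, explored in depth in this chapter, is the family = binomial argument: it tells glm() to model the log-odds of the outcome rather than the outcome itself, which keeps every prediction inside the unit interval.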