- Machine Learning for OpenCV
- Michael Beyeler
Understanding logistic regression
Despite its name, logistic regression can actually be used as a model for classification. It uses a logistic function (or sigmoid) to convert any real-valued input x into a predicted output value ŷ that takes values between 0 and 1, as shown in the following figure:

[Figure: the logistic (sigmoid) function, an S-shaped curve that maps any real-valued input x to an output between 0 and 1]
Rounding ŷ to the nearest integer effectively classifies the input as belonging either to class 0 or 1.
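This squash-then-round step can be sketched in a few lines of Python (NumPy assumed; the input value 2.5 is just an illustration):

```python
import numpy as np

def sigmoid(x):
    """Logistic (sigmoid) function: squashes any real-valued x into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

y_hat = sigmoid(2.5)       # about 0.924
label = int(round(y_hat))  # rounding to the nearest integer gives class 1
```

A large positive input lands close to 1, a large negative input close to 0, and an input of exactly 0 yields 0.5, right on the decision boundary.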
Of course, most often, our problems have more than one input or feature value, x. For example, the Iris dataset provides a total of four features. For the sake of simplicity, let's focus here on the first two features: sepal length, which we will call feature f1, and sepal width, which we will call f2. Using the tricks we learned when talking about linear regression, we know we can express the input x as a linear combination of the two features, f1 and f2:
x = w1 f1 + w2 f2

Here, w1 and w2 are weight coefficients that the model learns from the training data.
However, in contrast to linear regression, we are not done yet. From the previous section we know that the sum of products would result in a real-valued output, but we are interested in a categorical value, zero or one. This is where the logistic function comes in: it acts as a squashing function, σ, that compresses the range of possible output values to the range [0, 1]:
ŷ = σ(x) = 1 / (1 + exp(-x))
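Putting the two pieces together: with some hypothetical weights w1 and w2 (a trained model would learn these from data; the values below are made up for illustration), the two sepal measurements are combined linearly and then squashed by the sigmoid:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical weights -- a trained model would learn these from data:
w1, w2 = 0.8, -1.2
# One example flower: sepal length (f1) and sepal width (f2), in cm:
f1, f2 = 5.1, 3.5

x = w1 * f1 + w2 * f2      # linear combination of the two features
y_hat = sigmoid(x)         # squashed into the range (0, 1)
label = int(round(y_hat))  # final class label: 0 or 1
```

With these particular weights, x = 0.8 * 5.1 - 1.2 * 3.5 = -0.12, so ŷ lands just below 0.5 and the sample is assigned to class 0.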
Now let's apply this knowledge to the Iris dataset!
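As a quick preview, here is one way to fit such a model: a minimal sketch using scikit-learn's LogisticRegression (an assumption on my part, not necessarily the API used in the book's own code), restricted to the first two features and to classes 0 and 1 so that it matches the binary case discussed above:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load Iris and keep only the first two features: sepal length and sepal width.
iris = load_iris()
X = iris.data[:, :2]

# Keep only classes 0 and 1 so the problem matches the binary 0/1 case above.
mask = iris.target < 2
X, y = X[mask], iris.target[mask]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

clf = LogisticRegression()
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # fraction of test samples classified correctly
```

Under the hood, `clf` learns exactly the weights w1 and w2 (plus an intercept) discussed above, and its predictions are the rounded sigmoid outputs.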