
Understanding logistic regression

Despite its name, logistic regression can actually be used as a model for classification. It uses a logistic function (or sigmoid) to convert any real-valued input x into a predicted output value ŷ that takes values between 0 and 1, as shown in the following figure:

The logistic function

Rounding ŷ to the nearest integer effectively classifies the input as belonging to either class 0 or class 1.
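To make this mapping concrete, here is a minimal NumPy sketch (not taken from the book's code listings) of the logistic function and the rounding step; the input values are arbitrary examples:

import numpy as np

def sigmoid(x):
    # Logistic function: maps any real-valued input to the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

# A few example inputs: strongly negative, mildly negative, and positive
x = np.array([-4.0, -1.0, 2.0])
y_hat = sigmoid(x)          # roughly [0.018, 0.269, 0.881]
labels = np.round(y_hat)    # rounding yields the class labels [0., 0., 1.]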

Of course, most often, our problems have more than one input or feature value, x. For example, the Iris dataset provides a total of four features. For the sake of simplicity, let's focus here on the first two features: sepal length, which we will call feature f1, and sepal width, which we will call f2. Using the tricks we learned when talking about linear regression, we know we can express the input x as a linear combination of the two features, f1 and f2:

x = w1 f1 + w2 f2

However, in contrast to linear regression, we are not done yet. From the previous section, we know that the sum of products would result in a real-valued output, but we are interested in a categorical value, zero or one. This is where the logistic function comes in: it acts as a squashing function, σ, that compresses the range of possible output values to the range [0, 1]:

ŷ = σ(x) = 1 / (1 + e^(-x))

Because the output is always between 0 and 1, it can be interpreted as a probability. If we only have a single input variable x, the output value ŷ can be interpreted as the probability of x belonging to class 1.
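Putting the two steps together, the following sketch (with made-up weights w1 and w2, chosen purely for illustration) computes the linear combination of the two features and squashes it into a class-1 probability:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical weights for the two features (illustrative values only)
w1, w2 = 1.5, -2.0

# One sample: sepal length (f1) and sepal width (f2), in centimeters
f1, f2 = 5.1, 3.5

x = w1 * f1 + w2 * f2      # linear combination of the features
y_hat = sigmoid(x)         # about 0.66: probability of belonging to class 1
label = int(round(y_hat))  # rounding gives the predicted class, here 1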

Now let's apply this knowledge to the Iris dataset!
