
Understanding logistic regression

Despite its name, logistic regression can actually be used as a model for classification. It uses a logistic function (or sigmoid) to convert any real-valued input x into a predicted output value ŷ that takes values between 0 and 1, as shown in the following figure:

The logistic function

Rounding ŷ to the nearest integer effectively classifies the input as belonging either to class 0 or 1.
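For concreteness, here is a minimal NumPy sketch of the sigmoid and the rounding step; the helper name sigmoid and the sample inputs are chosen here purely for illustration:

    import numpy as np

    def sigmoid(x):
        """Logistic (sigmoid) function: squashes any real value into (0, 1)."""
        return 1.0 / (1.0 + np.exp(-x))

    # A few example inputs: strongly negative, zero, strongly positive
    x = np.array([-4.0, 0.0, 4.0])
    y_hat = sigmoid(x)        # roughly [0.018, 0.5, 0.982]
    labels = np.round(y_hat)  # rounding yields the class labels [0., 0., 1.]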

Of course, most often, our problems have more than one input or feature value, x. For example, the Iris dataset provides a total of four features. For the sake of simplicity, let's focus here on the first two features, sepal length--which we will call feature f1--and sepal width--which we will call f2. Using the tricks we learned when talking about linear regression, we know we can express the input x as a linear combination of the two features, f1 and f2:

x = w1 · f1 + w2 · f2

However, in contrast to linear regression, we are not done yet. From the previous section, we know that the sum of products would result in a real-valued output, but we are interested in a categorical value, 0 or 1. This is where the logistic function comes in: it acts as a squashing function, σ, that compresses any real-valued input into the range [0, 1]:

ŷ = σ(x) = 1 / (1 + exp(-x))

Because the output is always between 0 and 1, it can be interpreted as a probability. If we only have a single input variable x, the output value ŷ can be interpreted as the probability of x belonging to class 1.
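To make the two-feature case concrete, a small sketch might look as follows; the weights w1 and w2 here are made-up numbers for illustration, not values learned from the Iris data:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Hypothetical weights, chosen only for illustration
    w1, w2 = 0.8, -1.2

    # One Iris-like sample: sepal length (f1) and sepal width (f2), in cm
    f1, f2 = 5.1, 3.5

    x = w1 * f1 + w2 * f2      # linear combination of the two features
    y_hat = sigmoid(x)         # probability of the sample belonging to class 1
    label = int(round(y_hat))  # rounding gives the predicted class, 0 or 1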

Now let's apply this knowledge to the Iris dataset!
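As a preview, one possible way to set this up is sketched below with scikit-learn; the choice of features and classes is an assumption made here for illustration, and the following steps in the text may take a different route (for example, via OpenCV's machine learning module):

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    # Load Iris and keep only the first two features:
    # sepal length (f1) and sepal width (f2)
    iris = load_iris()
    X = iris.data[:, :2]
    y = iris.target

    # Restrict to two classes so the problem is a binary one (0 or 1)
    mask = y < 2
    X, y = X[mask], y[mask]

    clf = LogisticRegression()
    clf.fit(X, y)

    # Probability of class 1 and the predicted label for the first sample
    print(clf.predict_proba(X[:1])[0, 1])
    print(clf.predict(X[:1])[0])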
