
Logistic regression for classification

In the previous section, we learned how to predict. There's another common task in ML: the task of classification. Separating dogs from cats and spam from not spam, or even identifying the different objects in a room or scene—all of these are classification tasks. 

Logistic regression is a classic classification technique. It provides the probability of an event taking place, given an input value. The events are represented as categorical dependent variables, and the probability of a particular dependent variable being 1 is given using the logit function:

Ypred = P(Y = 1 | X) = 1 / (1 + exp(−(WᵀX + b)))

Before going into the details of how we can use logistic regression for classification, let's examine the logit function (also called the sigmoid function because of its S-shaped curve). The following diagram shows how the logit function and its derivative vary with respect to the input X: the sigmoidal function (blue) and its derivative (orange).

A few important things to note from this diagram are the following:

  • The value of the sigmoid (and hence Ypred) lies in the interval (0, 1)
  • The derivative of the sigmoid is highest when WᵀX + b = 0, and its peak value is only 0.25 (at the same point, the sigmoid itself has a value of 0.5)
  • The steepness of the sigmoid curve depends on the weights, and the position of the derivative's peak depends on the bias
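The properties above can be verified with a short sketch in NumPy (the function names here are illustrative, not from the book's notebook):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    """Derivative of the sigmoid: sigma(z) * (1 - sigma(z))."""
    s = sigmoid(z)
    return s * (1.0 - s)

# At z = W^T X + b = 0, the sigmoid is 0.5 and its derivative peaks at 0.25
print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.0))  # 0.25

# A larger weight steepens the curve; the bias shifts where the peak sits
w, b = 2.0, -1.0
x = np.linspace(-5.0, 5.0, 11)
y_pred = sigmoid(w * x + b)     # every value lies strictly inside (0, 1)
```

Varying `w` and `b` here reproduces the behavior described in the bullet points: the weight controls the slope, and the bias moves the location of the derivative's maximum.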

I would suggest you play around with the Sigmoid_function.ipynb program available at this book's GitHub repository, to get a feel for how the sigmoid function changes as the weight and bias change.
