
Logistic regression for classification

In the previous section, we learned how to predict. There's another common task in ML: the task of classification. Separating dogs from cats and spam from not spam, or even identifying the different objects in a room or scene—all of these are classification tasks. 

Logistic regression is a classic classification technique. Given an input value, it provides the probability of an event taking place. The events are represented as categorical dependent variables, and the probability of a particular dependent variable being 1 is given by the logit function:

Ypred = σ(WᵀX + b) = 1 / (1 + e^−(WᵀX + b))

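As a minimal sketch of the idea above, the following Python snippet computes that probability and turns it into a hard class label by thresholding at 0.5. The weight and bias values used here are illustrative, not trained:

```python
import math

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(x, w, b):
    """Probability that the dependent variable is 1, given features x.
    w and b are assumed, pre-trained parameters (illustrative only)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # z = WᵀX + b
    return sigmoid(z)

def predict(x, w, b, threshold=0.5):
    """Hard class label: 1 if the predicted probability meets the threshold."""
    return 1 if predict_proba(x, w, b) >= threshold else 0
```

In practice the weights and bias are learned by minimizing the cross-entropy loss; here they are fixed only to show how the probability and the class decision relate.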
Before going into the details of how we can use logistic regression for classification, let's examine the logit function (also called the sigmoid function because of its S-shaped curve). The following diagram shows how the sigmoid function (blue) and its derivative (orange) vary with respect to the input X:

A few important things to note from this diagram are the following:

  • The value of the sigmoid (and hence Ypred) lies in the interval (0, 1)
  • The derivative of the sigmoid is highest when WᵀX + b = 0.0, where it reaches its maximum value of just 0.25 (at that same point the sigmoid itself has a value of 0.5)
  • The steepness with which the sigmoid varies depends on the weights, and the position of the derivative's peak depends on the bias

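The first two properties in the preceding list can be checked numerically. This short sketch uses the identity σ′(z) = σ(z)(1 − σ(z)) for the derivative:

```python
import math

def sigmoid(z):
    """Sigmoid: always strictly between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_derivative(z):
    """Derivative of the sigmoid: sigma(z) * (1 - sigma(z))."""
    s = sigmoid(z)
    return s * (1.0 - s)

# The sigmoid output stays inside (0, 1) even for large |z|
print(sigmoid(-20), sigmoid(20))

# The derivative peaks at z = 0 with value 0.25, where sigmoid(0) = 0.5
print(sigmoid(0), sigmoid_derivative(0))

# Moving away from z = 0 in either direction makes the derivative smaller
print(sigmoid_derivative(1), sigmoid_derivative(-1))
```

Because the derivative is at most 0.25 and shrinks quickly for large |z|, gradients flowing through a sigmoid can become very small — one reason training with sigmoid activations can be slow.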
I would suggest you play around with the Sigmoid_function.ipynb program available at this book's GitHub repository, to get a feel for how the sigmoid function changes as the weight and bias change.
