
Steps to solve logistic regression using gradient descent

Putting together all the building blocks we've just covered, let's try to solve a binary logistic regression with two input features.

The basic steps to compute are:

  1. Calculate $z = w^{T}x + b$

  2. Calculate $\hat{y} = a = \sigma(z)$, the predicted output

  3. Calculate the cost function: $J = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log a^{(i)} + (1-y^{(i)})\log(1-a^{(i)})\right]$
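To make these steps concrete, here is a minimal NumPy sketch of the forward pass and cost; the function and variable names are our own illustration, not from the text:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_and_cost(X, w, b, y):
    # X has shape (m, n_features), w has shape (n_features,), b is a scalar
    z = X @ w + b                    # step 1: z = w^T x + b for every example
    a = sigmoid(z)                   # step 2: predicted output y_hat
    # step 3: average cross-entropy cost over the m examples
    J = -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))
    return a, J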

Say we have two input features, that is, two dimensions, and a dataset of m samples. Therefore, the following would be the case:

  1. Weights $w_1$, $w_2$, and bias $b$

  2. Therefore, $z = w_1 x_1 + w_2 x_2 + b$, and $\hat{y} = a = \sigma(z)$

  3. Calculate $J = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log a^{(i)} + (1-y^{(i)})\log(1-a^{(i)})\right]$ (average loss over all the examples)

  4. Calculate the derivatives with respect to $w_1$, $w_2$, and $b$, that is, $\frac{\partial J}{\partial w_1}$, $\frac{\partial J}{\partial w_2}$, and $\frac{\partial J}{\partial b}$, respectively

  5. Modify $w_1$, $w_2$, and $b$ as mentioned in the preceding gradient descent section
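Differentiating this cost through the sigmoid gives closed-form gradients for step 4, averaged over the $m$ examples:

$\frac{\partial J}{\partial w_1} = \frac{1}{m}\sum_{i=1}^{m}\left(a^{(i)} - y^{(i)}\right)x_1^{(i)}, \qquad \frac{\partial J}{\partial w_2} = \frac{1}{m}\sum_{i=1}^{m}\left(a^{(i)} - y^{(i)}\right)x_2^{(i)}, \qquad \frac{\partial J}{\partial b} = \frac{1}{m}\sum_{i=1}^{m}\left(a^{(i)} - y^{(i)}\right)$

These are exactly the quantities accumulated as dw1, dw2, and db in the pseudocode that follows.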

The basic pseudocode steps for the preceding m-sample dataset are:

  1. Initialize the value of the learning rate, α, and the number of epochs, e
  2. Loop over the e epochs (where each time the full dataset will pass, in batches)
  3. Initialize J (the cost function) and b (the bias) as 0; for w1 and w2, you can go for random normal or Xavier initialization (explained in the next section)

Here, a is $\sigma(z)$, dw1 is $\frac{\partial J}{\partial w_1}$, dw2 is $\frac{\partial J}{\partial w_2}$, and db is $\frac{\partial J}{\partial b}$. Each epoch contains a loop iterating over the m examples.

The full pseudocode is given here:

w1 = xavier initialization, w2 = xavier initialization, b = 0, e = 100, α = 0.0001
for j → 1 to e :
    J = 0, dw1 = 0, dw2 = 0, db = 0
    for i → 1 to m :
        z = w1 * x1[i] + w2 * x2[i] + b
        a = σ(z)
        J = J - [ y[i] * log(a) + (1 - y[i]) * log(1 - a) ]
        dw1 = dw1 + (a - y[i]) * x1[i]
        dw2 = dw2 + (a - y[i]) * x2[i]
        db = db + (a - y[i])
    J = J / m
    dw1 = dw1 / m
    dw2 = dw2 / m
    db = db / m
    w1 = w1 - α * dw1
    w2 = w2 - α * dw2
    b = b - α * db
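A runnable NumPy version of this loop is sketched below; the synthetic dataset and the scale used for the Xavier-style initialization are our own illustrative choices, not prescribed by the text:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative synthetic dataset: m samples with two input features
rng = np.random.default_rng(seed=0)
m = 200
x1 = rng.normal(size=m)
x2 = rng.normal(size=m)
y = (x1 + x2 > 0).astype(float)           # toy binary labels

# Xavier-style initialization for the weights; bias starts at 0
w1, w2 = rng.normal(scale=np.sqrt(1.0 / 2.0), size=2)
b = 0.0
e = 100                                   # number of epochs
alpha = 0.0001                            # learning rate α

for epoch in range(e):
    J, dw1, dw2, db = 0.0, 0.0, 0.0, 0.0
    for i in range(m):                    # loop over the m examples
        z = w1 * x1[i] + w2 * x2[i] + b
        a = sigmoid(z)
        J -= y[i] * np.log(a) + (1 - y[i]) * np.log(1 - a)
        dw1 += (a - y[i]) * x1[i]
        dw2 += (a - y[i]) * x2[i]
        db += (a - y[i])
    J, dw1, dw2, db = J / m, dw1 / m, dw2 / m, db / m
    w1 -= alpha * dw1                     # gradient descent updates
    w2 -= alpha * dw2
    b -= alpha * db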