
Steps to solve logistic regression using gradient descent

Putting together all the building blocks we've just covered, let's try to solve a binary logistic regression problem with two input features.

The basic steps to compute are:

  1. Calculate z = wᵀx + b, the weighted sum of the inputs plus the bias

  2. Calculate a = σ(z), the predicted output

  3. Calculate the cost function, J = -(1/m) Σ [y log(a) + (1 - y) log(1 - a)], the average cross-entropy loss over the m examples (see the sketch after this list)
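The following is a minimal NumPy sketch of these three steps for a batch of m examples; the array names X, y, w, and b are illustrative assumptions, not code from this chapter.

import numpy as np

def sigmoid(z):
    # Logistic (sigmoid) activation, squashing z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward_and_cost(X, y, w, b):
    # X: (m, n) inputs, y: (m,) labels in {0, 1}, w: (n,) weights, b: scalar bias
    z = X @ w + b       # step 1: weighted sum of the inputs plus the bias
    a = sigmoid(z)      # step 2: predicted output
    # step 3: average cross-entropy cost over the m examples
    J = -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))
    return a, J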

Say we have two input features, that is, two dimensions, and a dataset of m samples. In that case, the following would hold:

  1. Weights w1 and w2, and bias b

  2. Therefore, z = w1x1 + w2x2 + b, and a = σ(z)

  3. Calculate J = -(1/m) Σ [y log(a) + (1 - y) log(1 - a)] (the average loss over all the examples)

  4. Calculate the derivatives with respect to w1, w2, and b, that is, dJ/dw1, dJ/dw2, and dJ/db, respectively (worked out after this list)

  5. Update w1, w2, and b as mentioned in the preceding gradient descent section
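To see where the derivatives in step 4 come from: for the sigmoid activation combined with the cross-entropy loss, the derivative of the loss with respect to z simplifies to (a - y), so applying the chain rule and averaging over the m examples gives:

dJ/dw1 = (1/m) Σ (a - y) x1
dJ/dw2 = (1/m) Σ (a - y) x2
dJ/db = (1/m) Σ (a - y)

These are exactly the quantities accumulated as dw1, dw2, and db in the pseudocode that follows.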

The pseudocode for the preceding dataset of m samples is structured as follows:

  1. Initialize the value of the learning rate, α, and the number of epochs, e
  2. Loop over the number of epochs, e (where each time the full dataset is passed through, in batches)
  3. Initialize J (the cost function) and b (the bias) to 0; for w1 and w2, you can go for random normal or Xavier initialization (explained in the next section)

Here, a is σ(z), dw1 is dJ/dw1, dw2 is dJ/dw2, and db is dJ/db. Each iteration contains a loop iterating over the m examples.

The pseudocode for these steps is given here:

w1 = xavier initialization, w2 = xavier initialization, b = 0, e = 100, α = 0.0001
for j → 1 to e :
    J = 0, dw1 = 0, dw2 = 0, db = 0
    for i → 1 to m :
        z = w1x1[i] + w2x2[i] + b
        a = σ(z)
        J = J - [ y[i] log a + (1-y[i]) log (1-a) ]
        dw1 = dw1 + (a-y[i]) * x1[i]
        dw2 = dw2 + (a-y[i]) * x2[i]
        db = db + (a-y[i])
    J = J / m
    dw1 = dw1 / m
    dw2 = dw2 / m
    db = db / m
    w1 = w1 - α * dw1
    w2 = w2 - α * dw2
    b = b - α * db
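For reference, here is a minimal, runnable NumPy version of this pseudocode, assuming the data arrives as two feature arrays x1 and x2 with labels y. The vectorized operations replace the explicit inner loop over the m examples; the function name, the synthetic data, and the Xavier-style initialization are illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(x1, x2, y, epochs=100, lr=0.0001):
    # x1, x2: (m,) feature arrays, y: (m,) labels in {0, 1}
    rng = np.random.default_rng(0)
    # Xavier-style initialization for the two weights; bias starts at 0
    w1, w2 = rng.normal(0.0, np.sqrt(1.0 / 2), size=2)
    b = 0.0
    for _ in range(epochs):
        z = w1 * x1 + w2 * x2 + b                              # weighted sums for all m examples
        a = sigmoid(z)                                         # predicted outputs
        J = -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))  # average cross-entropy cost
        dw1 = np.mean((a - y) * x1)                            # averaged gradients
        dw2 = np.mean((a - y) * x2)
        db = np.mean(a - y)
        w1 = w1 - lr * dw1                                     # gradient descent updates
        w2 = w2 - lr * dw2
        b = b - lr * db
    return w1, w2, b, J

# Example usage on a tiny synthetic dataset
x1 = np.array([0.5, 1.5, 2.0, 3.0])
x2 = np.array([1.0, 0.5, 2.5, 3.5])
y = np.array([0, 0, 1, 1])
w1, w2, b, J = train_logistic_regression(x1, x2, y, epochs=1000, lr=0.1)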