
Implementing a single-layer neural network

Now we can move on to neural networks. We will start by implementing the simplest form: a single-layer neural network. The difference from a perceptron is that the computations are performed by multiple units (neurons), hence a network. As you may expect, adding more units increases the number of problems that can be solved. Because the units perform their computations separately and are stacked in a layer, we call this layer the hidden layer and its units the hidden units. For now, we will consider only a single hidden layer. The output layer performs as a perceptron, but this time it takes the hidden units as input instead of the input variables:

Figure 2.4: Single-layer neural network with two input variables, n hidden units, and a single output unit
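To make the forward pass concrete, here is a minimal NumPy sketch mirroring the figure; the variable names, the choice of 3 hidden units, and the example input are our own illustrative assumptions, not the recipe's code:

```python
import numpy as np

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(2, 3))   # one weight vector per hidden unit
W_output = rng.normal(size=(3, 1))   # the output unit's weights over the hidden units

x = np.array([0.5, -1.2])            # a single example with two input variables

hidden = x @ W_hidden                # each hidden unit computes its own weighted sum
# The output layer performs as a perceptron: a weighted sum of the
# hidden units, thresholded by a unit step
output = np.where(hidden @ W_output >= 0, 1, 0)
```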

In our implementation of the perceptron, we used a unit step function to determine the class. In the next recipe, we will use the non-linear sigmoid activation function for the hidden units and the output unit. By replacing the step function with a non-linear activation function, the network will be able to uncover non-linear patterns as well. More on this later, in the Activation functions section. In the backward pass, we use the derivative of the sigmoid to update the weights.
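As a quick sketch, the sigmoid and its derivative can be written as plain NumPy functions; the function names here are our own:

```python
import numpy as np

def sigmoid(z):
    # Maps any real-valued input smoothly into the interval (0, 1)
    return 1 / (1 + np.exp(-z))

def sigmoid_derivative(z):
    # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)); used in the
    # backward pass to scale the weight updates
    s = sigmoid(z)
    return s * (1 - s)
```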

In the following recipe, we will use NumPy to classify two classes that are not linearly separable.
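As a preview, here is a minimal end-to-end sketch of such a network trained with NumPy. The XOR data, the four hidden units, and the learning rate are illustrative assumptions, not the recipe's actual setup:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# XOR as a stand-in for a problem that is not linearly separable
# (an assumption; the recipe's dataset may differ)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 4))    # input -> hidden weights (4 hidden units, arbitrary)
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))    # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 0.5                        # learning rate, chosen for this toy example

for epoch in range(5000):
    # Forward pass: sigmoid in the hidden layer and in the output unit
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: the sigmoid derivative s * (1 - s) scales the updates
    d_output = (y - output) * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

    W2 += lr * hidden.T @ d_output
    b2 += lr * d_output.sum(axis=0, keepdims=True)
    W1 += lr * X.T @ d_hidden
    b1 += lr * d_hidden.sum(axis=0, keepdims=True)

print(np.round(output, 2))  # on a successful run, close to [[0], [1], [1], [0]]
```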
