
Implementing a single-layer neural network

Now we can move on to neural networks. We will start by implementing the simplest form: a single-layer neural network. The difference from a perceptron is that the computations are performed by multiple units (neurons), hence a network. As you may expect, adding more units increases the range of problems that can be solved. The units perform their computations independently and are stacked in a layer; we call this layer the hidden layer, and the units stacked in it the hidden units. For now, we will only consider a single hidden layer. The output layer acts as a perceptron, but this time its inputs are the hidden units in the hidden layer instead of the input variables:

Figure 2.4: Single-layer neural network with two input variables, n hidden units, and a single output unit
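To make the figure concrete, here is a minimal NumPy sketch of the forward pass through such a network. The variable names and the choice of n = 3 hidden units are our own for illustration, not taken from the recipe:

```python
import numpy as np

n_inputs, n_hidden = 2, 3                       # two input variables, n = 3 hidden units

# Hypothetical randomly initialized weights
w_hidden = np.random.randn(n_inputs, n_hidden)  # input -> hidden layer
w_output = np.random.randn(n_hidden, 1)         # hidden layer -> output unit

x = np.array([0.5, -1.2])                       # one example with two input variables

# Each hidden unit computes its own weighted sum of the inputs;
# an activation function (see the next paragraph) is applied to each sum
hidden = x @ w_hidden                           # shape: (3,)

# The output unit acts as a perceptron whose inputs are the hidden units
output = hidden @ w_output                      # shape: (1,)
```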

In our implementation of the perceptron, we used a unit step function to determine the class. In the next recipe, we will use the sigmoid, a non-linear activation function, for both the hidden units and the output unit. By replacing the step function with a non-linear activation function, the network becomes able to uncover non-linear patterns as well. More on this later, in the Activation functions section. In the backward pass, we use the derivative of the sigmoid to update the weights.
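As an illustration, the sigmoid and its derivative are straightforward to express in NumPy; this is a sketch rather than the recipe's exact code:

```python
import numpy as np

def sigmoid(x):
    # Maps any real input to the (0, 1) range
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x)); used in the backward
    # pass to scale the weight updates
    s = sigmoid(x)
    return s * (1 - s)
```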

In the following recipe, we will classify two non-linearly separable classes with NumPy.
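Before turning to the recipe itself, here is a self-contained sketch of what training such a network with NumPy can look like. The assumptions here are our own: XOR data as the classic non-linearly separable example, 3 hidden units, a fixed learning rate, and plain gradient descent. The actual recipe may differ in data and details:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# XOR: the classic example of two classes that are not linearly separable
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

rng = np.random.default_rng(0)
n_hidden = 3                                  # number of hidden units
w_hidden = rng.normal(size=(2, n_hidden))     # input -> hidden weights
b_hidden = np.zeros((1, n_hidden))
w_output = rng.normal(size=(n_hidden, 1))     # hidden -> output weights
b_output = np.zeros((1, 1))
learning_rate = 0.5

for epoch in range(5000):
    # Forward pass: sigmoid in both the hidden layer and the output unit
    hidden = sigmoid(X @ w_hidden + b_hidden)
    output = sigmoid(hidden @ w_output + b_output)

    # Backward pass: the derivative of the sigmoid is s * (1 - s)
    error = y - output
    delta_output = error * output * (1 - output)
    delta_hidden = (delta_output @ w_output.T) * hidden * (1 - hidden)

    # Gradient-descent weight updates
    w_output += learning_rate * hidden.T @ delta_output
    b_output += learning_rate * delta_output.sum(axis=0, keepdims=True)
    w_hidden += learning_rate * X.T @ delta_hidden
    b_hidden += learning_rate * delta_hidden.sum(axis=0, keepdims=True)

print(np.round(output, 2))  # predictions should approach [0, 1, 1, 0]
```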
