
Building a multi-layer neural network

What we created in the previous recipe is actually the simplest form of an FNN: a neural network in which information flows in one direction only. In this recipe, we will extend the network from one hidden layer to multiple hidden layers. Adding layers increases the network's capacity to learn complex non-linear patterns.
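To make the one-directional flow concrete, here is a minimal NumPy sketch of a forward pass through two hidden layers with ReLU activations and a linear output. This is an illustration, not the recipe's Keras code; all layer sizes and the random initialization are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 4 inputs, two hidden layers of 8 and 6 units
n_inputs, n_hidden1, n_hidden2 = 4, 8, 6

# Randomly initialized weights; biases start at zero
W1, b1 = rng.standard_normal((n_inputs, n_hidden1)), np.zeros(n_hidden1)
W2, b2 = rng.standard_normal((n_hidden1, n_hidden2)), np.zeros(n_hidden2)
W3, b3 = rng.standard_normal((n_hidden2, 1)), np.zeros(1)

def relu(z):
    return np.maximum(z, 0)

def forward(x):
    h1 = relu(x @ W1 + b1)   # first hidden layer
    h2 = relu(h1 @ W2 + b2)  # second hidden layer
    return h2 @ W3 + b3      # linear output unit (regression)

x = rng.standard_normal((1, n_inputs))
print(forward(x).shape)  # (1, 1): one predicted value per input row
```

Each layer only consumes the previous layer's output, which is what makes the network feed-forward.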

Figure 2.7: Two-layer neural network with i input variables, n and m hidden units in the two hidden layers, respectively, and a single output unit

As you can see in Figure 2.7, adding an additional layer makes the number of connections (weights), also called trainable parameters, grow rapidly, roughly with the product of the sizes of adjacent layers. In the next recipe, we will create a network with two hidden layers to predict wine quality. This is a regression task, so we will use a linear activation for the output layer. For the hidden layers, we use ReLU activation functions. This recipe uses the Keras framework to implement the feed-forward network.
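The parameter count for the architecture in Figure 2.7 can be worked out directly: each fully connected layer contributes (fan_in + 1) × fan_out parameters, the +1 accounting for the bias of each unit. A small sketch, where the concrete sizes (11 inputs, matching the 11 physicochemical features of the UCI wine quality dataset, and hidden layers of 100 and 50 units) are assumptions for illustration:

```python
def count_params(i, n, m):
    """Trainable parameters for i inputs, hidden layers of n and m
    units, and a single output unit (weights + biases per layer)."""
    return (i + 1) * n + (n + 1) * m + (m + 1) * 1

# 11 inputs, 100 and 50 hidden units -> 1200 + 5050 + 51 parameters
print(count_params(11, 100, 50))  # 6301
```

Because the middle term is a product of the two hidden-layer sizes, widening or adding hidden layers inflates the parameter count much faster than widening the input does.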
