Building a multi-layer neural network

What we created in the previous recipe is actually the simplest form of a feed-forward neural network (FNN): a neural network where information flows in one direction only. In this recipe, we will extend the number of hidden layers from one to multiple. Adding layers increases the network's capacity to learn complex non-linear patterns.

Figure 2.7: Two-layer neural network with i input variables, n units in the first hidden layer, m units in the second hidden layer, and a single output unit

As you can see in Figure 2.7, adding a layer rapidly increases the number of connections (weights), also called trainable parameters, since each fully connected layer contributes a number of weights equal to the product of the sizes of the two layers it joins. In this recipe, we will create a network with two hidden layers to predict wine quality. This is a regression task, so we will use a linear activation for the output layer. For the hidden layers, we use ReLU activation functions. This recipe uses the Keras framework to implement the feed-forward network.
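As a minimal sketch of such an architecture, the following Keras model stacks two ReLU hidden layers and a linear output unit. The layer sizes (64 and 32) and the feature count (11, as in the UCI wine-quality data) are illustrative assumptions, not the recipe's actual values:

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

n_features = 11  # assumption: number of input variables (i)

model = Sequential([
    Input(shape=(n_features,)),
    Dense(64, activation='relu'),   # first hidden layer (n = 64 units)
    Dense(32, activation='relu'),   # second hidden layer (m = 32 units)
    Dense(1, activation='linear'),  # single linear output unit for regression
])
model.compile(optimizer='adam', loss='mse')

# Trainable parameters, layer by layer (weights + biases):
# (11*64 + 64) + (64*32 + 32) + (32*1 + 1) = 2,881
```

Training would then be a call such as `model.fit(X_train, y_train, epochs=50, batch_size=32)`, with `X_train` holding the wine features and `y_train` the quality scores.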