
Building a multi-layer neural network

What we created in the previous recipe is actually the simplest form of an FNN (feed-forward neural network): a neural network in which information flows in one direction only. In this recipe, we will extend the number of hidden layers from one to multiple layers. Adding layers increases the capacity of the network to learn complex non-linear patterns.

Figure 2.7: Two-layer neural network with i input variables, n hidden units in the first hidden layer, m hidden units in the second hidden layer, and a single output unit

As you can see in Figure 2.7, adding a layer rapidly increases the number of connections (weights), also called trainable parameters: each unit in a layer is connected to every unit in the next layer, so the number of weights between two layers is the product of their sizes. In this recipe, we will create a network with two hidden layers to predict wine quality. This is a regression task, so we will use a linear activation for the output layer. For the hidden layers, we use ReLU activation functions. This recipe uses the Keras framework to implement the feed-forward network.
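The architecture described above can be sketched in Keras as follows. This is a minimal sketch, not the recipe's full code: the layer sizes (64 and 32 units) and the random placeholder data are assumptions for illustration; in the actual recipe you would load the wine-quality dataset and tune these values.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Placeholder data standing in for the wine-quality dataset:
# 11 input features per sample (an assumption), quality score as target.
rng = np.random.default_rng(42)
X_train = rng.random((100, 11)).astype("float32")
y_train = rng.random(100).astype("float32")

# Two hidden layers with ReLU activations, a single linear output
# unit for regression (predicting a continuous quality score).
model = Sequential([
    Dense(64, activation="relu", input_shape=(X_train.shape[1],)),
    Dense(32, activation="relu"),
    Dense(1, activation="linear"),
])

# Mean squared error is the usual loss for a regression task.
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=5, batch_size=32, verbose=0)
```

Note how the parameter count reflects the fully connected structure: the first hidden layer alone contributes 11 × 64 weights plus 64 biases, which `model.summary()` will confirm.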
