
Building a multi-layer neural network

What we've created in the previous recipe is actually the simplest form of an FNN: a neural network where the information flows in only one direction. For our next recipe, we will extend the network from one hidden layer to multiple hidden layers. Adding layers increases the capacity of a network to learn complex non-linear patterns.
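The idea of stacking hidden layers can be sketched as a plain forward pass, where each layer applies a weight matrix, a bias, and a non-linearity before handing its output to the next layer. The sizes below are hypothetical placeholders, not the ones used in the recipe:

```python
import numpy as np

def relu(x):
    # Rectified linear unit: element-wise max(0, x)
    return np.maximum(0.0, x)

# Hypothetical layer sizes: i inputs, two hidden layers of n and m units,
# and a single output unit (as in Figure 2.7).
i, n, m = 4, 8, 6
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(i, n)), np.zeros(n)
W2, b2 = rng.normal(size=(n, m)), np.zeros(m)
W3, b3 = rng.normal(size=(m, 1)), np.zeros(1)

def forward(x):
    h1 = relu(x @ W1 + b1)   # first hidden layer
    h2 = relu(h1 @ W2 + b2)  # second hidden layer
    return h2 @ W3 + b3      # linear output, suitable for regression

y = forward(rng.normal(size=(1, i)))
print(y.shape)  # (1, 1): one prediction per input row
```

Information only ever flows from the input toward the output, which is what makes this a feed-forward network.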

Figure 2.7: Two-layer neural network with i input variables, n units in the first hidden layer, m units in the second hidden layer, and a single output unit

As you can see in Figure 2.7, adding a layer rapidly increases the number of connections (weights), also called trainable parameters. In the next recipe, we will create a network with two hidden layers to predict wine quality. This is a regression task, so we will use a linear activation for the output layer. For the hidden layers, we use ReLU activation functions. This recipe uses the Keras framework to implement the feed-forward network.
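To see how the parameter count grows, note that a fully connected layer with n_in inputs and n_out units has n_in × n_out weights plus n_out biases. A minimal sketch, using hypothetical layer sizes (11 input features, hidden layers of 64 and 32 units) rather than the ones chosen in the recipe:

```python
def dense_params(n_in, n_out):
    # Fully connected layer: weight matrix plus one bias per output unit
    return n_in * n_out + n_out

# Hypothetical sizes: 11 input features, hidden layers of 64 and 32 units,
# a single output unit for regression.
i, n, m = 11, 64, 32

# One hidden layer: input -> n -> output
one_hidden = dense_params(i, n) + dense_params(n, 1)

# Two hidden layers: input -> n -> m -> output
two_hidden = dense_params(i, n) + dense_params(n, m) + dense_params(m, 1)

print(one_hidden)  # 833
print(two_hidden)  # 2881
```

In Keras, `model.summary()` reports these per-layer and total trainable parameter counts, which is a quick way to sanity-check an architecture before training.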
