
Building a multi-layer neural network

What we've created in the previous recipe is actually the simplest form of a feed-forward neural network (FNN): a network in which information flows in only one direction. In this recipe, we will extend the number of hidden layers from one to multiple. Adding additional layers increases the capacity of the network to learn complex non-linear patterns.

Figure 2.7: Two-layer neural network with i input variables, n units in the first hidden layer, m units in the second hidden layer, and a single output unit

As you can see in Figure 2.7, adding an additional layer rapidly increases the number of connections (weights), also called trainable parameters: each fully connected layer contributes a number of weights equal to the product of the sizes of the two layers it connects, plus one bias per unit. In the next recipe, we will create a network with two hidden layers to predict wine quality. This is a regression task, so we will use a linear activation for the output layer. For the hidden layers, we use ReLU activation functions. This recipe uses the Keras framework to implement the feed-forward network.
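As a minimal sketch of the architecture described above, the following Keras model stacks two ReLU hidden layers and a linear output unit. The layer sizes (64 and 32 units) and the 11 input features (the number of attributes in the UCI wine quality data) are illustrative assumptions, not the recipe's exact settings:

```python
# Sketch of a two-hidden-layer feed-forward network in Keras.
# Hidden layer sizes (64, 32) and the 11 input features are assumptions
# chosen for illustration; the recipe may use different values.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(64, activation='relu', input_shape=(11,)),  # first hidden layer
    Dense(32, activation='relu'),                     # second hidden layer
    Dense(1, activation='linear'),                    # linear output for regression
])

# Mean squared error is a standard loss choice for regression tasks
model.compile(optimizer='adam', loss='mse')
model.summary()
```

The parameter count confirms the growth described above: the first layer holds 11 × 64 weights plus 64 biases, the second 64 × 32 plus 32, and the output 32 × 1 plus 1, for 2,881 trainable parameters in total.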
