Deep Learning Quick Reference
Mike Bernico
Building a deep neural network in Keras
Changing our model is as easy as redefining our previous build_network() function. Our input layer will stay the same because our input hasn't changed. Likewise, the output layer should remain the same.
I'm going to add parameters to our network by adding additional hidden layers. I hope that by adding these hidden layers, our network can learn more complicated relationships between the input and output. I am going to start by adding four additional hidden layers; the first three will have 32 neurons and the fourth will have 16. Here's what it will look like:

[Figure: network architecture diagram with five hidden layers]
And here's the associated code for building the model in Keras:
from keras.layers import Input, Dense
from keras.models import Model

def build_network(input_features=None):
    inputs = Input(shape=(input_features,), name="input")
    x = Dense(32, activation='relu', name="hidden1")(inputs)
    x = Dense(32, activation='relu', name="hidden2")(x)
    x = Dense(32, activation='relu', name="hidden3")(x)
    x = Dense(32, activation='relu', name="hidden4")(x)
    x = Dense(16, activation='relu', name="hidden5")(x)
    prediction = Dense(1, activation='linear', name="final")(x)
    model = Model(inputs=inputs, outputs=prediction)
    model.compile(optimizer='adam', loss='mean_absolute_error')
    return model
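To get a feel for how much capacity these extra layers add, we can count the trainable parameters by hand. This is a small sketch, not from the book: a Dense layer with n_in inputs and n_out units has (n_in + 1) * n_out parameters (a weight matrix plus one bias per unit), and the input width of 10 features used below is an assumption for illustration.

```python
def dense_params(n_in, n_out):
    """Parameters in one fully connected layer, including biases."""
    return (n_in + 1) * n_out

def count_params(input_features, layer_sizes):
    """Total trainable parameters for a stack of Dense layers."""
    total, n_in = 0, input_features
    for n_out in layer_sizes:
        total += dense_params(n_in, n_out)
        n_in = n_out
    return total

# The architecture above: five hidden layers (32, 32, 32, 32, 16)
# followed by the single linear output unit.
# With a hypothetical 10 input features:
print(count_params(10, [32, 32, 32, 32, 16, 1]))  # -> 4065
```

Most of those parameters come from the 32-to-32 hidden layers (1,056 each), which is exactly where the extra representational capacity lives.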
As promised, very little of our code has changed; the only additions are the extra Dense layers, hidden2 through hidden5. The rest of our code can stay the same; however, you often have to train longer (for more epochs) as network complexity increases.
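To make that training advice concrete, here is a hedged usage sketch that builds the deeper network and fits it on synthetic data. The dataset, the 10-feature input width, the epoch count, and the batch size are all illustrative assumptions, not values from the book; in practice you would use your real data and train for many more epochs.

```python
import numpy as np
from keras.layers import Input, Dense
from keras.models import Model

def build_network(input_features=None):
    # The deeper architecture from the text: four 32-unit layers, one 16-unit layer.
    inputs = Input(shape=(input_features,), name="input")
    x = Dense(32, activation='relu', name="hidden1")(inputs)
    x = Dense(32, activation='relu', name="hidden2")(x)
    x = Dense(32, activation='relu', name="hidden3")(x)
    x = Dense(32, activation='relu', name="hidden4")(x)
    x = Dense(16, activation='relu', name="hidden5")(x)
    prediction = Dense(1, activation='linear', name="final")(x)
    model = Model(inputs=inputs, outputs=prediction)
    model.compile(optimizer='adam', loss='mean_absolute_error')
    return model

# Synthetic regression problem: 256 rows of 10 features, target = row sum.
rng = np.random.default_rng(42)
X = rng.normal(size=(256, 10)).astype("float32")
y = X.sum(axis=1, keepdims=True)

model = build_network(input_features=10)
# Deeper networks generally need more epochs than their shallow
# counterparts; 5 here is only to keep the sketch fast to run.
history = model.fit(X, y, batch_size=32, epochs=5, verbose=0)
```

The returned `History` object records the loss per epoch in `history.history['loss']`, which is the simplest way to check whether the longer training the text recommends is still paying off.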