
Building the network

For this example, you'll define the following:

  • The input layer, whose shape matches each piece of MNIST data and tells the network how many inputs to expect
  • The hidden layers, which recognize patterns in the data and connect the input layer to the output layer
  • The output layer, which produces a label as the output for a given image, as follows:
# Defining the neural network
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation

def build_model():
    model = Sequential()
    model.add(Dense(512, input_shape=(784,)))
    model.add(Activation('relu'))    # An "activation" is just a non-linear function applied
                                     # to the output of the layer above. Here, the "rectified
                                     # linear unit" clamps all values below 0 to 0.
    model.add(Dropout(0.2))          # Dropout helps protect the model from memorizing,
                                     # or "overfitting", the training data.
    model.add(Dense(512))
    model.add(Activation('relu'))
    model.add(Dropout(0.2))
    model.add(Dense(10))
    model.add(Activation('softmax')) # The "softmax" activation ensures that the output is a
                                     # valid probability distribution: all values obtained are
                                     # non-negative and sum up to 1.
    return model

# Building the model
model = build_model()
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
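To see concretely why the softmax output forms a valid probability distribution, here is a minimal standalone sketch of the softmax function in plain NumPy (an illustration of the concept, not part of the Keras model above; the example logits are made up):

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability; this does not change the result
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Hypothetical raw scores ("logits") for the 10 MNIST classes
logits = np.array([2.0, 1.0, 0.1, -1.0, 0.5, 0.0, 3.0, -2.0, 1.5, 0.2])
probs = softmax(logits)

print(probs.min() >= 0)              # every value is non-negative
print(np.isclose(probs.sum(), 1.0))  # and they sum to 1
print(probs.argmax())                # the largest logit gets the highest probability
```

During training, the categorical cross-entropy loss compares this distribution against the one-hot label for each image, which is why the network's final layer has 10 units, one per digit.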