
  • Hands-On Neural Networks
  • Leonardo De Marchi, Laura Mitchell

FFNN Keras implementation

To implement our network in Keras, we will again use the Sequential model, but this time with two input neurons, two hidden units, and, of course, one output unit, as we are making a binary prediction:

  1. Let's import all of the necessary parts to create our network:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import SGD
from sklearn.metrics import mean_squared_error
import os
from keras.callbacks import ModelCheckpoint, Callback, EarlyStopping, TensorBoard
  2. Now, we need to define the first hidden layer of the network. To do this, it's sufficient to specify the size of the input (two, in the XOR case) and the number of neurons in the hidden layer, as follows:
model = Sequential()
model.add(Dense(2, input_dim=2))
  3. As an activation function, we chose to use tanh:
model.add(Activation('tanh'))
  4. We then add another fully connected layer with one neuron, which, with a sigmoid activation function, will give us the output:
model.add(Dense(1))
model.add(Activation('sigmoid'))

  5. We again use SGD as the optimization method to train our neural network:
sgd = SGD(lr=0.1)
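For intuition, the update rule that this optimizer applies to every trainable parameter is w ← w − lr · ∇L. A minimal NumPy sketch of a single update step (the weight and gradient values here are made up purely for illustration):

```python
import numpy as np

# Sketch of the SGD update rule that lr=0.1 configures: each parameter
# moves against its loss gradient, scaled by the learning rate.
lr = 0.1
w = np.array([0.5, -0.3])          # hypothetical current weights
grad = np.array([0.2, -0.4])       # hypothetical gradient of the loss w.r.t. w

w_updated = w - lr * grad
print(w_updated)  # [0.48, -0.26]
```

With batch_size=1, Keras computes this gradient from a single training example at a time, which is what makes the descent "stochastic".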
  6. We then compile our network, specifying that we want to use the MSE as the loss function:
model.compile(loss='mse', optimizer=sgd)
  7. As the last step, we train our network; this time we set the batch size to 1 and run it for 2 epochs:
model.fit(train_x[['x1', 'x2']], train_y, batch_size=1, epochs=2)
  8. As usual, we measure the MSE on the test set as follows:
pred = model.predict_proba(test_x)

print('MSE: ', mean_squared_error(test_y, pred))
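Since train_x and test_x are assumed to be defined elsewhere in the chapter, it can help to see the whole 2-2-1 pipeline in one place. The following is a hypothetical NumPy re-implementation of the same architecture (tanh hidden layer, sigmoid output, MSE loss, full-batch gradient descent) trained directly on the XOR truth table; it is a sketch of the mechanics, not the book's Keras code:

```python
import numpy as np

rng = np.random.default_rng(42)

# XOR truth table: inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters: hidden layer (2 -> 2, tanh) and output layer (2 -> 1, sigmoid).
W1 = rng.normal(size=(2, 2)); b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Gradient of the MSE loss w.r.t. the network output.
    grad_out = 2 * (out - y) / len(X)

    # Backward pass: chain rule through sigmoid and tanh.
    delta2 = grad_out * out * (1 - out)
    grad_W2 = h.T @ delta2
    grad_b2 = delta2.sum(axis=0, keepdims=True)

    delta1 = (delta2 @ W2.T) * (1 - h ** 2)
    grad_W1 = X.T @ delta1
    grad_b1 = delta1.sum(axis=0, keepdims=True)

    # Gradient-descent update, the same rule SGD(lr=0.1) applies in Keras.
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2)
print(np.round(pred.ravel(), 2))
```

The sigmoid output stays in (0, 1), so it can be read as the probability of the positive class, exactly as with the Keras model above.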