
The architecture of the generator 

The generator network in our dummy GAN is a simple feed-forward neural network with five layers: an input layer, three hidden layers, and an output layer. Let's take a closer look at the configuration of the generator (dummy) network:

Layer            Units      Input shape            Output shape
Input layer      100        (batch_size, 100)      (batch_size, 100)
Dense layer 1    500        (batch_size, 100)      (batch_size, 500)
Dense layer 2    500        (batch_size, 500)      (batch_size, 500)
Dense layer 3    784        (batch_size, 500)      (batch_size, 784)
Reshape layer    -          (batch_size, 784)      (batch_size, 28, 28)

The preceding table shows the configurations of the hidden layers, as well as the input and output layers of the network.

The following diagram shows the flow of tensors and the input and output shapes of the tensors for each layer in the generator network:

The architecture of the generator network.

Let's discuss how this feed-forward neural network processes information during forward propagation of the data:

  • The input layer takes a 100-dimensional vector sampled from a Gaussian (normal) distribution and passes the tensor to the first hidden layer without any modification.
  • The three hidden layers are dense layers with 500, 500, and 784 units, respectively. The first hidden layer transforms a tensor of shape (batch_size, 100) into a tensor of shape (batch_size, 500).
  • The second dense layer outputs a tensor of shape (batch_size, 500).
  • The third dense layer outputs a tensor of shape (batch_size, 784).
  • In the output layer, this tensor is reshaped from (batch_size, 784) to (batch_size, 28, 28). This means that our network generates a batch of images, where each image has a shape of (28, 28).
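The forward propagation described above can be sketched in plain NumPy. This is a minimal, untrained illustration of the tensor shapes only: the weight matrices are random, and the tanh activations are an assumption for illustration, since the text does not specify the activation functions.

```python
import numpy as np

def generator_forward(z, w1, w2, w3):
    """Forward pass of the dummy generator: 100 -> 500 -> 500 -> 784 -> (28, 28).

    The tanh activations are assumed for illustration; the shapes match
    the layer configuration described in the text."""
    h1 = np.tanh(z @ w1)           # first hidden layer:  (batch_size, 500)
    h2 = np.tanh(h1 @ w2)          # second hidden layer: (batch_size, 500)
    h3 = np.tanh(h2 @ w3)          # third hidden layer:  (batch_size, 784)
    return h3.reshape(-1, 28, 28)  # reshape into a batch of 28x28 images

rng = np.random.default_rng(0)
batch_size = 16
z = rng.standard_normal((batch_size, 100))     # 100-dim Gaussian noise input
w1 = rng.standard_normal((100, 500)) * 0.01    # random, untrained weights
w2 = rng.standard_normal((500, 500)) * 0.01
w3 = rng.standard_normal((500, 784)) * 0.01

images = generator_forward(z, w1, w2, w3)
print(images.shape)  # (16, 28, 28)
```

Running this confirms that a batch of 16 noise vectors is mapped to 16 images of shape (28, 28), exactly the shape flow traced in the bullet points above.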