
The architecture of the generator 

The generator network in our dummy GAN is a simple feed-forward neural network with five layers: an input layer, three hidden layers, and an output layer. Let's take a closer look at the configuration of the generator network:

 

The preceding table shows the configuration of the input layer, the hidden layers, and the output layer of the network.

The following diagram shows the flow of tensors and the input and output shapes of the tensors for each layer in the generator network:

The architecture of the generator network.

Let's discuss how this feed-forward neural network processes information during forward propagation of the data:

  • The input layer takes a 100-dimensional vector sampled from a Gaussian (normal) distribution and passes it to the first hidden layer without modification.
  • The three hidden layers are dense layers with 500, 500, and 784 units, respectively. The first hidden layer (a dense layer) transforms a tensor of shape (batch_size, 100) into a tensor of shape (batch_size, 500).
  • The second dense layer generates a tensor of shape (batch_size, 500).
  • The third hidden layer generates a tensor of shape (batch_size, 784).
  • In the output layer, this tensor is reshaped from (batch_size, 784) to (batch_size, 28, 28). This means that our network generates a batch of images, where each image has a shape of (28, 28).
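The shape transformations described above can be sketched with plain NumPy. This is a minimal illustration, not the book's actual implementation: the weight matrices here are randomly initialized placeholders, and activation functions are omitted so the focus stays on how the tensor shapes change at each layer:

```python
import numpy as np

rng = np.random.default_rng(0)
batch_size = 16

# Latent input: 100-dimensional vectors sampled from a standard
# normal (Gaussian) distribution, as the input layer expects.
z = rng.standard_normal((batch_size, 100))

# Hypothetical weight matrices for the three dense layers
# (500, 500, and 784 units); biases and activations are omitted.
W1 = rng.standard_normal((100, 500)) * 0.01
W2 = rng.standard_normal((500, 500)) * 0.01
W3 = rng.standard_normal((500, 784)) * 0.01

h1 = z @ W1    # first hidden layer:  (batch_size, 100) -> (batch_size, 500)
h2 = h1 @ W2   # second hidden layer: (batch_size, 500) -> (batch_size, 500)
h3 = h2 @ W3   # third hidden layer:  (batch_size, 500) -> (batch_size, 784)

# Output layer: reshape each 784-dimensional vector into a 28x28 image.
images = h3.reshape(batch_size, 28, 28)
print(images.shape)  # (16, 28, 28)
```

In a real framework such as Keras, the same flow would be expressed with `Dense` layers followed by a `Reshape` layer, but the underlying shape arithmetic is exactly what this sketch shows: 784 = 28 × 28, so the final dense layer's output can be folded into a batch of images.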