- Generative Adversarial Networks Projects
- Kailash Ahirwar
The architecture of the generator network
The generator network contains five volumetric, fully convolutional layers with the following configuration:
- Convolutional layers: 5
- Filters: 512, 256, 128, 64, 1
- Kernel size: 4 x 4 x 4, 4 x 4 x 4, 4 x 4 x 4, 4 x 4 x 4, 4 x 4 x 4
- Strides: 1, 2, 2, 2, 2 or (1, 1, 1), (2, 2, 2), (2, 2, 2), (2, 2, 2), (2, 2, 2)
- Batch normalization: Yes, Yes, Yes, Yes, No
- Activations: ReLU, ReLU, ReLU, ReLU, Sigmoid
- Pooling layers: No, No, No, No, No
- Linear layers: No, No, No, No, No
The input and output of the network are as follows:
- Input: A 200-dimensional vector sampled from a probabilistic latent space
- Output: A 3D image with a shape of 64 x 64 x 64
The architecture of the generator can be seen in the following image:

The flow of the tensors and the input and output shapes of the tensors for each layer in the generator network are shown in the following diagram. This will give you a better understanding of the network; a code sketch of the same architecture is given below:

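To make the layer configuration and tensor shapes concrete, here is a minimal sketch of the generator in Keras. It assumes the tf.keras API with Conv3DTranspose layers; the function name and structure are illustrative and may differ from the book's exact implementation:

```python
import tensorflow as tf
from tensorflow.keras import layers, models


def build_generator(z_size=200):
    # 200-dimensional latent vector sampled from the probabilistic latent space
    z = layers.Input(shape=(z_size,))
    # Reshape the latent vector into a 1 x 1 x 1 volume with z_size channels
    x = layers.Reshape((1, 1, 1, z_size))(z)

    # Layer 1: 512 filters, 4 x 4 x 4 kernel, stride 1, 'valid' padding -> 4 x 4 x 4 x 512
    x = layers.Conv3DTranspose(512, kernel_size=4, strides=1, padding='valid')(x)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)

    # Layers 2-4: 256, 128, 64 filters, stride 2, 'same' padding; each doubles the resolution
    for filters in (256, 128, 64):
        x = layers.Conv3DTranspose(filters, kernel_size=4, strides=2, padding='same')(x)
        x = layers.BatchNormalization()(x)
        x = layers.ReLU()(x)

    # Layer 5: 1 filter, stride 2, sigmoid activation, no batch normalization -> 64 x 64 x 64 x 1
    x = layers.Conv3DTranspose(1, kernel_size=4, strides=2, padding='same',
                               activation='sigmoid')(x)
    return models.Model(z, x, name='generator')


generator = build_generator()
generator.summary()  # final output shape: (None, 64, 64, 64, 1)
```

Calling generator.summary() shows the volumes growing from 4 x 4 x 4 x 512 up to the final 64 x 64 x 64 x 1 output, matching the configuration listed above.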
A fully convolutional network is a network without fully connected (dense) layers at the end. It consists only of convolutional layers and can be trained end to end, just like a convolutional network with fully connected layers. There are no pooling layers in the generator network.
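Because the network is fully convolutional, generating a sample amounts to passing a latent vector through these convolutional layers. A minimal usage sketch, reusing build_generator() from the code above and assuming latent vectors drawn from a standard normal distribution (one common choice of probabilistic latent space):

```python
import numpy as np

# Sample a batch of 200-dimensional latent vectors
batch_size = 4
z_sample = np.random.normal(0, 1, size=(batch_size, 200)).astype('float32')

# Generate 3D volumes from the latent vectors
generator = build_generator()
volumes = generator.predict(z_sample)
print(volumes.shape)  # (4, 64, 64, 64, 1)
```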