
Hands-On Generative Adversarial Networks with Keras
Latest chapter:
Further reading
Generative Adversarial Networks (GANs) have revolutionized the fields of machine learning and deep learning. This book will be your first step towards understanding GAN architectures and tackling the challenges involved in training them. This book opens with an introduction to deep learning and generative models, and their applications in artificial intelligence (AI). You will then learn how to build, evaluate, and improve your first GAN with the help of easy-to-follow examples. The next few chapters will guide you through training a GAN model to produce and improve high-resolution images. You will also learn how to implement conditional GANs that give you the ability to control characteristics of GAN outputs. You will build on your knowledge further by exploring a new training methodology for progressive growing of GANs. Moving on, you'll gain insights into state-of-the-art models in image synthesis, speech enhancement, and natural language generation using GANs. In addition to this, you'll be able to identify GAN samples with TequilaGAN. By the end of this book, you will be well-versed with the latest advancements in the GAN framework using various examples and datasets, and you will have the skills you need to implement GAN architectures for several tasks and domains, including computer vision, natural language processing (NLP), and audio processing. Foreword by Ting-Chun Wang, Senior Research Scientist, NVIDIA.
Table of Contents (216 chapters)
- Cover Page
- Title Page
- Copyright and Credits
- Hands-On Generative Adversarial Networks with Keras
- About Packt
- Why subscribe?
- Packt.com
- Foreword
- Contributors
- About the author
- About the reviewer
- Packt is searching for authors like you
- Preface
- Who this book is for
- What this book covers
- To get the most out of this book
- Download the example code files
- Download the color images
- Conventions used
- Get in touch
- Reviews
- Section 1: Introduction and Environment Setup
- Deep Learning Basics and Environment Setup
- Deep learning basics
- Artificial Neural Networks (ANNs)
- The parameter estimation
- Backpropagation
- Loss functions
- L1 loss
- L2 loss
- Categorical crossentropy loss
- Non-linearities
- Sigmoid
- Tanh
- ReLU
- A fully connected layer
- The convolution layer
- The max pooling layer
- Deep learning environment setup
- Installing Anaconda and Python
- Setting up a virtual environment in Anaconda
- Installing TensorFlow
- Installing Keras
- Installing data visualization and machine learning libraries
- The matplotlib library
- The Jupyter library
- The scikit-learn library
- NVIDIA's CUDA Toolkit and cuDNN
- The deep learning environment test
- Summary
- Introduction to Generative Models
- Discriminative and generative models compared
- Comparing discriminative and generative models
- Generative models
- Autoregressive models
- Variational autoencoders
- Reversible flows
- Generative adversarial networks
- GANs – building blocks
- The discriminator
- The generator
- Real and fake data
- Random noise
- Discriminator and generator loss
- GANs – strengths and weaknesses
- Summary
- Section 2: Training GANs
- Implementing Your First GAN
- Technical requirements
- Imports
- Implementing a Generator and Discriminator
- Generator
- Discriminator
- Auxiliary functions
- Training your GAN
- Summary
- Further reading
- Evaluating Your First GAN
- The evaluation of GANs
- Image quality
- Image variety
- Domain specifications
- Qualitative methods
- k-nearest neighbors
- Mode analysis
- Other methods
- Quantitative methods
- The Inception score
- The Fréchet Inception Distance
- Precision, Recall, and the F1 Score
- GANs and the birthday paradox
- Summary
- Improving Your First GAN
- Technical requirements
- Challenges in training GANs
- Mode collapse and mode drop
- Training instability
- Sensitivity to hyperparameter initialization
- Vanishing gradients
- Tricks of the trade
- Tracking failure
- Working with labels
- Working with discrete inputs
- Adding noise
- Input normalization
- Modified objective function
- Distribute latent vector
- Weight normalization
- Avoid sparse gradients
- Use a different optimizer
- Learning rate schedule
- GAN model architectures
- ResNet GAN
- GAN algorithms and loss functions
- Least Squares GAN
- Wasserstein GAN
- Wasserstein GAN with gradient penalty
- Relativistic GAN
- Summary
- Section 3: Application of GANs in Computer Vision, Natural Language Processing, and Audio
- Progressive Growing of GANs
- Technical requirements
- Progressive Growing of GANs
- Increasing variation using minibatch standard deviation
- Normalization in the generator and the discriminator
- Pixelwise feature vector normalization in the generator
- Experimental setup
- Training
- Helper functions
- Initializations
- Training loops
- Model implementation
- Custom layers
- The discriminator
- The generator
- GANs
- Summary
- Generation of Discrete Sequences Using GANs
- Technical requirements
- Natural language generation with GANs
- Experimental setup
- Data
- Auxiliary training functions
- Training
- Imports and global variables
- Initializations
- Training loop
- Logging
- Model implementation
- Helper functions
- Discriminator
- Generator
- Inference
- Model trained on words
- Model trained on characters
- Summary
- Text-to-Image Synthesis with GANs
- Technical Requirements
- Text-to-image synthesis
- Experimental setup
- Data utils
- Logging utils
- Training
- Initial setup
- The training loop
- Model implementation
- Wrapper
- Discriminator
- Generator
- Improving the baseline model
- Training
- Inference
- Sampling the generator
- Interpolation in the latent space
- Interpolation in the text-embedding space
- Inferencing with arithmetic in the text-embedding space
- Summary
- TequilaGAN - Identifying GAN Samples
- Technical requirements
- Identifying GAN samples
- Related work
- Feature extraction
- Centroid
- Slope
- Metrics
- Jensen-Shannon divergence
- Kolmogorov–Smirnov two-sample test
- Experiments
- MNIST
- Summary
- References
- What's Next in GANs
- What we've GANed so far
- Generative models
- Architectures
- Loss functions
- Tricks of the trade
- Implementations
- Unanswered questions in GANs
- Are some losses better than others?
- Do GANs do distribution learning?
- All about that inductive bias
- How can you kill a GAN?
- Artistic GANs
- Visual arts
- GANGogh
- Image inpainting
- Vid2Vid
- GauGAN
- Sonic arts
- MuseGAN
- GANSynth
- Recent and yet-to-be-explored GAN topics
- Summary
- Closing remarks
- Further reading

Last updated: 2021-06-24 14:34:30