Hands-On Deep Learning Architectures with Python
Deep learning architectures are composed of multilevel nonlinear operations that represent high-level abstractions; this allows you to learn useful feature representations from the data. This book will help you learn and implement deep learning architectures to resolve various deep learning research problems. Hands-On Deep Learning Architectures with Python explains the essential learning algorithms used for deep and shallow architectures. Packed with practical implementations and ideas to help you build efficient artificial intelligence (AI) systems, this book will help you learn how neural networks play a major role in building deep architectures. You will understand various deep learning architectures (such as AlexNet, VGGNet, and GoogleNet) with easy-to-follow code and diagrams. In addition to this, the book will also guide you in building and training various deep architectures such as Boltzmann machines, autoencoders, convolutional neural networks (CNNs), recurrent neural networks (RNNs), natural language processing (NLP) models, GANs, and more, all with practical implementations. By the end of this book, you will be able to construct deep models using popular frameworks and datasets with the required design patterns for each architecture. You will be ready to explore the potential of deep architectures in today's world.
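The blurb's core idea, stacking nonlinear operations to build feature representations, can be sketched in a few lines of plain Python (a toy two-layer feedforward pass with made-up weights for illustration only; the book's own examples use TensorFlow and Keras):

```python
import math

def sigmoid(x):
    # Squash a value into (0, 1); one of the activation functions
    # covered in the book's first chapter.
    return 1.0 / (1.0 + math.exp(-x))

def dense(inputs, weights, biases):
    # One fully connected layer: a weighted sum per neuron,
    # followed by a nonlinearity.
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Toy two-layer network with hypothetical weights.
hidden = dense([0.5, -1.2], weights=[[0.8, 0.2], [-0.4, 0.9]], biases=[0.1, -0.1])
output = dense(hidden, weights=[[1.0, -1.0]], biases=[0.0])
print(output)  # a single value in (0, 1)
```

Stacking more `dense` layers deepens the hierarchy of abstractions; the chapters below replace this hand-rolled version with TensorFlow and Keras equivalents.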
Table of Contents (202 sections)
- Cover Page
- Title Page
- Copyright and Credits
- Hands-On Deep Learning Architectures with Python
- About Packt
- Why subscribe?
- Packt.com
- Contributors
- About the authors
- About the reviewers
- Packt is searching for authors like you
- Preface
- Who this book is for
- What this book covers
- To get the most out of this book
- Download the example code files
- Download the color images
- Conventions used
- Get in touch
- Reviews
- Section 1: The Elements of Deep Learning
- Getting Started with Deep Learning
- Artificial intelligence
- Machine learning
- Supervised learning
- Regression
- Classification
- Unsupervised learning
- Reinforcement learning
- Deep learning
- Applications of deep learning
- Self-driving cars
- Image translation
- Machine translation
- Encoder-decoder structure
- Chatbots
- Building the fundamentals
- Biological inspiration
- ANNs
- Activation functions
- Linear activation
- Sigmoid activation
- Tanh activation
- ReLU activation
- Softmax activation
- TensorFlow and Keras
- Setting up the environment
- Introduction to TensorFlow
- Installing TensorFlow CPU
- Installing TensorFlow GPU
- Testing your installation
- Getting to know TensorFlow
- Building a graph
- Creating a Session
- Introduction to Keras
- Sequential API
- Functional API
- Summary
- Deep Feedforward Networks
- Evolutionary path to DFNs
- Architecture of DFN
- Training
- Loss function
- Regression loss
- Mean squared error (MSE)
- Mean absolute error
- Classification loss
- Cross entropy
- Gradient descent
- Types of gradient descent
- Batch gradient descent
- Stochastic gradient descent
- Mini-batch gradient descent
- Backpropagation
- Optimizers
- Train, test, and validation
- Training set
- Validation set
- Test set
- Overfitting and regularization
- L1 and L2 regularization
- Dropout
- Early stopping
- Building our first DFN
- MNIST fashion data
- Getting the data
- Visualizing data
- Normalizing and splitting data
- Model parameters
- One-hot encoding
- Building a model graph
- Adding placeholders
- Adding layers
- Adding loss function
- Adding an optimizer
- Calculating accuracy
- Running a session to train
- The easy way
- Summary
- Restricted Boltzmann Machines and Autoencoders
- What are RBMs?
- The evolution path of RBMs
- RBM architectures and applications
- RBMs and their implementation in TensorFlow
- RBMs for movie recommendation
- DBNs and their implementation in TensorFlow
- DBNs for image classification
- What are autoencoders?
- The evolution path of autoencoders
- Autoencoder architectures and applications
- Vanilla autoencoders
- Deep autoencoders
- Sparse autoencoders
- Denoising autoencoders
- Contractive autoencoders
- Summary
- Exercise
- Acknowledgements
- Section 2: Convolutional Neural Networks
- CNN Architecture
- Problem with deep feedforward networks
- Evolution path to CNNs
- Architecture of CNNs
- The input layer
- The convolutional layer
- The maxpooling layer
- The fully connected layer
- Image classification with CNNs
- VGGNet
- InceptionNet
- ResNet
- Building our first CNN
- CIFAR
- Data loading and pre-processing
- Object detection with CNN
- R-CNN
- Faster R-CNN
- You Only Look Once (YOLO)
- Single Shot Multibox Detector
- TensorFlow object detection zoo
- Summary
- Mobile Neural Networks and CNNs
- Evolution path to MobileNets
- Architecture of MobileNets
- Depth-wise separable convolution
- The need for depth-wise separable convolution
- Structure of MobileNet
- MobileNet with Keras
- MobileNetV2
- Motivation behind MobileNetV2
- Structure of MobileNetV2
- Linear bottleneck layer
- Expansion layer
- Inverted residual block
- Overall architecture
- Implementing MobileNetV2
- Comparing the two MobileNets
- SSD MobileNetV2
- Summary
- Section 3: Sequence Modeling
- Recurrent Neural Networks
- What are RNNs?
- The evolution path of RNNs
- RNN architectures and applications
- Architectures by input and output
- Vanilla RNNs for text generation
- LSTM RNNs
- LSTM RNNs for text generation
- GRU RNNs
- GRU RNNs for stock price prediction
- Bidirectional RNNs
- Bidirectional RNNs for sentiment classification
- Summary
- Section 4: Generative Adversarial Networks (GANs)
- Generative Adversarial Networks
- What are GANs?
- Generative models
- Adversarial – training in an adversarial manner
- The evolution path of GANs
- GAN architectures and implementations
- Vanilla GANs
- Deep convolutional GANs
- Conditional GANs
- InfoGANs
- Summary
- Section 5: The Future of Deep Learning and Advanced Artificial Intelligence
- New Trends of Deep Learning
- New trends in deep learning
- Bayesian neural networks
- What our deep learning models don't know – uncertainty
- How we can obtain uncertainty information – Bayesian neural networks
- Capsule networks
- What convolutional neural networks fail to do
- Capsule networks – incorporating orientational and relative spatial relationships
- Meta-learning
- One big challenge in deep learning – training data
- Meta-learning – learning to learn
- Metric-based meta-learning
- Summary
- Other Books You May Enjoy
- Leave a review - let other readers know what you think

Updated: 2021-06-24 14:48:52