Deep Learning with Keras
Latest chapter: API changes (updated 2021-07-02 23:58:32)
If you are a data scientist with experience in machine learning or an AI programmer with some exposure to neural networks, you will find this book a useful entry point to deep learning with Keras. A knowledge of Python is required for this book.
- Cover
- Copyright
- Credits
- About the Authors
- About the Reviewer
- www.PacktPub.com
- Customer Feedback
- Preface
- Mission
- How deep learning is different from machine learning and artificial intelligence
- What this book covers
- What you need for this book
- Who this book is for
- Conventions
- Reader feedback
- Customer support
- Downloading the example code
- Downloading the color images of this book
- Errata
- Piracy
- Questions
- Neural Networks Foundations
- Perceptron
- The first example of Keras code
- Multilayer perceptron - the first example of a network
- Problems in training the perceptron and a solution
- Activation function - sigmoid
- Activation function - ReLU
- Activation functions
- A real example - recognizing handwritten digits
- One-hot encoding - OHE
- Defining a simple neural net in Keras
- Running a simple Keras net and establishing a baseline
- Improving the simple net in Keras with hidden layers
- Further improving the simple net in Keras with dropout
- Testing different optimizers in Keras
- Increasing the number of epochs
- Controlling the optimizer learning rate
- Increasing the number of internal hidden neurons
- Increasing the size of batch computation
- Summarizing the experiments run for recognizing handwritten digits
- Adopting regularization for avoiding overfitting
- Hyperparameter tuning
- Predicting output
- A practical overview of backpropagation
- Towards a deep learning approach
- Summary
- Keras Installation and API
- Installing Keras
- Step 1 - install some useful dependencies
- Step 2 - install Theano
- Step 3 - install TensorFlow
- Step 4 - install Keras
- Step 5 - testing Theano, TensorFlow, and Keras
- Configuring Keras
- Installing Keras on Docker
- Installing Keras on Google Cloud ML
- Installing Keras on Amazon AWS
- Installing Keras on Microsoft Azure
- Keras API
- Getting started with Keras architecture
- What is a tensor?
- Composing models in Keras
- Sequential composition
- Functional composition
- An overview of predefined neural network layers
- Regular dense
- Recurrent neural networks - simple LSTM and GRU
- Convolutional and pooling layers
- Regularization
- Batch normalization
- An overview of predefined activation functions
- An overview of loss functions
- An overview of metrics
- An overview of optimizers
- Some useful operations
- Saving and loading the weights and the architecture of a model
- Callbacks for customizing the training process
- Checkpointing
- Using TensorBoard and Keras
- Using Quiver and Keras
- Summary
- Deep Learning with ConvNets
- Deep convolutional neural network - DCNN
- Local receptive fields
- Shared weights and bias
- Pooling layers
- Max-pooling
- Average pooling
- ConvNets summary
- An example of DCNN - LeNet
- LeNet code in Keras
- Understanding the power of deep learning
- Recognizing CIFAR-10 images with deep learning
- Improving the CIFAR-10 performance with a deeper network
- Improving the CIFAR-10 performance with data augmentation
- Predicting with CIFAR-10
- Very deep convolutional networks for large-scale image recognition
- Recognizing cats with a VGG-16 net
- Utilizing Keras built-in VGG-16 net module
- Recycling pre-built deep learning models for extracting features
- Very deep inception-v3 net used for transfer learning
- Summary
- Generative Adversarial Networks and WaveNet
- What is a GAN?
- Some GAN applications
- Deep convolutional generative adversarial networks
- Keras adversarial GANs for forging MNIST
- Keras adversarial GANs for forging CIFAR
- WaveNet - a generative model for learning how to produce audio
- Summary
- Word Embeddings
- Distributed representations
- word2vec
- The skip-gram word2vec model
- The CBOW word2vec model
- Extracting word2vec embeddings from the model
- Using third-party implementations of word2vec
- Exploring GloVe
- Using pre-trained embeddings
- Learn embeddings from scratch
- Fine-tuning learned embeddings from word2vec
- Fine-tune learned embeddings from GloVe
- Look up embeddings
- Summary
- Recurrent Neural Network - RNN
- SimpleRNN cells
- SimpleRNN with Keras - generating text
- RNN topologies
- Vanishing and exploding gradients
- Long short term memory - LSTM
- LSTM with Keras - sentiment analysis
- Gated recurrent unit - GRU
- GRU with Keras - POS tagging
- Bidirectional RNNs
- Stateful RNNs
- Stateful LSTM with Keras - predicting electricity consumption
- Other RNN variants
- Summary
- Additional Deep Learning Models
- Keras functional API
- Regression networks
- Keras regression example - predicting benzene levels in the air
- Unsupervised learning - autoencoders
- Keras autoencoder example - sentence vectors
- Composing deep networks
- Keras example - memory network for question answering
- Customizing Keras
- Keras example - using the lambda layer
- Keras example - building a custom normalization layer
- Generative models
- Keras example - deep dreaming
- Keras example - style transfer
- Summary
- AI Game Playing
- Reinforcement learning
- Maximizing future rewards
- Q-learning
- The deep Q-network as a Q-function
- Balancing exploration with exploitation
- Experience replay or the value of experience
- Example - Keras deep Q-network for catch
- The road ahead
- Summary
- Conclusion
- Keras 2.0 - what is new
- Installing Keras 2.0
- API changes