Deep Learning with Theano
This book is intended to provide a full overview of deep learning, from the beginner in deep learning and artificial intelligence, to the data scientist who wants to become familiar with Theano and its supporting libraries, or to gain an extended understanding of deep neural nets. Some basic skills in Python programming and computer science will help, as well as skills in elementary algebra and calculus.
Brand: 中圖公司 (China National Publications)
Listed: 2021-07-15 16:59:12
Publisher: Packt Publishing
The digital rights to this book are provided by 中圖公司, which has authorized 上海閱文信息技術有限公司 (Shanghai Yuewen Information Technology Co., Ltd.) to produce and distribute this edition.
- Cover
- Title Page
- Deep Learning with Theano
- Credits
- About the Author
- Acknowledgments
- About the Reviewers
- www.PacktPub.com
- eBooks discount offers and more
- Customer Feedback
- Preface
- What this book covers
- What you need for this book
- Who this book is for
- Conventions
- Reader feedback
- Customer support
- Chapter 1. Theano Basics
- The need for tensors
- Installing and loading Theano
- Tensors
- Graphs and symbolic computing
- Operations on tensors
- Memory and variables
- Functions and automatic differentiation
- Loops in symbolic computing
- Configuration profiling and debugging
- Summary
- Chapter 2. Classifying Handwritten Digits with a Feedforward Network
- The MNIST dataset
- Structure of a training program
- Classification loss function
- Single-layer linear model
- Cost function and errors
- Backpropagation and stochastic gradient descent
- Multiple layer model
- Convolutions and max layers
- Training
- Dropout
- Inference
- Optimization and other update rules
- Related articles
- Summary
- Chapter 3. Encoding Word into Vector
- Encoding and embedding
- Dataset
- Continuous Bag of Words model
- Training the model
- Visualizing the learned embeddings
- Evaluating embeddings – analogical reasoning
- Evaluating embeddings – quantitative analysis
- Application of word embeddings
- Weight tying
- Further reading
- Summary
- Chapter 4. Generating Text with a Recurrent Neural Net
- Need for RNN
- A dataset for natural language
- Simple recurrent network
- Metrics for natural language performance
- Training loss comparison
- Example of predictions
- Applications of RNN
- Related articles
- Summary
- Chapter 5. Analyzing Sentiment with a Bidirectional LSTM
- Installing and configuring Keras
- Preprocessing text data
- Designing the architecture for the model
- Compiling and training the model
- Evaluating the model
- Saving and loading the model
- Running the example
- Further reading
- Summary
- Chapter 6. Locating with Spatial Transformer Networks
- MNIST CNN model with Lasagne
- A localization network
- Unsupervised learning with co-localization
- Region-based localization networks
- Further reading
- Summary
- Chapter 7. Classifying Images with Residual Networks
- Natural image datasets
- Residual connections
- Stochastic depth
- Dense connections
- Multi-GPU
- Data augmentation
- Further reading
- Summary
- Chapter 8. Translating and Explaining with Encoding – decoding Networks
- Sequence-to-sequence networks for natural language processing
- Seq2seq for translation
- Seq2seq for chatbots
- Improving efficiency of sequence-to-sequence network
- Deconvolutions for images
- Multimodal deep learning
- Further reading
- Summary
- Chapter 9. Selecting Relevant Inputs or Memories with the Mechanism of Attention
- Differentiable mechanism of attention
- Store and retrieve information in Neural Turing Machines
- Memory networks
- Further reading
- Summary
- Chapter 10. Predicting Times Sequences with Advanced RNN
- Dropout for RNN
- Deep approaches for RNN
- Stacked recurrent networks
- Deep transition recurrent network
- Highway networks design principle
- Recurrent Highway Networks
- Further reading
- Summary
- Chapter 11. Learning from the Environment with Reinforcement
- Reinforcement learning tasks
- Simulation environments
- Q-learning
- Deep Q-network
- Training stability
- Policy gradients with REINFORCE algorithms
- Related articles
- Summary
- Chapter 12. Learning Features with Unsupervised Generative Networks
- Generative models
- Semi-supervised learning
- Further reading
- Summary
- Chapter 13. Extending Deep Learning with Theano
- Theano Op in Python for CPU
- Theano Op in Python for the GPU
- Theano Op in C for CPU
- Theano Op in C for GPU
- Coalesced transpose via shared memory NVIDIA parallel for all
- The future of artificial intelligence
- Further reading
- Summary
- Index
Updated: 2021-07-15 17:17:25