Natural Language Processing with TensorFlow
By Thushan Ganegedara
Last updated: 2021-06-25 21:28:44
This book is for Python developers with a strong interest in deep learning, who want to learn how to leverage TensorFlow to simplify NLP tasks. Fundamental Python skills are assumed, as well as some knowledge of machine learning and undergraduate-level calculus and linear algebra. No previous natural language processing experience required, although some background in NLP or computational linguistics will be helpful.
Latest chapters
- Index
- Summary
- Visualizing word embeddings with TensorBoard
- Introduction to the TensorFlow seq2seq library
- Introduction to Keras
- Probability
Brand: 中圖公司
Listed: 2021-06-25 20:48:49
Publisher: Packt Publishing
The digital rights to this book are provided by 中圖公司, which has licensed 上海閱文信息技術有限公司 (Shanghai Yuewen Information Technology Co., Ltd.) to produce and distribute it.
Contents
- Cover
- Copyright
- PacktPub.com
- Contributors
- About the reviewers
- Packt is searching for authors like you
- Preface
- What this book covers
- To get the most out of this book
- Get in touch
- Chapter 1. Introduction to Natural Language Processing
- What is Natural Language Processing?
- Tasks of Natural Language Processing
- The traditional approach to Natural Language Processing
- The deep learning approach to Natural Language Processing
- The roadmap – beyond this chapter
- Introduction to the technical tools
- Summary
- Chapter 2. Understanding TensorFlow
- What is TensorFlow?
- Inputs, variables, outputs, and operations
- Reusing variables with scoping
- Implementing our first neural network
- Summary
- Chapter 3. Word2vec – Learning Word Embeddings
- What is a word representation or meaning?
- Classical approaches to learning word representation
- Word2vec – a neural network-based approach to learning word representation
- The skip-gram algorithm
- The Continuous Bag-of-Words algorithm
- Summary
- Chapter 4. Advanced Word2vec
- The original skip-gram algorithm
- Comparing skip-gram with CBOW
- Extensions to the word embeddings algorithms
- More recent algorithms extending skip-gram and CBOW
- GloVe – Global Vectors representation
- Document classification with Word2vec
- Summary
- Chapter 5. Sentence Classification with Convolutional Neural Networks
- Introducing Convolutional Neural Networks
- Understanding Convolutional Neural Networks
- Exercise – image classification on MNIST with CNN
- Using CNNs for sentence classification
- Summary
- Chapter 6. Recurrent Neural Networks
- Understanding Recurrent Neural Networks
- Backpropagation Through Time
- Applications of RNNs
- Generating text with RNNs
- Evaluating text results output from the RNN
- Perplexity – measuring the quality of the text result
- Recurrent Neural Networks with Context Features – RNNs with longer memory
- Summary
- Chapter 7. Long Short-Term Memory Networks
- Understanding Long Short-Term Memory Networks
- How LSTMs solve the vanishing gradient problem
- Other variants of LSTMs
- Summary
- Chapter 8. Applications of LSTM – Generating Text
- Our data
- Implementing an LSTM
- Comparing LSTMs to LSTMs with peephole connections and GRUs
- Improving LSTMs – beam search
- Improving LSTMs – generating text with words instead of n-grams
- Using the TensorFlow RNN API
- Summary
- Chapter 9. Applications of LSTM – Image Caption Generation
- Getting to know the data
- The machine learning pipeline for image caption generation
- Extracting image features with CNNs
- Implementation – loading weights and inferencing with VGG-16
- Learning word embeddings
- Preparing captions for feeding into LSTMs
- Generating data for LSTMs
- Defining the LSTM
- Evaluating the results quantitatively
- Captions generated for test images
- Using TensorFlow RNN API with pretrained GloVe word vectors
- Summary
- Chapter 10. Sequence-to-Sequence Learning – Neural Machine Translation
- Machine translation
- A brief historical tour of machine translation
- Understanding Neural Machine Translation
- Preparing data for the NMT system
- Training the NMT
- Inference with NMT
- The BLEU score – evaluating the machine translation systems
- Implementing an NMT from scratch – a German to English translator
- Training an NMT jointly with word embeddings
- Improving NMTs
- Attention
- Other applications of Seq2Seq models – chatbots
- Summary
- Chapter 11. Current Trends and the Future of Natural Language Processing
- Current trends in NLP
- Penetration into other research fields
- Towards Artificial General Intelligence
- NLP for social media
- New tasks emerging
- Newer machine learning models
- Summary
- References
- Appendix A. Mathematical Foundations and Advanced TensorFlow
- Special types of matrices
- Tensor/matrix operations
- Probability
- Introduction to Keras
- Introduction to the TensorFlow seq2seq library
- Visualizing word embeddings with TensorBoard
- Summary
- Index