R Deep Learning Projects
Machine learning professionals and data scientists looking to master deep learning by implementing practical projects in R will find this book a useful resource. A knowledge of R programming and the basic concepts of deep learning is required to get the best out of this book.
Latest Chapters
- Leave a review - let other readers know what you think
- Other Books You May Enjoy
- Summary
- Using a trained model
- Exploratory data analysis
- Building our model
Brand: 中圖公司
Listed: 2021-06-24 18:05:08
Publisher: Packt Publishing
The digital rights to this book are provided by 中圖公司, which has authorized Shanghai Yuewen Information Technology Co., Ltd. to produce and distribute this edition.
- coverpage
- Title Page
- Packt Upsell
- Why subscribe?
- PacktPub.com
- Contributors
- About the authors
- About the reviewer
- Packt is searching for authors like you
- Preface
- Who this book is for
- What this book covers
- To get the most out of this book
- Download the example code files
- Conventions used
- Get in touch
- Reviews
- Handwritten Digit Recognition Using Convolutional Neural Networks
- What is deep learning and why do we need it?
- What makes deep learning special?
- What are the applications of deep learning?
- Handwritten digit recognition using CNNs
- Get started with exploring MNIST
- First attempt – logistic regression
- Going from logistic regression to single-layer neural networks
- Adding more hidden layers to the networks
- Extracting richer representation with CNNs
- Summary
- Traffic Sign Recognition for Intelligent Vehicles
- How is deep learning applied in self-driving cars?
- How does deep learning become a state-of-the-art solution?
- Traffic sign recognition using CNN
- Getting started with exploring GTSRB
- First solution – convolutional neural networks using MXNet
- Trying something new – CNNs using Keras with TensorFlow
- Reducing overfitting with dropout
- Dealing with a small training set – data augmentation
- Reviewing methods to prevent overfitting in CNNs
- Summary
- Fraud Detection with Autoencoders
- Getting ready
- Installing Keras and TensorFlow for R
- Installing H2O
- Our first examples
- A simple 2D example
- Autoencoders and MNIST
- Outlier detection in MNIST
- Credit card fraud detection with autoencoders
- Exploratory data analysis
- The autoencoder approach – Keras
- Fraud detection with H2O
- Exercises
- Variational Autoencoders
- Image reconstruction using VAEs
- Outlier detection in MNIST
- Text fraud detection
- From unstructured text data to a matrix
- From text to matrix representation — the Enron dataset
- Autoencoder on the matrix representation
- Exercises
- Summary
- Text Generation Using Recurrent Neural Networks
- What is so exciting about recurrent neural networks?
- But what is a recurrent neural network really?
- LSTM and GRU networks
- LSTM
- GRU
- RNNs from scratch in R
- Classes in R with R6
- Perceptron as an R6 class
- Logistic regression
- Multi-layer perceptron
- Implementing an RNN
- Implementation as an R6 class
- Implementation without R6
- RNN without derivatives — the cross-entropy method
- RNN using Keras
- A simple benchmark implementation
- Generating new text from old
- Exercises
- Summary
- Sentiment Analysis with Word Embeddings
- Warm-up – data exploration
- Working with tidy text
- The more the merrier – calculating n-grams instead of single words
- Bag of words benchmark
- Preparing the data
- Implementing a benchmark – logistic regression
- Exercises
- Word embeddings
- word2vec
- GloVe
- Sentiment analysis from movie reviews
- Data preprocessing
- From words to vectors
- Sentiment extraction
- The importance of data cleansing
- Vector embeddings and neural networks
- Bi-directional LSTM networks
- Other LSTM architectures
- Exercises
- Mining sentiment from Twitter
- Connecting to the Twitter API
- Building our model
- Exploratory data analysis
- Using a trained model
- Summary
- Other Books You May Enjoy
- Leave a review - let other readers know what you think
Updated: 2021-06-24 19:27:10