Deep Learning By Example
This book targets data scientists and machine learning developers who wish to get started with deep learning. If you know what deep learning is but are not quite sure of how to use it, this book will help you as well. An understanding of statistics and data science concepts is required. Some familiarity with Python programming will also be beneficial.
Brand: 中圖公司
Listed: 2021-06-24 17:57:50
Updated: 2021-06-24 18:53:26
Publisher: Packt Publishing
The digital rights to this book are provided by 中圖公司, which has authorized 上海閱文信息技術有限公司 to produce and distribute this edition.
- Cover Page
- Title Page
- Packt Upsell
- Why subscribe?
- PacktPub.com
- Contributors
- About the author
- About the reviewers
- Packt is searching for authors like you
- Preface
- Who this book is for
- What this book covers
- To get the most out of this book
- Download the example code files
- Download the color images
- Conventions used
- Get in touch
- Reviews
- Data Science - A Bird's-Eye View
- Understanding data science by an example
- Design procedure of data science algorithms
- Data pre-processing
- Data cleaning
- Data pre-processing
- Feature selection
- Model selection
- Learning process
- Evaluating your model
- Getting to learn
- Challenges of learning
- Feature extraction – feature engineering
- Noise
- Overfitting
- Selection of a machine learning algorithm
- Prior knowledge
- Missing values
- Implementing the fish recognition/detection model
- Knowledge base/dataset
- Data analysis and pre-processing
- Model building
- Model training and testing
- Fish recognition – all together
- Different learning types
- Supervised learning
- Unsupervised learning
- Semi-supervised learning
- Reinforcement learning
- Data size and industry needs
- Summary
- Data Modeling in Action - The Titanic Example
- Linear models for regression
- Motivation
- Advertising – a financial example
- Dependencies
- Importing data with pandas
- Understanding the advertising data
- Data analysis and visualization
- Simple regression model
- Learning model coefficients
- Interpreting model coefficients
- Using the model for prediction
- Linear models for classification
- Classification and logistic regression
- Titanic example – model building and training
- Data handling and visualization
- Data analysis – supervised machine learning
- Different types of errors
- Apparent (training set) error
- Generalization/true error
- Summary
- Feature Engineering and Model Complexity – The Titanic Example Revisited
- Feature engineering
- Types of feature engineering
- Feature selection
- Dimensionality reduction
- Feature construction
- Titanic example revisited
- Missing values
- Removing any sample with missing values in it
- Missing value imputation
- Assigning an average value
- Using a regression or another simple model to predict the values of missing variables
- Feature transformations
- Dummy features
- Factorizing
- Scaling
- Binning
- Derived features
- Name
- Cabin
- Ticket
- Interaction features
- The curse of dimensionality
- Avoiding the curse of dimensionality
- Titanic example revisited – all together
- Bias-variance decomposition
- Learning visibility
- Breaking the rule of thumb
- Summary
- Get Up and Running with TensorFlow
- TensorFlow installation
- TensorFlow GPU installation for Ubuntu 16.04
- Installing NVIDIA drivers and CUDA 8
- Installing TensorFlow
- TensorFlow CPU installation for Ubuntu 16.04
- TensorFlow CPU installation for macOS
- TensorFlow GPU/CPU installation for Windows
- The TensorFlow environment
- Computational graphs
- TensorFlow data types, variables, and placeholders
- Variables
- Placeholders
- Mathematical operations
- Getting output from TensorFlow
- TensorBoard – visualizing learning
- Summary
- TensorFlow in Action - Some Basic Examples
- Capacity of a single neuron
- Biological motivation and connections
- Activation functions
- Sigmoid
- Tanh
- ReLU
- Feed-forward neural network
- The need for multilayer networks
- Training our MLP – the backpropagation algorithm
- Step 1 – forward propagation
- Step 2 – backpropagation and weight update
- TensorFlow terminologies – recap
- Defining multidimensional arrays using TensorFlow
- Why tensors?
- Variables
- Placeholders
- Operations
- Linear regression model – building and training
- Linear regression with TensorFlow
- Logistic regression model – building and training
- Utilizing logistic regression in TensorFlow
- Why use placeholders?
- Set model weights and bias
- Logistic regression model
- Training
- Cost function
- Summary
- Deep Feed-forward Neural Networks - Implementing Digit Classification
- Hidden units and architecture design
- MNIST dataset analysis
- The MNIST data
- Digit classification – model building and training
- Data analysis
- Building the model
- Model training
- Summary
- Introduction to Convolutional Neural Networks
- The convolution operation
- Motivation
- Applications of CNNs
- Different layers of CNNs
- Input layer
- Convolution step
- Introducing non-linearity
- The pooling step
- Fully connected layer
- Logits layer
- CNN basic example – MNIST digit classification
- Building the model
- Cost function
- Performance measures
- Model training
- Summary
- Object Detection – CIFAR-10 Example
- Object detection
- CIFAR-10 – model building and training
- Used packages
- Loading the CIFAR-10 dataset
- Data analysis and preprocessing
- Building the network
- Model training
- Testing the model
- Summary
- Object Detection – Transfer Learning with CNNs
- Transfer learning
- The intuition behind TL
- Differences between traditional machine learning and TL
- CIFAR-10 object detection – revisited
- Solution outline
- Loading and exploring CIFAR-10
- Inception model transfer values
- Analysis of transfer values
- Model building and training
- Summary
- Recurrent-Type Neural Networks - Language Modeling
- The intuition behind RNNs
- Recurrent neural networks architectures
- Examples of RNNs
- Character-level language models
- Language model using Shakespeare data
- The vanishing gradient problem
- The problem of long-term dependencies
- LSTM networks
- Why does LSTM work?
- Implementation of the language model
- Mini-batch generation for training
- Building the model
- Stacked LSTMs
- Model architecture
- Inputs
- Building an LSTM cell
- RNN output
- Training loss
- Optimizer
- Building the network
- Model hyperparameters
- Training the model
- Saving checkpoints
- Generating text
- Summary
- Representation Learning - Implementing Word Embeddings
- Introduction to representation learning
- Word2Vec
- Building the Word2Vec model
- A practical example of the skip-gram architecture
- Skip-gram Word2Vec implementation
- Data analysis and pre-processing
- Building the model
- Training
- Summary
- Neural Sentiment Analysis
- General sentiment analysis architecture
- RNNs – sentiment analysis context
- Exploding and vanishing gradients – recap
- Sentiment analysis – model implementation
- Keras
- Data analysis and preprocessing
- Building the model
- Model training and results analysis
- Summary
- Autoencoders – Feature Extraction and Denoising
- Introduction to autoencoders
- Examples of autoencoders
- Autoencoder architectures
- Compressing the MNIST dataset
- The MNIST dataset
- Building the model
- Model training
- Convolutional autoencoder
- Dataset
- Building the model
- Model training
- Denoising autoencoders
- Building the model
- Model training
- Applications of autoencoders
- Image colorization
- More applications
- Summary
- Generative Adversarial Networks
- An intuitive introduction
- Simple implementation of GANs
- Model inputs
- Variable scope
- Leaky ReLU
- Generator
- Discriminator
- Building the GAN network
- Model hyperparameters
- Defining the generator and discriminator
- Discriminator and generator losses
- Optimizers
- Model training
- Generator samples from training
- Sampling from the generator
- Summary
- Face Generation and Handling Missing Labels
- Face generation
- Getting the data
- Exploring the data
- Building the model
- Model inputs
- Discriminator
- Generator
- Model losses
- Model optimizer
- Training the model
- Semi-supervised learning with Generative Adversarial Networks (GANs)
- Intuition
- Data analysis and preprocessing
- Building the model
- Model inputs
- Generator
- Discriminator
- Model losses
- Model optimizer
- Model training
- Summary
- Implementing Fish Recognition
- Code for fish recognition
- Other Books You May Enjoy
- Leave a review - let other readers know what you think