This book aims to teach readers how to apply deep learning techniques to time series forecasting challenges and how to build prediction models using PyTorch.
Readers will learn the fundamentals of PyTorch in the early stages of the book. Time series forecasting is then covered in greater depth, showing how machine learning can identify the patterns that help forecast future results. The book covers state-of-the-art neural network architectures such as the Recurrent Neural Network, the Encoder-decoder model, and the Temporal Convolutional Network. It also introduces neural architecture search, which automates the search for an ideal neural network design for a given task.
By the end of the book, readers will be able to solve complex real-world prediction problems by applying the models and strategies learnt throughout its chapters, offering a practical path to mastering deep learning and its various techniques.
Table of contents
Cover Page
Title Page
Copyright Page
About the Author
About the Reviewer
Acknowledgement
Preface
Errata
Table of Contents
1. Time Series Problems and Challenges
Structure
Objectives
Introduction to time series analysis and time series forecasting
Time series analysis
Time series forecasting
Time series characteristics
Random walk
Import part
Random walk generation
Trend
Import part
Import part
Result
Seasonality
Import part
Result
Stationarity
Time series common problems
Forecasting
Modelling
Anomaly detection
Classical approaches
Autoregressive model (AR)
Autoregressive integrated moving average model
Result
Seasonal autoregressive integrated moving average
Result
Holt-Winters exponential smoothing
Result
Classical approaches: Pros and cons
Promise of Deep Learning
Python for time series analysis
Pandas
NumPy
Matplotlib
Statsmodels
Scikit-learn
PyTorch
Conclusion
Points to remember
Multiple choice questions
Answers
Key terms
2. Deep Learning with PyTorch
Structure
Objectives
Setting up PyTorch
PyTorch as derivative calculator
Function creation
Computing function value
Result
Import part
Create computational graph
Result
Result
Result
PyTorch basics
Tensors
Tensor creation
Random tensor
Reproducibility
Common tensor types
Tensor methods and attributes
Math functions
Deep Learning layers
Linear layer
Result
Convolution
Result
Kernel
Weight
Padding
Result
Stride
Result
Pooling
Result
Dropout
Result
Activations
ReLU
Result
Sigmoid
Result
Tanh
Neural network architecture
Result
Result
Improving neural network performance
Do not put two same layers in a row
Prefer ReLU activation at first
Start from fully connected network
More layers are better than more neurons
Use dropout
Put Deep Learning blocks in the beginning
Training
Loss functions
Absolute loss
Mean squared error
Smooth L1 loss
Optimizers
Adagrad
Adadelta
Adam
Stochastic Gradient Descent (SGD)
Time series forecasting example
Result
Import part
Train, validation and test datasets
Import part
Conclusion
Points to remember
Multiple choice questions
Answers
Key terms
3. Time Series as Deep Learning Problem
Structure
Objectives
Problem statement
Regression versus classification
Time series regression problems
Time series classification problems
Univariate versus multivariate
Univariate input – univariate output
Multivariate input – univariate output
Multivariate input – multivariate output
Many-to-many
Many-to-one
Single-step versus multi-step
Single-step
Multi-step
Single multi-step model
Multiple single-step model
Recurrent single-step model
Datasets
Feature engineering
Time series pre-processing and post-processing
Normalization
Result
Trend removal
Result
Differencing
Result
Sliding window
Result
Effectiveness and loss function
Static versus dynamic
Architecture design
Training, validating and testing
Alternative model
Model optimization
Summary
Example: UK minimal temperature prediction problem
Dataset
Result
Result
Architecture
Alternative model
Testing
Import part
Making script reproducible
Number of features
Preparing datasets
Initializing models
Loss function and optimization algorithm
Training process
Evaluation on test set
Getting results
Conclusion
Points to remember
Multiple choice questions
Answers
Key terms
4. Recurrent Neural Networks
Structure
Recurrent neural network
Result
Import part
Making this script reproducible
Parameters
Preparing datasets for training
Initializing the model
Training
Evaluation
Performance on test dataset
Training progress
Gated recurrent unit
Result
Import part
Making this script reproducible
Parameters
Preparing datasets for training
Initializing the model
Training
Evaluation
Performance on test dataset
Training progress
Long short-term memory
Result
Import part
Making this script reproducible
Parameters
Preparing datasets for training
Initializing the model
Training
Evaluation
Performance on test dataset
Training progress
Conclusion
Points to remember
Multiple choice questions
Answers
Key terms
5. Advanced Forecasting Models
Structure
Objectives
Encoder–decoder model
Encoder–decoder training
Recursive
Teacher forcing
Mixed teacher forcing
Implementing the encoder–decoder model
Import part
Encoder layer
Decoder layer
Encoder–decoder model class
Training
Model evaluation
Example
Result
Import part
Making script reproducible
Global parameters
Generating datasets
Initializing Encoder–decoder model
Training
Prediction
Visualizing results
Temporal convolutional network
Causal convolution
Dilation
Temporal convolutional network design
Implementing the temporal convolutional network
Import part
Crop layer
Temporal causal layer
Implementing temporal convolutional network
TCN prediction model
Example
Import part
Making script reproducible
Global parameters
Generating time series
Preprocessing
Preparing datasets
Initializing the model
Defining optimizer and loss function
Training
Training progress
Performance on the test dataset
Conclusion
Points to remember
Multiple choice questions
Answers
Key terms
6. PyTorch Model Tuning with Neural Network Intelligence
Structure
Objective
Neural Network Intelligence framework
Hyper-parameter tuning
Search space
Trial
Tuner
Hyper-parameter tuning in action
NNI Quick Start
Import part
Defining search space
Search configuration
NNI API
NNI search space
NNI Trial Integration
Time series model hyper-parameter tuning example
Deep Learning model trial
Import part
Global parameters
Dataset, optimizer, and model initialization
NNI search
Import part
Search space
Maximum number of trials
Search configuration
Neural Architecture Search
Hybrid models
Result
Implementing hybrid model
Import part
Causal convolution layer
Hybrid model
Optional causal convolution layer
Obligatory RNN layer
Optional fully connected layer
Hybrid model
Hybrid model trial
Hybrid model search space
Hybrid model architecture search
Conclusion
Points to remember
Multiple choice questions
Answers
Key terms
7. Applying Deep Learning to Real-world Forecasting Problems