time series
TRANSCRIPT
1
TIME SERIES
Prepared by: Fourat Adel Haitham Abdel-atty. Supervised by: Prof. Dr. Mostafa Gadal-Haqq
2
Introduction
Importance of time series
Time series components
Smoothing methods
Applications
Neural networks for time-series forecasting
Examples
Agenda
3
Time Series
A collection of observations of well-defined data items obtained through repeated measurements over time.
An ordered sequence of values of a variable at equally spaced time intervals.
For example, measuring the value of retail sales each month of the year would comprise a time series, because sales revenue is well defined and consistently measured at equally spaced intervals. Data collected irregularly or only once are not time series.
Introduction
4
Time Series Analysis
Analyzing time series data in order to extract meaningful statistics and other characteristics of the data.
Time Series Forecasting
Estimating future aspects of a business or other operation based on the current time series.
Introduction (Cont.)
5
Goals of time series analysis
There are two main goals:
1) Identifying the nature of the phenomenon represented by the sequence of observations.
2) Forecasting (predicting future values of the time series variable).
Introduction (Cont.)
6
A very popular tool for business forecasting.
Basis for understanding past behavior.
Can forecast future activities and support planning for future operations.
Evaluates current accomplishments or performance.
Facilitates comparison.
Importance of time series
7
Time series patterns can be described in terms of four basic classes of components: Trend, Seasonal, Cyclical, and Irregular.
Time series components
8
Trend Component
Simply, the trend is the long-term direction of a time series. A trend exists when there is a long-term increase or decrease in the data. It does not have to be linear. Sometimes we refer to a trend as “changing direction” when it goes from an increasing trend to a decreasing trend.
Time series components (Cont)
9
Seasonal Component
A regular pattern of variability within certain time periods, such as a year.
Time series components (Cont)
10
Cyclical Component
Any regular pattern of sequences of values above and below the trend line lasting more than one year.
Cycles occur regularly but may vary in length.
Time series components (Cont)
11
Irregular Component (Random Component)
The variability contained within a process that cannot be determined. These fluctuations and variations are caused by erratic and irregular actions that are the result of random chance.
Caused by irregular and unpredictable changes in a time series that are not caused by the other components.
Unpredictable, random, “residual” fluctuations: the noise in the time series.
Time series components (Cont)
12
Smoothing data removes random variation and shows trends and cyclic components. When a time series contains a large amount of noise, it can be difficult to visualize any underlying trend.
There are two distinct groups of smoothing methods:
A. Averaging smoothing methods.
B. Exponential smoothing methods.
Smoothing Methods
13
Simple Averaging Method
The simple moving average smooths past data by arithmetically averaging over a specified period and projecting forward in time. It is normally considered a smoothing algorithm and has poor forecasting results in most cases.
A moving average is commonly used with time series data to smooth out short-term fluctuations and highlight longer-term trends or cycles.
Smoothing Methods (Cont)
14
Simple Averaging Method
Example:
Smoothing Methods (Cont)
Week  Demand
1     650
2     678
3     720
4     785
5     859
6     920
7     850
8     758
9     892
10    920
11    789
12    844

Moving average (over the previous n data items):
F_t = (A_{t-1} + A_{t-2} + A_{t-3} + ... + A_{t-n}) / n

Assume that n = 3.
15
Simple Averaging Method
Example:
Smoothing Methods (Cont)
Week  Demand  n=3     n=6
1     650
2     678
3     720
4     785     682.67
5     859     727.67
6     920     788.00
7     850     854.67  768.67
8     758     876.33  802.00
9     892     842.67  815.33
10    920     833.33  844.00
11    789     856.67  866.50
12    844     867.00  854.83

F4 = (650 + 678 + 720) / 3 = 682.67
F7 = (650 + 678 + 720 + 785 + 859 + 920) / 6 = 768.67
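The table above can be reproduced with a short script. This is a minimal sketch of the simple moving average; the function name is illustrative, and forecasts are aligned so that forecast[t] uses only data before week t+1.

```python
# Simple moving average: the forecast for a period is the arithmetic
# mean of the previous n demand values, as in the worked example.

def moving_average_forecast(demand, n):
    """forecast[t] averages demand[t-n:t]; the first n slots are None
    because there is not yet enough history."""
    forecasts = [None] * len(demand)
    for t in range(n, len(demand)):
        forecasts[t] = round(sum(demand[t - n:t]) / n, 2)
    return forecasts

demand = [650, 678, 720, 785, 859, 920, 850, 758, 892, 920, 789, 844]

f3 = moving_average_forecast(demand, 3)
f6 = moving_average_forecast(demand, 6)
print(f3[3])  # F4 = 682.67
print(f6[6])  # F7 = 768.67
```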
16
Weighted Averaging Methods
A simple moving average assigns the same weight to each observation in the average.
A weighted moving average assigns a different weight to each observation.
The most recent observation receives the most weight, and the weight decreases for older data values.
In either case, the sum of the weights = 1
Smoothing Methods (Cont)
17
Weighted Averaging Methods
Example:
Smoothing Methods (Cont)
Weights: t-1: 0.5, t-2: 0.3, t-3: 0.2
Week  Demand
1     650
2     678
3     720
4     ?
Note that the weights place more emphasis on the most recent data, that is, time period t-1.
18
Weighted Averaging Methods
Example:
Smoothing Methods (Cont)
Week  Demand  Forecast
1     650
2     678
3     720
4             693.4
F4 = 0.5(720)+0.3(678)+0.2(650)=693.4
Weights: t-1: 0.5, t-2: 0.3, t-3: 0.2
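The weighted forecast above can be sketched in a few lines. The function name is illustrative; the weights are listed newest-first to match the t-1, t-2, t-3 notation, and must sum to 1.

```python
# Weighted moving average: weights 0.5, 0.3, 0.2 on the most recent
# three observations, reproducing F4 = 0.5(720)+0.3(678)+0.2(650).

def weighted_moving_average(history, weights):
    """history: oldest-to-newest values; weights: newest-to-oldest."""
    assert abs(sum(weights) - 1.0) < 1e-9   # weights must sum to 1
    recent = history[::-1]                  # newest first, matching weights
    return sum(w * x for w, x in zip(weights, recent))

demand = [650, 678, 720]                    # weeks 1-3
f4 = weighted_moving_average(demand, [0.5, 0.3, 0.2])
print(round(f4, 1))  # 693.4
```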
19
Exponential Smoothing Method
A technique that can be applied to time series data, either to produce smoothed data for presentation or to make forecasts.
Exponential smoothing methods give larger weights to more recent observations, and the weights decrease exponentially as the observations become more distant (older).
Simply, weights decline exponentially: recent observations are given relatively more weight in forecasting than older observations.
Smoothing Methods (Cont)
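A minimal sketch of simple exponential smoothing, using the standard recursion S_t = alpha*x_t + (1-alpha)*S_{t-1}; the demand values are reused from the earlier example, and the choice alpha = 0.3 is an illustrative assumption.

```python
# Simple exponential smoothing. Expanding the recursion shows the
# weight on an observation k steps back is alpha*(1-alpha)**k,
# i.e. the weights decline exponentially, as stated above.

def exponential_smoothing(series, alpha, s0=None):
    s = series[0] if s0 is None else s0   # initialize with first value
    smoothed = [s]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s   # S_t = a*x_t + (1-a)*S_{t-1}
        smoothed.append(s)
    return smoothed

series = [650, 678, 720, 785, 859]
print([round(v, 1) for v in exponential_smoothing(series, alpha=0.3)])
# [650, 658.4, 676.9, 709.3, 754.2]
```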
20
Economic forecasting
Sales forecasting
Budgetary analysis
Stock market analysis
Process and quality control
Inventory studies
Workload projections
Utility studies
Applications
21
Data Issues
Network Design
Model Selection and Evaluation
Methodological Issues
Neural networks for time-series forecasting
22
• Developing a neural network model for a time series forecasting application is not a trivial task.
• Neural network modeling issues include the choice of network type and architecture, the training algorithm, as well as model validation, evaluation, and selection.
Neural Network Modeling Issues
23
• The major decisions a NN forecaster must make include data preparation, data cleaning, data splitting, and input variable selection.
• Size of the sample used:
– A larger sample provides a better chance for neural networks to adequately capture the underlying data-generating process.
• Data splitting:
– According to Chatfield (2001), forecasting analysts typically retain about 10% of the data as a hold-out sample.
– Granger (1993) suggests that at least 20% of the data should be held back for evaluation.
– Time series data are difficult or impossible to split randomly because of the desire to keep the autocorrelation structure of the time series observations.
• Data preprocessing:
– Input data normalization creates more uniform data to facilitate neural network learning, meet algorithm requirements, and avoid computation problems.
Data Issues
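The splitting and normalization points above can be sketched as follows: a chronological (not random) split preserves the autocorrelation structure, and min-max normalization is fitted on the training portion only. Function names and the 10% hold-out fraction are illustrative, following Chatfield's guideline quoted above.

```python
# Chronological train/hold-out split plus min-max normalization,
# a sketch of the data-splitting and preprocessing issues above.

def chronological_split(series, holdout_frac=0.10):
    """Keep order: the last holdout_frac of the series is held back."""
    cut = int(len(series) * (1 - holdout_frac))
    return series[:cut], series[cut:]

def minmax_normalize(train, full):
    """Fit min/max on training data only; hold-out values may fall
    slightly outside [0, 1], which is expected."""
    lo, hi = min(train), max(train)
    return [(x - lo) / (hi - lo) for x in full]

series = list(range(100))
train, hold = chronological_split(series)
print(len(train), len(hold))  # 90 10
```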
24
• In most time series forecasting problems, one output node is naturally used for one-step-ahead forecasting.
• One output node can also be employed for multi-step-ahead forecasting, in which case an iterative forecasting mode must be used.
• For multi-step forecasting, one may either use multiple output nodes or develop multiple neural networks, each for one particular forecast step.
Network Design
25
• The selection of a NN model is typically done using the cross-validation process.
• The principle of parsimony must be applied.
• After the modeling process, the selected model must be evaluated by:
– comparing it to well-accepted traditional models;
– using true out-of-sample data;
– ensuring enough sample size in the out-of-sample data.
Model Selection and Evaluation
26
• Many business and economic time series exhibit both seasonal and trend variations.
• Seasonality is a periodic and recurrent pattern caused by factors such as weather, holidays, and repeating promotions.
• Traditional analyses of time series are concerned with modeling the autocorrelation structure of a time series, and require the data to be stationary.
• Trend and seasonality in time series violate the condition of stationarity.
Methodological Issues
27
• The trend and seasonality are often estimated and removed from the data first before other components are estimated.
• Preprocessing the data by both detrending and deseasonalization is the most appropriate way to build neural networks for best forecasting performance.
Methodological Issues
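A minimal sketch of the detrend-then-deseasonalize preprocessing described above: fit and remove a least-squares linear trend, then subtract per-season means of the residuals. The function name and the season length of 4 are illustrative assumptions, not from the source.

```python
# Detrending and deseasonalization before NN training, as recommended
# above. The trend is assumed linear and the season length known.

def detrend_deseasonalize(series, season=4):
    n = len(series)
    # least-squares linear trend y = a + b*t
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series)) \
        / sum((t - t_mean) ** 2 for t in range(n))
    a = y_mean - b * t_mean
    resid = [y - (a + b * t) for t, y in enumerate(series)]
    # per-season mean of the residuals (the seasonal component)
    seas = [sum(resid[s::season]) / len(resid[s::season])
            for s in range(season)]
    return [r - seas[t % season] for t, r in enumerate(resid)]

adjusted = detrend_deseasonalize([10, 14, 8, 12, 14, 18, 12, 16], season=4)
print([round(v, 2) for v in adjusted])
```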
28
• There are two options:
– The iterated method: using a general single-step model to iteratively generate forecasts.
– The direct method: using a tailored model that forecasts the future value for each forecast horizon.
• Empirical studies yield mixed findings.
Multi-period Forecasting
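The iterated option above can be sketched with a toy one-step model. The AR(1)-style rule standing in for a trained network is purely illustrative; the point is only that each forecast is fed back in as the next input.

```python
# Iterated multi-step forecasting: a single one-step model is applied
# repeatedly, feeding its own forecast back in as the next input.

def one_step_model(last_value):
    # stand-in for a trained single-step network (illustrative rule)
    return 0.9 * last_value + 1.0

def iterated_forecast(last_value, horizon):
    forecasts = []
    x = last_value
    for _ in range(horizon):
        x = one_step_model(x)   # forecast becomes the next input
        forecasts.append(x)
    return forecasts

print([round(f, 2) for f in iterated_forecast(20.0, 3)])
# [19.0, 18.1, 17.29]
```

The direct method would instead train one model per horizon, each mapping the same observed history straight to its own forecast step.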
29
• Designing a neural network for forecasting financial and economic time series[2]
Examples
30
1. Variable selection
– Decide whether to use technical data, fundamental data, or both.
2. Data collection
– Handling missing observations: dropped, assumed to be the same, averaged, etc.
3. Data preprocessing
– differences, logs, and ratios
4. Training, testing, and validation sets
– walk-forward testing
5. Neural network paradigms
– There is no ‘magic’ formula for selecting the number of hidden layers and neurons.
– Nonlinear transfer functions are more appropriate.
– Sigmoids are commonly used.
Forecasting financial and economic time series
31
6. Evaluation criteria
– In the case of trading systems, the forecasts would be converted into buy/sell signals.
– These are fed into a program to calculate some type of risk-adjusted return.
7. Neural network training
– A BP network uses a gradient-descent training algorithm which adjusts the weights.
– Number of training iterations: fixed number or convergence rule.
8. Implementation
Forecasting financial and economic time series
32
Forecasting financial and economic time series
33
A CASE STUDY OF STOCK PRICES OF INTERCONTINENTAL BANK NIGERIA [4]
1. Neural Networks for Time Series Prediction
– Error Correction Neural Network
– Raw inputs: closing price, highest price, lowest price
2. Neural Networks for Time Series Prediction
– Time series may be discrete or continuous
3. The Problem of Predicting the Future
– Assume a generative model.
4. Embedding
– At time t, truncate the history to the previous d samples.
Example(2)
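The embedding step above, truncating the history at time t to the previous d samples, amounts to a sliding window over the series. A minimal sketch (function name illustrative):

```python
# Embedding: each input vector is the window of d values preceding
# the target, turning a series into supervised (input, target) pairs.

def embed(series, d):
    """Return (inputs, targets): inputs[i] holds the d values that
    immediately precede targets[i]."""
    inputs = [series[t - d:t] for t in range(d, len(series))]
    targets = [series[t] for t in range(d, len(series))]
    return inputs, targets

X, y = embed([1, 2, 3, 4, 5, 6], d=3)
print(X[0], y[0])  # [1, 2, 3] 4
```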
34
5. MODEL DEVELOPMENT
6. Data preparation
– The data is transformed to weekly data using averages, since we are interested in making weekly predictions.
– The number of inputs is four.
– The network has two hidden layers, each having four neurodes.
– The number of epochs used is 500.
A CASE STUDY OF STOCK PRICES OF INTERCONTINENTAL BANK NIGERIA[4]
35
7. Neuro-computation
– The weighted sums are calculated and passed through the activation function F(X).
– In our case it is F(wSum – T), where wSum is the weighted sum of the inputs and T is a threshold or bias value.
– For the activation function F, we used the sigmoid function in this project at the hidden and output layers.
A CASE STUDY OF STOCK PRICES OF INTERCONTINENTAL BANK NIGERIA[4]
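A single neurode as described above can be sketched directly: a weighted sum of the inputs minus a threshold T, passed through the sigmoid F. The example input values and weights are illustrative, not from the case study.

```python
import math

# One neurode: output = F(wSum - T) with F the sigmoid function,
# as used at the hidden and output layers in the case study.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neurode(inputs, weights, threshold):
    w_sum = sum(w * x for w, x in zip(weights, inputs))
    return sigmoid(w_sum - threshold)   # F(wSum - T)

out = neurode([0.5, 0.2, 0.8], weights=[0.4, 0.1, 0.7], threshold=0.3)
print(round(out, 4))
```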
36
Results
• This neural network is trained to forecast one period ahead. For the data value 0.043870, the forecast is 0.032756; the error in the forecast is about 25%, so the accuracy is about 75%.
• For the next data value, 0.026320, the forecast is 0.032799: an error of 24.61%, so the forecast accuracy is 75.39%.
• The accuracy of the forecast can be improved further by training the network with more data.
A CASE STUDY OF STOCK PRICES OF INTERCONTINENTAL BANK NIGERIA[4]
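The accuracy figures above use the percentage error of the forecast relative to the actual value, with accuracy as its complement (the slide rounds the first case to 25% error / 75% accuracy). A minimal sketch, with an illustrative function name:

```python
# Forecast accuracy as in the results above:
# error = |actual - forecast| / |actual|, accuracy = (1 - error) * 100.

def forecast_accuracy(actual, forecast):
    error = abs(actual - forecast) / abs(actual)
    return (1 - error) * 100

print(round(forecast_accuracy(0.043870, 0.032756), 1))  # 74.7
print(round(forecast_accuracy(0.026320, 0.032799), 1))  # 75.4
```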
37
1. Zhang, G. Peter. "Neural networks for time-series forecasting." Handbook of Natural Computing. Springer Berlin Heidelberg, 2012. 461-477.
2. Kaastra, Iebeling, and Milton Boyd. "Designing a neural network for forecasting financial and economic time series." Neurocomputing 10.3 (1996): 215-236.
3. https://www.otexts.org/fpp/8/1
4. Akintola, K. G., B. K. Alese, and A. F. Thompson. "Time series forecasting with neural network: A case study of stock prices of Intercontinental Bank Nigeria." International Journal of Research & Reviews in Applied Sciences 9.3 (2011).
References