Forecasting Models – Chapter 2 IE 3265 R. Lindeke, Ph. D.



Description: Forecasting Techniques and Tips in a Manufacturing Company

Forecasting Models

Introduction to Forecasting
- What is forecasting? Its primary function is to predict the future using (time-series or other) data we have in hand.
- Why are we interested? Because forecasts affect the decisions we make today.
- Where is forecasting used in POM? To forecast demand for products and services, to forecast the availability of and need for manpower, and to forecast inventory and materiel needs daily.

Characteristics of Forecasts
- They are usually wrong!
- A good forecast is more than a single number: it includes a mean value and standard deviation, and an accuracy range (high and low).
- Aggregated forecasts are usually more accurate.
- Accuracy erodes as we go further into the future.
- Forecasts should not be used to the exclusion of known information.

What Makes a Good Forecast?
- It should be timely.
- It should be as accurate as possible.
- It should be reliable.
- It should be in meaningful units.
- It should be presented in writing.
- The method should, in most cases, be easy to use and understand.

Forecast Horizons in Operation Planning (Figure 2.1)

Subjective Forecasting Methods
- Sales-force composites: aggregation of sales personnel estimates
- Customer surveys
- Jury of executive opinion
- The Delphi method: individual opinions are compiled and considered, shared anonymously among the group, and the opinion request is repeated until an overall group consensus is (hopefully) reached.

Objective Forecasting Methods
Two primary methods: causal models and time series methods.

Causal Models
Let Y be the quantity to be forecasted and (X1, X2, . . . , Xn) be n variables that have predictive power for Y. A causal model is Y = f(X1, X2, . . . , Xn). A typical relationship is a linear one:
Y = a0 + a1X1 + . . . + anXn

Time Series Methods
A time series is just a collection of past values of the variable being predicted; these are also known as naïve methods. The goal is to isolate patterns in the past data (see the figures on the following pages):
- Trend
- Seasonality
- Cycles
- Randomness
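The linear causal model above can be sketched directly in code. Here a single hypothetical predictor stands in for X1 (say, temperature driving demand); the data are invented purely for illustration, and the fit is ordinary least squares.

```python
# Hypothetical causal model: demand (Y) explained by temperature (X).
# The data are invented and exactly linear, so the fit recovers Y = 10 + 5X.
temps  = [20, 25, 30, 35, 40]          # X1
demand = [110, 135, 160, 185, 210]     # Y

n = len(temps)
x_bar = sum(temps) / n
y_bar = sum(demand) / n

# Least-squares estimates for Y = a0 + a1*X1
a1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(temps, demand)) \
     / sum((x - x_bar) ** 2 for x in temps)
a0 = y_bar - a1 * x_bar
```

With more predictors (X2, . . . , Xn) the same idea generalizes to multiple regression, but the one-variable case already shows the structure of a causal model.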

Notation Conventions
Let D1, D2, . . . , Dn, . . . be the past values of the series to be predicted (demands). If we are making a forecast during period t (for the future), assume we have observed Dt, Dt-1, etc.
Let F(t, t+τ) = the forecast made in period t for the demand in period t+τ, where τ = 1, 2, 3, . . . Then F(t-1, t) is the forecast made in t-1 for t, and F(t, t+1) is the forecast made in t for t+1 (one step ahead). Use the shorthand notation Ft = F(t-1, t).

Evaluation of Forecasts
The forecast error in period t, et, is the difference between the forecast for demand in period t and the actual value of demand in t.
For a multiple-step-ahead forecast: et = F(t-τ, t) - Dt
For a one-step-ahead forecast: et = Ft - Dt
To evaluate forecasting accuracy we develop a chart of forecast errors using:
MAD = (1/n) Σ |ei|
MSE = (1/n) Σ ei²
For (approximately normal) errors we would find that σ(error) = MSE^0.5 ≈ 1.25 MAD.
We can set up a control chart for tracking errors to study the acceptability of our forecast model. Control limits can be set at ±3σ(error), or ±3.75 MAD. This chart can be used to study bias (trends that deviate in only one direction for some time, indicating lags) or very large (or growing) errors indicating a need to review the model.

Forecast Errors Over Time (Figure 2.3)
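The MAD/MSE definitions and the ±3σ control limits can be sketched as follows; the forecasts and actuals here are invented for illustration.

```python
# Sketch: evaluating one-step-ahead forecast errors with MAD and MSE,
# then deriving +/-3-sigma control limits. Data are invented.
forecasts = [100, 105, 110, 108, 112, 115]
actuals   = [ 98, 107, 105, 111, 110, 118]

errors = [f - d for f, d in zip(forecasts, actuals)]   # e_t = F_t - D_t
n = len(errors)

mad = sum(abs(e) for e in errors) / n                  # (1/n) * sum |e_i|
mse = sum(e * e for e in errors) / n                   # (1/n) * sum e_i^2

# For approximately normal errors, sigma_e = sqrt(MSE) ~ 1.25 * MAD,
# so control limits can be set at +/-3*sigma_e (or +/-3.75*MAD).
sigma_e = mse ** 0.5
upper, lower = 3 * sigma_e, -3 * sigma_e
```

Any error falling outside (lower, upper), or a long run of errors with one sign, signals that the forecasting model should be reviewed.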

Try Problem 13 on page 63 with your engineering team right now!

Forecasting for Stationary Series
A stationary time series has the form:
Dt = μ + εt
where μ is a constant and εt is a random variable with mean 0 and variance σ².
Stationary series indicate stable processes without observable trends. Two common methods for forecasting stationary series are moving averages and exponential smoothing.

Moving Averages
In words: the arithmetic average of the N most recent observations. For a one-step-ahead forecast:
Ft = (1/N) (Dt-1 + Dt-2 + . . . + Dt-N)

(Go to the example; try 2.16 on page 66.)

Moving Average -- example

Month     Demand    Month      Demand
January       89    July          223
February      57    August        286
March        144    September     212
April        221    October       275
May          177    November      188
June         280    December      312

3-month MA:  (Oct + Nov + Dec)/3        = 258.33
6-month MA:  (Jul + Aug + ... + Dec)/6  = 249.33
12-month MA: (Jan + Feb + ... + Dec)/12 = 205.33

Summary of Moving Averages
Advantages of the moving average method:
- Easily understood
- Easily computed
- Provides stable forecasts
Disadvantages of the moving average method:
- Requires saving lots of past data points: at least the N periods used in the moving-average computation
- Lags behind a trend
- Ignores complex relationships in the data

Moving Average Lags a Trend (Figure 2.4)
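The three January forecasts above can be reproduced with a few lines of code; the demand figures are the slide's monthly data.

```python
# Reproducing the slide's moving-average example: one-step-ahead
# January forecasts from the last 3, 6, and 12 months of demand.
demand = [89, 57, 144, 221, 177, 280,      # Jan-Jun
          223, 286, 212, 275, 188, 312]    # Jul-Dec

def moving_average(series, n):
    """Arithmetic average of the n most recent observations."""
    return sum(series[-n:]) / n

ma3  = moving_average(demand, 3)    # (Oct+Nov+Dec)/3        = 258.33
ma6  = moving_average(demand, 6)    # (Jul+...+Dec)/6        = 249.33
ma12 = moving_average(demand, 12)   # (Jan+...+Dec)/12       = 205.33
```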

What about Weighted Moving Averages?
This method looks at past data and tries to logically attach more importance to certain data than to other data.
- The weighting factors must add to one.
- We can weight recent data higher than older data, or specific data above others.
- Example: if forecasting staffing, we could use data from the last four weeks where Tuesdays are to be forecast. The weighting on Tuesdays would be: T-1 is .25; T-2 is .20; T-3 is .15; T-4 is .10; and the average of all other days is weighted .30.

Exponential Smoothing Method
A type of weighted moving average that applies declining weights to past data.
New forecast = α (most recent observation) + (1 - α) (last forecast)
- OR -

New forecast = last forecast - α (last forecast error)

where 0 < α < 1, and α is generally small (around .1 to .2) for stability of the forecasts.

Exponential Smoothing (cont.)
In symbols:
Ft+1 = α Dt + (1 - α) Ft
     = α Dt + (1 - α)(α Dt-1 + (1 - α) Ft-1)
     = α Dt + (1 - α)(α) Dt-1 + (1 - α)²(α) Dt-2 + . . .

Hence the method applies a set of exponentially declining weights to past data. It is easy to show that the sum of the weights is exactly one.
{Or: Ft+1 = Ft - α (Ft - Dt)}
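The recursion Ft+1 = α Dt + (1 - α) Ft can be run directly in code; here it reproduces the slides' worked example (problem 22a, α = .15, January forecast of 25, sales of 23.3, 72.3, 30.3, 15.5 for Jan-Apr).

```python
# Exponential smoothing, reproducing the slides' problem 22a.
alpha = 0.15
demand = [23.3, 72.3, 30.3, 15.5]      # Jan-Apr sales history
forecast = 25.0                        # given forecast for January

forecasts = []                         # forecasts for Feb, Mar, Apr, May
for d in demand:
    forecast = alpha * d + (1 - alpha) * forecast  # F_{t+1} = a*D_t + (1-a)*F_t
    forecasts.append(forecast)

# forecasts ~ [24.745, 31.88, 31.64, 29.22], matching the slides
```

Note that only the last forecast and the last observation are needed at each step, which is exactly the storage advantage claimed for ES below.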

Weights in Exponential Smoothing: Effect of α on the Forecast
- Small values of α mean that the forecasted value will be stable (show low variability).
- A low α increases the lag of the forecast behind the actual data if a trend is present.
- Large values of α mean that the forecast will more closely track the actual time series.

Let's try one: #22a, page 72. Given a sales history of
Jan 23.3
Feb 72.3
Mar 30.3
Apr 15.5
and a January forecast of 25, using α = .15:
Forecast for Feb: α·Djan + (1-α)·Fjan = .15(23.3) + .85(25)     = 24.745
Forecast for Mar: α·Dfeb + (1-α)·Ffeb = .15(72.3) + .85(24.745) = 31.88
Forecast for Apr: α·Dmar + (1-α)·Fmar = .15(30.3) + .85(31.88)  = 31.64
Forecast for May: α·Dapr + (1-α)·Fapr = .15(15.5) + .85(31.64)  = 29.22

Comparison of MA and ES
Similarities:
- Both methods are appropriate for stationary series.
- Both methods depend on a single parameter.
- Both methods lag behind a trend.
- One can achieve the same distribution of forecast error by setting α = 2/(N + 1), or N = (2 - α)/α.
Differences:
- ES carries all past history (forever!); MA eliminates bad data after N periods.
- MA requires all N past data points to compute a new forecast estimate, while ES requires only the last forecast and the last observation of demand to continue.

Using Regression for Time Series Forecasting
Regression methods can be used when a trend is present. Model: Dt = a + bt + εt. If t is scaled to 1, 2, 3, . . . -- so it becomes an index i -- then the least-squares estimates for a and b can be computed as follows (n is the number of observations we have):
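As a sketch, the least-squares trend estimates can be computed straight from the (i, Di) pairs; here the visitor data of the upcoming problem 2-28 are used so the fit can be checked. (An exact recomputation gives Sxy = 52548 rather than the slides' 52557, so b ≈ 500.46 instead of 500.54, but the July forecast still rounds to the slides' 2696.)

```python
# Least-squares trend fit D_t = a + b*t on problem 2-28's visitor data.
visitors = [133, 183, 285, 640, 1875, 2550]     # Jan-Jun
n = len(visitors)
x = list(range(1, n + 1))                       # t scaled to 1..6

x_bar = sum(x) / n
y_bar = sum(visitors) / n

# b = Sxy/Sxx in mean-deviation form
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, visitors)) \
    / sum((xi - x_bar) ** 2 for xi in x)        # ~ 500.46
a = y_bar - b * x_bar                           # ~ -807.3

july   = a + b * 7    # rounds to 2696, as on the slides
august = a + b * 8    # ~ 3196 (slides: 3197, from the rounded Sxy)
```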

Let's try 2-28 on page 75.

Month  # Visitors    Month  # Visitors
Jan           133    Apr           640
Feb           183    May          1875
Mar           285    Jun          2550

Sxy = n·Σ(i·Di) - [n(n+1)/2]·ΣDi
    = 6(1·133 + 2·183 + 3·285 + 4·640 + 5·1875 + 6·2550) - (6·7/2)(133 + 183 + 285 + 640 + 1875 + 2550) = 52557
Sxx = [n²(n+1)(2n+1)/6] - [n²(n+1)²/4] = [(36·7·13)/6] - [(36·49)/4] = 105
b = Sxy/Sxx = 52557/105 = 500.54
a = D̄ - b·(n+1)/2 = 944.33 - 500.54·(6+1)/2 = -807.4

Let's try 2-28 on page 75 (cont.)
Forecast for July:   a + b·7 = -807.4 + 500.54·7 = 2696
Forecast for August: a + b·8 = -807.4 + 500.54·8 = 3197
and so on. However, once we get real data for July and August, we would need to re-compute Sxx, Sxy, a and b to continue forecasting if we wish to be accurate!

Another Method When Trend is Present
Double exponential smoothing, of which Holt's method is the most common example, can also be used to forecast when a linear trend is present in the data. The method requires separate smoothing constants for slope and intercept. The advantage is that once we begin building a forecast model, we can quickly revise the slope and intercept constituents with the separate smoothing coefficients.
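A minimal sketch of the Holt recursions, seeded with the regression estimates above (S0 = -807.4, G0 = 500.54) and the smoothing constants α = 0.15, β = 0.1 used in the slides' worked example:

```python
# Holt's (double exponential smoothing) method on the visitor data.
alpha, beta = 0.15, 0.10
S, G = -807.4, 500.54                          # from the regression fit
demand = [133, 183]                            # Jan, Feb observations

history = []
for d in demand:
    S_prev = S
    S = alpha * d + (1 - alpha) * (S + G)      # S_t = a*D_t + (1-a)(S_{t-1}+G_{t-1})
    G = beta * (S - S_prev) + (1 - beta) * G   # G_t = b*(S_t-S_{t-1}) + (1-b)*G_{t-1}
    history.append((S, G))

one_step_ahead = S + G                         # F_{t,t+1} = S_t + 1*G_t
# history ~ [(-240.88, 507.14), (253.77, 505.89)], matching the slides
```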

Holt's Method Explained
We begin with an estimate of the intercept and slope at the start (by linear regression?).
St = α Dt + (1 - α)(St-1 + Gt-1)
Gt = β (St - St-1) + (1 - β) Gt-1
Here Dt is the observed demand; St is the current estimate of the intercept; Gt is the current estimate of the slope; St-1 is the last estimate of the intercept; Gt-1 is the last estimate of the slope.
F(t, t+τ) = St + τ·Gt (the forecast for time τ into the future)

Doing the earlier problem with Holt's method:
- Use a as the first estimate of the intercept (S0): -807.4
- Use b as the first estimate of the slope (G0): 500.54
Working forward (we will use our data in hand to tune our model!), let's set α at .15 and β at .1:
Sjan = .15·Djan + (1 - .15)(S0 + G0) = .15(133) + .85(-807.4 + 500.54) = -240.88
Gjan = .1(Sjan - S0) + (1 - .1)(G0) = .1(-240.88 - (-807.4)) + .9(500.54) = 56.65 + 450.49 = 507.14
Sfeb = .15·Dfeb + .85(Sjan + Gjan) = .15(183) + .85(-240.88 + 507.14) = 253.77
Gfeb = .1(Sfeb - Sjan) + .9·Gjan = .1(253.77 + 240.88) + .9(507.14) = 505.9
Continue to refine the estimates through June to build a best-estimate model to forecast the future. To update the model after demand for July is obtained, we just reapply Holt's method and continue!

Do one with your team: try Problem 30 on page 77.

Forecasting for Seasonal Series
Seasonality corresponds to a pattern in the data that repeats at regular intervals (see the figure on the next slide).
Multiplicative seasonal factors: c1, c2, . . . , cN, where i = 1 is the first season of the cycle, i = 2 is the second season of the cycle, etc.

Σ ci = N

ci = 1.25 implies a demand 25% higher than the baseline.
ci = 0.75 implies a demand 25% lower than the baseline.

A Seasonal Demand Series

Quick and Dirty Method of Estimating Seasonal Factors
- Compute the sample mean of the entire data set (which should cover at least several cycles of data).
- Divide each observation by the sample mean. (This gives a factor for each observation.)
- Average the factors for like seasons.
The resulting N numbers will exactly add to N (for complete cycles) and correspond to the N seasonal factors.

Deseasonalizing a Series
To remove seasonality from a series, simply divide each observation in the series by the appropriate seasonal factor. The resulting series will have no seasonality and may then be predicted using an appropriate method. Once a forecast is made on the deseasonalized series, one then multiplies that forecast by the appropriate seasonal factor to obtain a forecast for the original series.

Let's do one:

Season  Cycle  Demand    Season  Cycle  Demand
Q1      2001     205     Q1      2002     225
Q2      2001     225     Q2      2002     248
Q3      2001     185     Q3      2002     203
Q4      2001     285     Q4      2002     310

Expected demand Q1'03 = (205 + 225)/2 = 215
Expected demand Q2'03 = (225 + 248)/2 = 236.5
Expected demand Q3'03 = (185 + 203)/2 = 194
Expected demand Q4'03 = (285 + 310)/2 = 297.5
Overall average demand: ΣDi/8 = 235.75

Seasonal demand factors:
Q1: 215/235.75   = 0.912
Q2: 236.5/235.75 = 1.003
Q3: 194/235.75   = 0.823
Q4: 297.5/235.75 = 1.262
Sum: 4.000

The seasonal factors must sum to the number of seasons in the cycle; if they do not, the factors must be normalized: ci ← [(number of seasons in the cycle)/Σcj]·ci, where i represents each season in the cycle.

Deseasonalize the demand: for each period, compute Di/ci. Then work on trends using the deseasonalized data.
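The quick-and-dirty factor computation above can be sketched in a few lines; the quarterly demands are the slide's 2001-2002 data.

```python
# Reproducing the slide's quick-and-dirty seasonal factors from two
# complete cycles (years) of quarterly demand.
demand = {                       # (quarter, year): demand
    (1, 2001): 205, (2, 2001): 225, (3, 2001): 185, (4, 2001): 285,
    (1, 2002): 225, (2, 2002): 248, (3, 2002): 203, (4, 2002): 310,
}

overall_mean = sum(demand.values()) / len(demand)     # 235.75

factors = {}
for q in (1, 2, 3, 4):
    obs = [d for (quarter, _), d in demand.items() if quarter == q]
    # divide each observation by the overall mean, then average like seasons
    factors[q] = sum(d / overall_mean for d in obs) / len(obs)

# factors ~ {1: 0.912, 2: 1.003, 3: 0.823, 4: 1.262}; they sum to 4
```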

Period  A. Demand  Period Avg.  Period Factor  Deseason. Demand (y)  Period in string (x)     X·Y     X²
Q1 01        205       215           .912            224.78                   1             224.78     1
Q2 01        225       236.5        1.003            224.29                   2             448.57     4
Q3 01        185       194           .823            224.81                   3             674.43     9
Q4 01        285       297.5        1.262            225.84                   4             903.36    16
Q1 02        225       215           .912            246.72                   5            1233.60    25
Q2 02        248       236.5        1.003            247.21                   6            1483.26    36
Q3 02        203       194           .823            246.64                   7            1726.48    49
Q4 02        310       297.5        1.262            245.66                   8            1965.28    64
SUM:                                                                                       8659.76   204

Using the Deseasonalized Tableau
Here, X̄ = (1+2+3+4+5+6+7+8)/8 = 4.5 and Ȳ = 235.75.

b = Sxy/Sxx = [ΣXY - n·X̄·Ȳ] / [ΣX² - n·X̄²]

a = Ȳ - b·X̄
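The whole pipeline -- deseasonalize, fit the trend, reseasonalize -- can be sketched end to end with the tableau's data. (Carrying full precision rather than the table's rounded values gives b ≈ 4.12 rather than 4.113; the reseasonalized 2003 forecasts still round to the same numbers.)

```python
# Deseasonalize the quarterly series, fit a least-squares trend on the
# deseasonalized values, then reseasonalize the 2003 forecasts.
demand  = [205, 225, 185, 285, 225, 248, 203, 310]   # Q1'01..Q4'02
factors = [0.912, 1.003, 0.823, 1.262]               # seasonal factors

deseason = [d / factors[i % 4] for i, d in enumerate(demand)]
x = list(range(1, 9))
n = len(x)
x_bar, y_bar = sum(x) / n, sum(deseason) / n

b = (sum(xi * yi for xi, yi in zip(x, deseason)) - n * x_bar * y_bar) \
    / (sum(xi * xi for xi in x) - n * x_bar ** 2)    # ~ 4.11
a = y_bar - b * x_bar                                # ~ 217.2

# Reseasonalized 2003 forecasts for x = 9..12: ~ 232, 259, 216, 336
fc_2003 = [(a + b * xi) * factors[(xi - 1) % 4] for xi in range(9, 13)]
```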

Finally:
Substituting to solve for b: b = (8659.76 - 8·4.5·235.75)/(204 - 8·4.5²) = 4.113
Solving for a: a = 235.75 - 4.113·4.5 = 217.24
Now we reseasonalize our forecasts for '03:

Q1 03:  a + b·x = 217.24 + 4.113·9  = 254.26    ×  .912  = 231.9    Forecast: 232
Q2 03:  a + b·x = 217.24 + 4.113·10 = 258.37    × 1.003  = 259.2    Forecast: 259
Q3 03:  a + b·x = 217.24 + 4.113·11 = 262.48    ×  .823  = 216.0    Forecast: 216
Q4 03:  a + b·x = 217.24 + 4.113·12 = 266.60    × 1.262  = 336.4    Forecast: 336

But what about new data?
- The same problem prevails as before: updating is expensive.
- As new data become available, we must start over to get the seasonal factors, trend and intercept estimates.
- Isn't there a method to smooth this seasonalized technique? Yes: it is called Winters' method, or triple exponential smoothing.

Exploring Winters' Method
This model uses 3 smoothing constants: one for the signal, one for the trend and one for the seasonal factors. It uses this model to project the future:
F(t, t+τ) = (St + τ·Gt)·c(t+τ-N)

Using Winters' Method
The time series (deseasonalized signal):
St = α(Dt/c(t-N)) + (1 - α)(St-1 + Gt-1)

The trend:
Gt = β(St - St-1) + (1 - β)Gt-1

The seasonal factors:
ct = γ(Dt/St) + (1 - γ)c(t-N)

Beginning Winters' Method
We must derive initial estimates of the 3 values: St, Gt and the ct's. Typically we set α = 2β = 2γ. Deriving the initial estimates takes at least two complete cycles of data.

Compute the slope estimate:
Compute the sample means for each cycle of data (V1 and V2).

The slope estimate is:
G0 = (V2 - V1)/N

Signal and seasonal factor estimates:
Signal:
S0 = V2 + G0·(N - 1)/2

Seasonal factor estimates (for each past observation t, where Vi is the mean of the cycle containing t and j is t's position within that cycle):
ct = Dt / [Vi - ((N + 1)/2 - j)·G0]

Compute Seasonal Factor Averages & Normalize
Averaging ci: average the factors computed for like seasons across the cycles.

Normalizing:
cj ← cj·N/Σci (so the factors again sum to N)

We have the model, so let's do one:

Period       Obs. Demand    Period       Obs. Demand
1st Q (01)        72        1st Q (02)        83
2nd Q (01)       107        2nd Q (02)       121
3rd Q (01)        55        3rd Q (02)        63
4th Q (01)        88        4th Q (02)       100

Initial slope & intercept estimates:
V1 = (72 + 107 + 55 + 88)/4 = 80.5
V2 = (83 + 121 + 63 + 100)/4 = 91.75
G0 = (V2 - V1)/N = (91.75 - 80.5)/4 = 2.8125
S0 = V2 + G0·(N - 1)/2 = 91.75 + 2.8125·1.5 = 95.97

The 4 Seasonal Factors (1st season in each cycle, j = 1):
c-7 = 72/(V1 - 1.5·G0) = 72/76.28 = .944
c-3 = 83/(V2 - 1.5·G0) = 83/87.53 = .948
Average for the 1st season: .946

Same approach for seasons 2, 3 & 4:
2nd season: c-6 & c-2 (j = 2): 1.353, 1.340; average is 1.346
3rd season: c-5 & c-1 (j = 3): .671, .676; average is .674
4th season: c-4 & c0 (j = 4): 1.039, 1.042; average is 1.040
Σci = .946 + 1.346 + .674 + 1.040 = 4.006, so we may want to (should?) normalize!

Normalizing (here, scaling each factor by 4/4.006):
c1st = (4/4.006)·.946  = .945
c2nd = (4/4.006)·1.346 = 1.344
c3rd = (4/4.006)·.674  = .673
c4th = (4/4.006)·1.040 = 1.038
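The whole initialization can be sketched in code; the demands are the slides' two cycles, and the factor formula is the one given above. (Carrying full precision, the raw factors sum to ≈4.006 before normalizing, matching the hand computation up to rounding.)

```python
# Winters initialization from two cycles of quarterly demand (N = 4).
N = 4
cycle1 = [72, 107, 55, 88]      # year 1
cycle2 = [83, 121, 63, 100]     # year 2

V1 = sum(cycle1) / N            # 80.5
V2 = sum(cycle2) / N            # 91.75

G0 = (V2 - V1) / N              # initial slope: 2.8125
S0 = V2 + G0 * (N - 1) / 2      # initial intercept: ~95.97

def factor(d, V, j):
    """Seasonal factor estimate for season j of a cycle with mean V."""
    return d / (V - ((N + 1) / 2 - j) * G0)

# Average the like-season factors across the two cycles...
raw = [(factor(d1, V1, j + 1) + factor(d2, V2, j + 1)) / 2
       for j, (d1, d2) in enumerate(zip(cycle1, cycle2))]

total = sum(raw)                     # ~4.006, so normalize to sum to N
c = [f * N / total for f in raw]     # ~[0.945, 1.344, 0.673, 1.038]
```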

Making a Forecast:
F(0, 1) = (S0 + 1·G0)·c1st = (95.97 + 2.8125)(.945) ≈ 93.3

Updating the Model (if D1 is 112):
S1 = α(D1/c1st) + (1 - α)(S0 + G0)
G1 = β(S1 - S0) + (1 - β)G0
c1stQ(new) = γ(D1/S1) + (1 - γ)c1st

With the updated c1stQ we may have to renormalize all the ci's.

Fine-tuning the Model
To adequately employ Winters' method, we need 3 cycles of historical data: two cycles for model development and a 3rd cycle for fine-tuning by the update method. Forecasting can then begin after the adjustments. Update each term (S, G and the ci's) as new facts roll in.

Let's try one: Problems 35 & 36 on page 88 in the text.

Practical Considerations
Overly sophisticated forecasting methods can be problematic, especially for long-term forecasting. (Refer to the figure on the next slide.)

Tracking signals may be useful for indicating forecast bias.
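One common form of tracking signal is the cumulative forecast error divided by the MAD; as a rough sketch (with invented errors), values drifting beyond about ±4 to ±6 suggest a persistently biased forecast.

```python
# Sketch of a tracking signal: cumulative error / MAD at each period.
# The errors are invented and deliberately biased high.
errors = [2, -1, 3, 4, 3, 5, 4, 6]          # e_t = F_t - D_t

cum_e, abs_sum, signals = 0.0, 0.0, []
for t, e in enumerate(errors, start=1):
    cum_e += e                              # running sum of errors
    abs_sum += abs(e)
    mad = abs_sum / t                       # MAD over periods 1..t
    signals.append(cum_e / mad)             # tracking signal at period t

# The signal grows steadily, flagging the persistent positive bias.
```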

Box-Jenkins methods require substantial data history, use the correlation structure of the data, and can provide significantly improved forecasts under some circumstances.

The Difficulty with Long-Term Forecasts (an afterthought!)