
PHASE II: TOPIC 6


Slides by
John Loucks
St. Edward's University
Chapter 6, Part A
Time Series Analysis and Forecasting
 Quantitative Approaches to Forecasting
 Time Series Patterns
 Forecast Accuracy
 Moving Averages and Exponential Smoothing
Forecasting Methods

 Forecasting methods can be classified as qualitative or quantitative.
 Qualitative methods generally involve the use of
expert judgment to develop forecasts.
 Such methods are appropriate when historical data
on the variable being forecast are either not
applicable or unavailable.
 We will focus exclusively on quantitative forecasting
methods in this chapter.
Forecasting Methods

 Quantitative forecasting methods can be used when:


 past information about the variable being forecast
is available,
 the information can be quantified, and
 it is reasonable to assume that the pattern of the
past will continue into the future.
Quantitative Forecasting
Methods
 Quantitative methods are based on an analysis of
historical data concerning one or more time series.
 A time series is a set of observations measured at
successive points in time or over successive periods
of time.
 If the historical data used are restricted to past
values of the series that we are trying to forecast, the
procedure is called a time series method.
 If the historical data used involve other time series
that are believed to be related to the time series that
we are trying to forecast, the procedure is called a
causal method.
Time Series Methods

 We will focus on time series methods, and not causal methods, in this chapter.
 The objective of time series analysis is to discover a
pattern in the historical data or time series and then
extrapolate the pattern into the future.
 The forecast is based solely on past values of the
variable and/or past forecast errors.
Forecasting Methods

Forecasting Methods
• Qualitative
• Quantitative
   – Causal
   – Time Series  ← focus of this chapter


Time Series Patterns

 A time series is a sequence of measurements taken every hour, day, week, month, quarter, year, or at any other regular time interval.
 The pattern of the data is an important factor in
understanding how the time series has behaved in
the past.
 If such behavior can be expected to continue in the
future, we can use it to guide us in selecting an
appropriate forecasting method.
Time Series Plot

 A useful first step in identifying the underlying pattern in the data is to construct a time series plot.
 A time series plot is a graphical presentation of the
relationship between time and the time series
variable.
 Time is on the horizontal axis, and the time series
values are shown on the vertical axis.
Time Series Patterns

 The common types of data patterns that can be identified when examining a time series plot include:

Horizontal

Trend

Seasonal

Trend & Seasonal

Cyclical
Time Series Patterns

 Horizontal Pattern
• A horizontal pattern exists when the data
fluctuate around a constant mean.
• Changes in business conditions can often result in
a time series that has a horizontal pattern shifting
to a new level.
• A change in the level of the time series makes it
more difficult to choose an appropriate forecasting
method.
Time Series Patterns

 Trend Pattern
• A time series may show gradual shifts or
movements to relatively higher or lower values
over a longer period of time.
• Trend is usually the result of long-term factors
such as changes in the population, demographics,
technology, or consumer preferences.
• A systematic increase or decrease might be linear
or nonlinear.
• A trend pattern can be identified by analyzing
multiyear movements in historical data.
Time Series Patterns

 Seasonal Pattern
• Seasonal patterns are recognized by seeing the
same repeating pattern of highs and lows over
successive periods of time within a year.
• A seasonal pattern might occur within a day, week,
month, quarter, year, or some other interval no
greater than a year.
• A seasonal pattern does not necessarily refer to the
four seasons of the year (spring, summer, fall, and
winter).
Time Series Patterns

 Trend and Seasonal Pattern


• Some time series include a combination of a trend
and seasonal pattern.
• In such cases we need to use a forecasting method
that has the capability to deal with both trend and
seasonality.
• Time series decomposition can be used to
separate or decompose a time series into trend
and seasonal components.
Time Series Patterns

 Cyclical Pattern
• A cyclical pattern exists if the time series plot shows
an alternating sequence of points below and above
the trend line lasting more than one year.
• Often, the cyclical component of a time series is due
to multiyear business cycles.
• Business cycles are extremely difficult, if not
impossible, to forecast.
• In this chapter we do not deal with cyclical effects
that may be present in the time series.
Selecting a Forecasting Method

 The underlying pattern in the time series is an important factor in selecting a forecasting method.
 Thus, a time series plot should be one of the first
things developed when trying to determine what
forecasting method to use.
 If we see a horizontal pattern, then we need to select
a method appropriate for this type of pattern.
 If we observe a trend in the data, then we need to
use a method that has the capability to handle trend
effectively.
Forecast Accuracy

 Measures of forecast accuracy are used to determine how well a particular forecasting method is able to reproduce the time series data that are already available.
 Measures of forecast accuracy are important factors
in comparing different forecasting methods.
 By selecting the method that has the best accuracy
for the data already known, we hope to increase the
likelihood that we will obtain better forecasts for
future time periods.
Forecast Accuracy

 The key concept associated with measuring forecast accuracy is forecast error.

Forecast Error = Actual Value - Forecast

 A positive forecast error indicates the forecasting method underestimated the actual value.
 A negative forecast error indicates the forecasting
method overestimated the actual value.
Forecast Accuracy

 Mean Error (ME)


A simple measure of forecast accuracy is the mean
or average of the forecast errors. Because positive and
negative forecast errors tend to offset one another, the
mean error is likely to be small. Thus, the mean error
is not a very useful measure.
 Mean Absolute Error (MAE)
This measure avoids the problem of positive and
negative errors offsetting one another. It is the mean
of the absolute values of the forecast errors.
Forecast Accuracy

 Mean Squared Error (MSE)


This is another measure that avoids the problem
of positive and negative errors offsetting one
another. It is the average of the squared forecast
errors.
 Mean Absolute Percentage Error (MAPE)
The size of MAE and MSE depend upon the scale
of the data, so it is difficult to make comparisons for
different time intervals. To make such comparisons
we need to work with relative or percentage error
measures. The MAPE is the average of the absolute
percentage errors of the forecasts.
Forecast Accuracy

 To demonstrate the computation of these measures of forecast accuracy we will introduce the simplest of forecasting methods.
 The naïve forecasting method uses the most recent observation in the time series as the forecast for the next time period:

Ŷt+1 = Yt = actual value in period t
Forecast Accuracy

 Example: Rosco Drugs
Sales of Comfort brand headache medicine for the past 10 weeks at Rosco Drugs are shown below. If Rosco uses the naïve forecast method to forecast sales for weeks 2 – 10, what are the resulting MAE, MSE, and MAPE values?

Week  Sales    Week  Sales
  1    110       6    120
  2    115       7    130
  3    125       8    115
  4    120       9    110
  5    125      10    130
Forecast Accuracy

Naïve Forecast
Week  Sales  Forecast  Error  Abs.Error  Sq.Error  Abs.%Error
  1    110
  2    115     110       5       5          25        4.35
  3    125     115      10      10         100        8.00
  4    120     125      -5       5          25        4.17
  5    125     120       5       5          25        4.00
  6    120     125      -5       5          25        4.17
  7    130     120      10      10         100        7.69
  8    115     130     -15      15         225       13.04
  9    110     115      -5       5          25        4.55
 10    130     110      20      20         400       15.38
Total                           80         950       65.35
Forecast Accuracy

 Naïve Forecast Accuracy

MAE = 80/9 = 8.89
MSE = 950/9 = 105.56
MAPE = 65.35/9 = 7.26%
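The naïve-forecast accuracy computations above can be reproduced in a few lines of Python. This is a sketch; the variable names are our own, not from the slides.

```python
# Rosco Drugs weekly sales (weeks 1-10).
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]

# Naive method: the forecast for week t+1 is the actual value in week t,
# so the forecasts for weeks 2..10 are just the sales for weeks 1..9.
forecasts = sales[:-1]
actuals = sales[1:]

# Forecast error = actual value - forecast.
errors = [a - f for a, f in zip(actuals, forecasts)]
n = len(errors)

mae = sum(abs(e) for e in errors) / n
mse = sum(e ** 2 for e in errors) / n
mape = sum(abs(e) / a * 100 for e, a in zip(errors, actuals)) / n

print(round(mae, 2), round(mse, 2), round(mape, 2))
```

The printed values match the MAE, MSE, and MAPE computed above.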
Moving Averages and
Exponential Smoothing
 Now we discuss three forecasting methods that are appropriate for a time series with a horizontal pattern:
• Moving Averages
• Weighted Moving Averages
• Exponential Smoothing
 They are called smoothing methods because their
objective is to smooth out the random fluctuations
in the time series.
 They are most appropriate for short-range
forecasts.
Moving Averages

 The moving averages method uses the average of the most recent k data values in the time series as the forecast for the next period:

Ŷt+1 = Σ(most recent k data values)/k = (Yt + Yt−1 + … + Yt−k+1)/k

where: Ŷt+1 = forecast of the time series for period t + 1

 Each observation in the moving average calculation receives the same weight.
Moving Averages

 The term moving is used because every time a new observation becomes available for the time series, it replaces the oldest observation in the equation.
 As a result, the average will change, or move, as
new observations become available.
Moving Averages

 To use moving averages to forecast, we must first select the order k, or number of time series values, to be included in the moving average.
 A smaller value of k will track shifts in a time series
more quickly than a larger value of k.
 If more past observations are considered relevant,
then a larger value of k is better.
Example: Moving Average

 Example: Rosco Drugs


If Rosco Drugs uses a 3-period moving average to
forecast sales, what are the forecasts for weeks 4-11?

Week Sales Week Sales


1 110 6 120
2 115 7 130
3 125 8 115
4 120 9 110
5 125 10 130
Example: Moving Average

Week  Sales  3MA Forecast
  1    110
  2    115
  3    125
  4    120    116.7  ← (110 + 115 + 125)/3
  5    125    120.0
  6    120    123.3
  7    130    121.7
  8    115    125.0
  9    110    121.7
 10    130    118.3
 11           118.3
Example: Moving Average

3MA Forecast
Week  Sales  Forecast  Error  Abs.Error  Sq.Error  Abs.%Error
  1    110
  2    115
  3    125
  4    120    116.7     3.3     3.3      10.89       2.75
  5    125    120.0     5.0     5.0      25.00       4.00
  6    120    123.3    -3.3     3.3      10.89       2.75
  7    130    121.7     8.3     8.3      68.89       6.38
  8    115    125.0   -10.0    10.0     100.00       8.70
  9    110    121.7   -11.7    11.7     136.89      10.64
 10    130    118.3    11.7    11.7     136.89       9.00
Total                   3.3    53.3     489.45      44.22
Example: Moving Average

 3-MA Forecast Accuracy

MAE = 53.3/7 = 7.61
MSE = 489.45/7 = 69.92
MAPE = 44.22/7 = 6.32%

 The 3-week moving average approach provided more accurate forecasts than the naïve approach.
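The 3-period moving average forecasts above can be generated programmatically. A minimal Python sketch, where list slicing does the "moving":

```python
# Rosco Drugs weekly sales (weeks 1-10).
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]
k = 3  # order of the moving average

# Forecast for period t+1 is the mean of the k most recent observations.
# forecasts[0] is the forecast for week 4, ..., forecasts[-1] for week 11.
forecasts = [sum(sales[t - k:t]) / k for t in range(k, len(sales) + 1)]

print([round(f, 1) for f in forecasts])
```

The rounded output reproduces the 3MA Forecast column of the table above (116.7 for week 4 through 118.3 for week 11).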
Weighted Moving Averages

 Weighted Moving Averages


• To use this method we must first select the number
of data values to be included in the average.
• Next, we must choose the weight for each of the
data values.
• The more recent observations are typically
given more weight than older observations.
• For convenience, the weights should sum to 1.
Weighted Moving Averages

 Weighted Moving Averages


• An example of a 3-period weighted moving average
(3WMA) is:

3WMA = .2(110) + .3(115) + .5(125) = 119

Weights (.2, .3, and .5) sum to 1; 125 is the most recent observation.
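The weighted moving average is just a dot product of the weights and the window of observations. A small Python sketch using the weights and window from the example above:

```python
# 3-period weighted moving average (3WMA) with weights .2, .3, .5;
# the most recent observation gets the largest weight (.5).
weights = [0.2, 0.3, 0.5]        # must sum to 1
window = [110, 115, 125]         # weeks 1-3, oldest to most recent

# Forecast for week 4: weighted sum of the window.
forecast = sum(w * y for w, y in zip(weights, window))
print(forecast)
```

This reproduces the 3WMA = 119 computed on the slide.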
Example: Weighted Moving Average

3WMA Forecast
Week  Sales  Forecast  Error  Abs.Error  Sq.Error  Abs.%Error
  1    110
  2    115
  3    125
  4    120    119.0     1.0     1.0       1.00       0.83
  5    125    120.5     4.5     4.5      20.25       3.60
  6    120    123.5    -3.5     3.5      12.25       2.92
  7    130    121.5     8.5     8.5      72.25       6.54
  8    115    126.0   -11.0    11.0     121.00       9.57
  9    110    120.5   -10.5    10.5     110.25       9.55
 10    130    115.5    14.5    14.5     210.25      11.15
Total                   3.5    53.5     547.25      44.16
Example: Weighted Moving Average

 3-WMA Forecast Accuracy

MAE = 53.5/7 = 7.64
MSE = 547.25/7 = 78.18
MAPE = 44.16/7 = 6.31%

The 3-WMA approach (with weights of .2, .3, and .5) provided accuracy very close to that of the 3-MA approach for this data.
Exponential Smoothing

• This method is a special case of a weighted moving averages method; we select only the weight for the most recent observation.
• The weights for the other data values are computed
automatically and become smaller as the
observations grow older.
• The exponential smoothing forecast is a weighted
average of all the observations in the time series.
• The term exponential smoothing comes from the
exponential nature of the weighting scheme for the
historical values.
Exponential Smoothing

• Exponential Smoothing Forecast

Ŷt+1 = αYt + (1 − α)Ŷt

where:
Ŷt+1 = forecast of the time series for period t + 1
Yt = actual value of the time series in period t
Ŷt = forecast of the time series for period t
α = smoothing constant (0 < α < 1)

and let: Ŷ2 = Y1 (to initiate the computations)
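The smoothing recursion above can be sketched in Python. The function name exp_smooth is our own, not from any library:

```python
def exp_smooth(y, alpha):
    """Return exponential smoothing forecasts for periods 2 .. n+1,
    initialized with Y-hat(2) = Y(1)."""
    f = [y[0]]  # Y-hat for period 2 equals the first actual value
    for t in range(1, len(y)):
        # Y-hat(t+1) = alpha * Y(t) + (1 - alpha) * Y-hat(t)
        f.append(alpha * y[t] + (1 - alpha) * f[-1])
    return f

sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]
print([round(v, 2) for v in exp_smooth(sales, 0.1)])
print([round(v, 2) for v in exp_smooth(sales, 0.8)])
```

The two printed lists reproduce the α = .1 and α = .8 forecast sequences worked out below (the last entry in each list is the forecast for week 11).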
Exponential Smoothing

 Exponential Smoothing Forecast

• With some algebraic manipulation, we can rewrite the forecast equation as:

Ŷt+1 = Ŷt + α(Yt − Ŷt)

• We see that the new forecast is equal to the previous forecast plus an adjustment, which is α times the most recent forecast error, Yt − Ŷt.
Exponential Smoothing

 Desirable Value for the Smoothing Constant α
 If the time series contains substantial random variability, a smaller value of α (nearer to zero) is preferred.
 If there is little random variability present, forecast errors are more likely to represent a change in the level of the time series, and a larger value for α is preferred.
Example: Exponential Smoothing

 Example: Rosco Drugs
If Rosco Drugs uses exponential smoothing to forecast sales, which value for the smoothing constant α, .1 or .8, gives better forecasts?

Week  Sales    Week  Sales
  1    110       6    120
  2    115       7    130
  3    125       8    115
  4    120       9    110
  5    125      10    130
Example: Exponential Smoothing

 Using Smoothing Constant Value α = .1

Ŷ2 = Y1 = 110
Ŷ3 = .1Y2 + .9Ŷ2 = .1(115) + .9(110) = 110.50
Ŷ4 = .1Y3 + .9Ŷ3 = .1(125) + .9(110.5) = 111.95
Ŷ5 = .1Y4 + .9Ŷ4 = .1(120) + .9(111.95) = 112.76
Ŷ6 = .1Y5 + .9Ŷ5 = .1(125) + .9(112.76) = 113.98
Ŷ7 = .1Y6 + .9Ŷ6 = .1(120) + .9(113.98) = 114.58
Ŷ8 = .1Y7 + .9Ŷ7 = .1(130) + .9(114.58) = 116.12
Ŷ9 = .1Y8 + .9Ŷ8 = .1(115) + .9(116.12) = 116.01
Ŷ10 = .1Y9 + .9Ŷ9 = .1(110) + .9(116.01) = 115.41
Example: Exponential Smoothing

 Using Smoothing Constant Value α = .8

Ŷ2 = Y1 = 110
Ŷ3 = .8(115) + .2(110) = 114.00
Ŷ4 = .8(125) + .2(114) = 122.80
Ŷ5 = .8(120) + .2(122.80) = 120.56
Ŷ6 = .8(125) + .2(120.56) = 124.11
Ŷ7 = .8(120) + .2(124.11) = 120.82
Ŷ8 = .8(130) + .2(120.82) = 128.16
Ŷ9 = .8(115) + .2(128.16) = 117.63
Ŷ10 = .8(110) + .2(117.63) = 111.53
Example: Exponential Smoothing (a = .1)

α = .1 Forecast
Week  Sales  Forecast  Error  Abs.Error  Sq.Error  Abs.%Error
1 110
2 115 110.00 5.00 5.00 25.00 4.35
3 125 110.50 14.50 14.50 210.25 11.60
4 120 111.95 8.05 8.05 64.80 6.71
5 125 112.76 12.24 12.24 149.94 9.79
6 120 113.98 6.02 6.02 36.25 5.02
7 130 114.58 15.42 15.42 237.73 11.86
8 115 116.12 -1.12 1.12 1.26 0.97
9 110 116.01 -6.01 6.01 36.12 5.46
10 130 115.41 14.59 14.59 212.87 11.22
Total 82.95 974.22 66.98
Example: Exponential Smoothing (a = .1)

 Forecast Accuracy

MAE = 82.95/9 = 9.22
MSE = 974.22/9 = 108.25
MAPE = 66.98/9 = 7.44%

Exponential smoothing (with α = .1) provided less accurate forecasts than the 3-MA approach.
Example: Exponential Smoothing (a = .8)

α = .8 Forecast
Week  Sales  Forecast  Error   Abs.Error  Sq.Error  Abs.%Error
  1    110
  2    115   110.00     5.00     5.00      25.00      4.35
  3    125   114.00    11.00    11.00     121.00      8.80
  4    120   122.80    -2.20     2.20       4.84      1.83
  5    125   120.56     4.44     4.44      19.71      3.55
  6    120   124.11    -4.11     4.11      16.91      3.43
  7    130   120.82     9.18     9.18      84.23      7.06
  8    115   128.16   -13.16    13.16     173.30     11.44
  9    110   117.63    -7.63     7.63      58.26      6.94
 10    130   111.53    18.47    18.47     341.27     14.21
Total                           75.19     844.52     61.61
Example: Exponential Smoothing (a = .8)

 Forecast Accuracy

MAE = 75.19/9 = 8.35
MSE = 844.52/9 = 93.84
MAPE = 61.61/9 = 6.85%

Exponential smoothing (with α = .8) provided more accurate forecasts than ES with α = .1, but less accurate than the 3-MA.
Optimal Smoothing Constant Value

 We choose the value of α that minimizes the mean squared error (MSE).
 Determining the value of α that minimizes MSE is a nonlinear optimization problem.
 These types of optimization models are often referred to as curve fitting models.
Optimal Smoothing Constant Value

 General Formulation

min Σ (Yt − Ŷt)²,  summed over t = 2, 3, …, n

s.t.
Ŷt+1 = αYt + (1 − α)Ŷt,  t = 2, 3, …, n
Ŷ2 = Y1
0 < α < 1

 There are n + 1 decision variables: α and Ŷ2, …, Ŷn+1.
 There are n + 1 constraints.
Example: Optimal Smoothing Constant Value

Example: Rosco Drugs
We will now formulate the optimization model for determining the value of α that minimizes MSE.

Week  Sales    Week  Sales
  1    110       6    120
  2    115       7    130
  3    125       8    115
  4    120       9    110
  5    125      10    130
Example: Optimal Smoothing Constant Value

 Objective Function
 The objective function minimizes the sum of the squared errors:

Minimize { (115 − Ŷ2)² + (125 − Ŷ3)² + (120 − Ŷ4)²
+ (125 − Ŷ5)² + (120 − Ŷ6)² + (130 − Ŷ7)²
+ (115 − Ŷ8)² + (110 − Ŷ9)² + (130 − Ŷ10)² }
Example: Optimal Smoothing Constant Value

 Constraints
 The following constraints define the forecasts as a function of observed and forecasted values:

Ŷ2 = 110
Ŷ3 = α(115) + (1 − α)Ŷ2
Ŷ4 = α(125) + (1 − α)Ŷ3
Ŷ5 = α(120) + (1 − α)Ŷ4
Ŷ6 = α(125) + (1 − α)Ŷ5
Ŷ7 = α(120) + (1 − α)Ŷ6
Ŷ8 = α(130) + (1 − α)Ŷ7
Ŷ9 = α(115) + (1 − α)Ŷ8
Ŷ10 = α(110) + (1 − α)Ŷ9

Finally, the value of α is restricted to: 0 < α < 1
Example: Optimal Smoothing Constant Value

 Optimal Solution
Smoothing constant α = 0.381

Ŷ2 = 110.000    Ŷ6 = 120.715
Ŷ3 = 111.905    Ŷ7 = 120.442
Ŷ4 = 116.894    Ŷ8 = 124.084
Ŷ5 = 118.077    Ŷ9 = 120.623
                Ŷ10 = 116.576
Example: Optimal Smoothing Constant Value

 Forecast Accuracy

MAE = 71.53/9 = 7.95
MSE = 721.48/9 = 80.16
MAPE = 58.78/9 = 6.53%

Exponential smoothing (with α = .381) provided more accurate forecasts than ES with α = .1, but slightly less accurate than the 3-MA.
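Instead of solving the nonlinear program with a spreadsheet solver, the optimal smoothing constant can be approximated by evaluating MSE over a fine grid of α values. This is an illustrative alternative to the formulation above, not the slides' own method:

```python
# Rosco Drugs weekly sales (weeks 1-10).
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]

def mse_for_alpha(y, alpha):
    """MSE of one-step-ahead exponential smoothing forecasts,
    initialized with Y-hat(2) = Y(1)."""
    f = y[0]
    sse = 0.0
    for t in range(1, len(y)):
        sse += (y[t] - f) ** 2
        f = alpha * y[t] + (1 - alpha) * f
    return sse / (len(y) - 1)

# Search alpha = 0.001, 0.002, ..., 0.999 for the smallest MSE.
best = min((a / 1000 for a in range(1, 1000)),
           key=lambda a: mse_for_alpha(sales, a))
print(round(best, 3), round(mse_for_alpha(sales, best), 2))
```

For this data the grid minimum lands at essentially the same α ≈ 0.381 and MSE ≈ 80.16 reported above; because MSE varies smoothly with α, a 0.001-step grid is more than fine enough here.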
End of Chapter 6, Part A