
Quality Engineering

4- Time series modeling via ARIMA

Bianca Maria Colosimo, PhD


[email protected]
References:
"Statistical Process Adjustment for Quality Control", Del Castillo, Wiley
"Time Series Analysis", 3rd edition, Box, Jenkins, Reinsel, Prentice Hall
Why ARIMA?

We previously learned how to identify AR(p) models (via the (S)ACF and (S)PACF) and how to estimate their coefficients (via regression).

Example: 500 consecutive observations from a production process

? Identification ?

More general models: ARIMA

Time series and stationarity

Let $\{X_t\}$ be a discrete time series, then:

The time series is strictly (or strongly) stationary if its properties do not depend on shifts of the time origin, i.e., the joint distribution of $X_{t_1}, X_{t_2}, X_{t_3}, \dots, X_{t_m}$ coincides with the joint distribution of $X_{t_1+k}, X_{t_2+k}, X_{t_3+k}, \dots, X_{t_m+k}$ for every $k$.

We refer to weak stationarity of order $f$ if all the moments of the series up to order $f$ depend only on the time differences between the observations.

E.g., stationarity of 2nd order:

$E(X_t) = \mu$ and $\mathrm{Cov}(X_t, X_{t-k}) = \gamma_k$, for all $t = 1, 2, \dots$

Remind: $\mathrm{Cov}(X_t, X_{t-k}) = \gamma_k = E[(X_t - \mu)(X_{t-k} - \mu)]$, $k = 0, \pm 1, \pm 2, \dots$
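As a side note, a minimal sketch (assuming Python with numpy; the helper name `sample_acf` and the white-noise series are illustrative, not part of these slides) of how the sample autocovariances $c_k$ and autocorrelations $r_k$ used throughout the lecture are computed from data:

```python
import numpy as np

def sample_acf(x, max_lag=20):
    """Sample autocovariances c_k and autocorrelations r_k of a series x."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), np.mean(x)
    # c_k = (1/n) * sum_{t=k+1..n} (x_t - xbar) * (x_{t-k} - xbar)
    c = np.array([np.sum((x[k:] - xbar) * (x[:n - k] - xbar)) / n
                  for k in range(max_lag + 1)])
    return c, c / c[0]                      # r_k = c_k / c_0, with r_0 = 1

rng = np.random.default_rng(0)
white_noise = rng.normal(size=500)          # 2nd-order stationary by construction
c, r = sample_acf(white_noise, max_lag=10)
print(np.round(r, 3))                       # r_k close to 0 for k >= 1
```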

ARMA(p,q)

General form of a linear stochastic model, with $\varepsilon_t \sim \mathrm{NID}(0, \sigma_\varepsilon^2)$:

$X_t = \xi + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} - \theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} - \dots - \theta_q\varepsilon_{t-q} + \varepsilon_t$

Most stationary processes can be modeled by including:

- an autoregressive term of order p (the AR(p) term);
- a 'moving average' term that links the observation at time t to the previous q random errors (the MA(q) term).

Let's go into the details.

AutoRegressive Models AR(p)

Model AR(1) (Markov process): $X_t = \xi + \phi_1 X_{t-1} + \varepsilon_t$, with $\varepsilon_t \sim \mathrm{NID}(0, \sigma^2)$

For a stationary process, $E(X_t) = \mu$:

$E(X_t) = \xi + \phi_1 E(X_{t-1}) + 0 \;\Rightarrow\; (1 - \phi_1)\mu = \xi$

$X_t = (1 - \phi_1)\mu + \phi_1 X_{t-1} + \varepsilon_t$

$X_t - \mu = \phi_1 (X_{t-1} - \mu) + \varepsilon_t \;\Rightarrow\; \tilde X_t \equiv X_t - \mu = \phi_1 \tilde X_{t-1} + \varepsilon_t$

Let's introduce the backward shift (backshift) operator $B$: $B X_t = X_{t-1}$

$\tilde X_t = \phi_1 \tilde X_{t-1} + \varepsilon_t = \phi_1 B \tilde X_t + \varepsilon_t \;\Rightarrow\; (1 - \phi_1 B)\tilde X_t = \varepsilon_t$

AR(p)

Generally speaking: AR(p) model:

$X_t = \xi + \phi_1 X_{t-1} + \phi_2 X_{t-2} + \dots + \phi_p X_{t-p} + \varepsilon_t$

Analogously, for a stationary process, $E(X_t) = \mu$:

$\left(1 - \sum_{i=1}^{p}\phi_i\right)\mu = \xi \;\Rightarrow\; \tilde X_t \equiv X_t - \mu = \sum_{i=1}^{p}\phi_i \tilde X_{t-i} + \varepsilon_t$

Backshift operator $B$: $X_{t-2} = B X_{t-1} = B(B X_t) = B^2 X_t$

$\tilde X_t = \sum_{i=1}^{p}\phi_i B^i \tilde X_t + \varepsilon_t \;\Rightarrow\; A(B)\tilde X_t = \varepsilon_t, \quad$ where $A(B) = 1 - \sum_{i=1}^{p}\phi_i B^i$
Stationarity

The AR(p) process is stationary if and only if the polynomial $A(B)$ is stable, i.e., all of its roots lie strictly outside the unit circle in the complex plane.

[Figure: unit circle in the complex plane (Re-Im axes)]

E.g., AR(1): $(1 - \phi_1 B)\tilde X_t = \varepsilon_t$

$A(B) = 1 - \phi_1 B = 0 \;\Rightarrow\; B = \dfrac{1}{\phi_1}$: $\;|\phi_1| < 1$ stationary, $\;|\phi_1| \ge 1$ non-stationary

Jury's test is used to test the stability of AR(p) processes.
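A minimal sketch of the root check (assuming Python with numpy; the helper `is_stationary` and the coefficient values are illustrative): build $A(B) = 1 - \phi_1 B - \dots - \phi_p B^p$ and verify that all roots lie strictly outside the unit circle:

```python
import numpy as np

def is_stationary(phi):
    """AR(p) stationarity check: all roots of A(B) = 1 - phi_1*B - ... - phi_p*B^p
    must lie strictly outside the unit circle."""
    coeffs = np.concatenate(([1.0], -np.asarray(phi, dtype=float)))  # ascending powers of B
    roots = np.polynomial.polynomial.polyroots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.8]))        # AR(1), phi_1 = 0.8        -> True
print(is_stationary([1.0]))        # phi_1 = 1: random walk    -> False
print(is_stationary([0.7, -0.4]))  # a stationary AR(2)        -> True
```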
Some intuitive remarks

$\tilde X_t = \phi_1 \tilde X_{t-1} + \varepsilon_t = \phi_1(\phi_1 \tilde X_{t-2} + \varepsilon_{t-1}) + \varepsilon_t = \phi_1^2 \tilde X_{t-2} + \phi_1\varepsilon_{t-1} + \varepsilon_t =$

$= \phi_1^2(\phi_1 \tilde X_{t-3} + \varepsilon_{t-2}) + \phi_1\varepsilon_{t-1} + \varepsilon_t = \phi_1^3 \tilde X_{t-3} + \phi_1^2\varepsilon_{t-2} + \phi_1\varepsilon_{t-1} + \varepsilon_t = \dots = \sum_{i=0}^{\infty}\phi_1^i\varepsilon_{t-i}$

$|\phi_1| < 1 \;\Rightarrow\;$ the weights $\phi_1^i$ decay geometrically and the infinite sum converges: stationary process

$\phi_1 = 1 \;\Rightarrow\;$ 'random walk', a particular type of non-stationary AR(1) process

Moments of an AR(p) process

Mean: for a stationary process, $E(X_t) = \mu$.

Remind: $\left(1 - \sum_{i=1}^{p}\phi_i\right)\mu = \xi \;\Rightarrow\; \mu = \dfrac{\xi}{1 - \sum_{i=1}^{p}\phi_i}$

E.g., stationary AR(1) ($|\phi_1| < 1$): $\mu = \dfrac{\xi}{1 - \phi_1}$

Remind also: $E(\varepsilon_i\varepsilon_j) = 0$ for $i \ne j$, $\;E(\varepsilon_i\varepsilon_j) = \sigma_\varepsilon^2$ for $i = j$

Autocovariance and autocorrelation:

$\gamma_k = \mathrm{Cov}(X_t, X_{t-k}) = E[(X_t - \mu)(X_{t-k} - \mu)] = E[\tilde X_t \tilde X_{t-k}] =$

$= E[(\phi_1\tilde X_{t-1} + \phi_2\tilde X_{t-2} + \dots + \phi_p\tilde X_{t-p} + \varepsilon_t)\tilde X_{t-k}] =$

$= E[\phi_1\tilde X_{t-1}\tilde X_{t-k}] + E[\phi_2\tilde X_{t-2}\tilde X_{t-k}] + \dots + E[\phi_p\tilde X_{t-p}\tilde X_{t-k}] + E[\varepsilon_t\tilde X_{t-k}] =$

$= \phi_1\gamma_{k-1} + \phi_2\gamma_{k-2} + \dots + \phi_p\gamma_{k-p} \qquad k = 1, 2, \dots$ (for $k \ge 1$, $E[\varepsilon_t\tilde X_{t-k}] = 0$)

$\rho_k = \dfrac{\gamma_k}{\gamma_0} = \phi_1\rho_{k-1} + \phi_2\rho_{k-2} + \dots + \phi_p\rho_{k-p} \qquad k = 1, 2, \dots$
The same finite difference equation of the original AR(p) process applies
also to the autocovariance and autocorrelation functions

Variance:

$\sigma_X^2 = \gamma_0 = \phi_1\gamma_{-1} + \phi_2\gamma_{-2} + \dots + \phi_p\gamma_{-p} + E(\varepsilon_t\tilde X_t) = \phi_1\gamma_1 + \phi_2\gamma_2 + \dots + \phi_p\gamma_p + \sigma_\varepsilon^2$

(the autocovariance is a symmetric function, $\gamma_{-k} = \gamma_k$, and $E(\varepsilon_t\tilde X_t) = \sigma_\varepsilon^2$)

Divide the terms by $\sigma_X^2 = \gamma_0$:

$1 = \phi_1\rho_1 + \phi_2\rho_2 + \dots + \phi_p\rho_p + \dfrac{\sigma_\varepsilon^2}{\sigma_X^2} \;\Rightarrow\; \sigma_X^2 = \dfrac{\sigma_\varepsilon^2}{1 - \sum_{i=1}^{p}\phi_i\rho_i}$

E.g., stationary AR(1) ($|\phi_1| < 1$):

$\gamma_0 = \sigma_X^2 = \dfrac{\sigma_\varepsilon^2}{1 - \phi_1\rho_1}$

$\gamma_k = \phi_1\gamma_{k-1}$, $k = 1, 2, \dots$: $\quad \gamma_1 = \phi_1\gamma_0 = \dfrac{\phi_1\sigma_\varepsilon^2}{1 - \phi_1\rho_1}, \quad \gamma_2 = \phi_1\gamma_1 = \dfrac{\phi_1^2\sigma_\varepsilon^2}{1 - \phi_1\rho_1}, \;\dots, \quad \gamma_k = \dfrac{\phi_1^k\sigma_\varepsilon^2}{1 - \phi_1\rho_1}$

$\rho_k = \dfrac{\gamma_k}{\gamma_0}$, $k = 1, 2, \dots$: $\quad \rho_1 = \phi_1, \quad \rho_2 = \phi_1^2, \;\dots, \quad \rho_k = \phi_1^k$
For a stationary AR(1) process, the autocorrelation function (ACF) "decays geometrically" (if time were continuous, the decay would be exponential): $\rho_k = \phi_1^k$

[Figures: SACF $\hat\rho_k = r_k$ of a simulated AR(1) with $\phi_1 = 0.8$ and of one with $\phi_1 = -0.8$]

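A minimal sketch (assuming Python with numpy and statsmodels; simulated data) comparing the theoretical decay $\rho_k = \phi_1^k$ with the SACF of a simulated AR(1):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(1)
phi1, n = 0.8, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi1 * x[t - 1] + rng.normal()   # AR(1) with xi = 0, sigma_e = 1

print(np.round(acf(x, nlags=5), 2))         # SACF: r_0, r_1, ..., r_5
print(np.round(phi1 ** np.arange(6), 2))    # theoretical rho_k = phi1^k
```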
Note: it's a general result, true for every stationary AR(p) process.

How to identify the order p of an AR(p) process?

→ Partial AutoCorrelation Function (PACF)

For an AR(1): $\tilde X_t = \phi_1\tilde X_{t-1} + \varepsilon_t = \phi_1^2\tilde X_{t-2} + \phi_1\varepsilon_{t-1} + \varepsilon_t = \phi_1^3\tilde X_{t-3} + \phi_1^2\varepsilon_{t-2} + \phi_1\varepsilon_{t-1} + \varepsilon_t$

[Diagram: $\tilde X_t$, $\tilde X_{t-1}$, $\tilde X_{t-2}$, $\tilde X_{t-3}$ linked by the direct coefficients $\phi_1$; the link between $\tilde X_t$ and $\tilde X_{t-2}$ is only the indirect one, $\phi_1^2$]

If one tries to estimate the coefficients of the AR(1) model in an incremental way (adding one lag at a time):

$\tilde X_t = \phi_1\tilde X_{t-1} + \varepsilon_t$

$\tilde X_t = \phi_{11}\tilde X_{t-1} + \phi_{12}\tilde X_{t-2} + \varepsilon_t$

the coefficient of the last lag added ($\phi_{12}$ here) is expected to be 0: once $\tilde X_{t-1}$ is accounted for, $\tilde X_{t-2}$ carries no additional (partial) information.
Example: AR(1)

$X_t = \xi + \phi_1 X_{t-1} + \varepsilon_t$, with $\xi = 0.5$, $\phi_1 = 0.8$

Regression Analysis: Y(t) versus Y(t)_1


The regression equation is Y(t) = 0.792 + 0.773 Y(t)_1
Predictor Coef SE Coef T P
Constant 0.7917 0.2337 3.39 0.001
Y(t)_1 0.77308 0.06212 12.44 0.000
S = 1.064 R-Sq = 61.2% R-Sq(adj) = 60.9%

Regression Analysis: Y(t) versus Y(t)_2
The regression equation is: Y(t) = 1.52 + 0.565 Y(t)_2
Predictor Coef SE Coef T P
Constant 1.5199 0.3070 4.95 0.000
Y(t)_2 0.56527 0.08198 6.90 0.000

S = 1.396 R-Sq = 32.9% R-Sq(adj) = 32.2%

Regression Analysis: Y(t) versus Y(t)_1, Y(t)_2


The regression equation is
Y(t) = 0.858 + 0.848 Y(t)_1 - 0.096 Y(t)_2
Predictor Coef SE Coef T P
Constant 0.8578 0.2485 3.45 0.001
Y(t)_1 0.8482 0.1020 8.31 0.000
Y(t)_2 -0.0958 0.1014 -0.94 0.347
S = 1.070 R-Sq = 61.0% R-Sq(adj) = 60.2%

In the regression Y(t) versus Y(t)_1, Y(t)_2, the coefficient of Y(t)_2 is not significantly different from zero (P = 0.347): the additional lag carries no partial information, consistent with an AR(1) process.
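The Minitab regressions above can be reproduced with ordinary least squares. A minimal sketch (assuming Python with numpy and statsmodels; the simulated series `y` stands in for Y(t), which is not distributed with these slides):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
xi, phi1, n = 0.5, 0.8, 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = xi + phi1 * y[t - 1] + rng.normal()

# Y(t) versus Y(t)_1 and Y(t)_2: the lag-2 coefficient should not be significant
Y = y[2:]
X = sm.add_constant(np.column_stack([y[1:-1], y[:-2]]))   # [const, Y(t)_1, Y(t)_2]
fit = sm.OLS(Y, X).fit()
print(np.round(fit.params, 3))     # roughly [0.5, 0.8, 0.0]
print(np.round(fit.pvalues, 3))    # the last p-value should be large
```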
Non stationary AR(1) process: random walk with drift

$X_t = \xi + X_{t-1} + \varepsilon_t, \qquad \xi \ne 0$: "drift"

$X_t = \xi + X_{t-1} + \varepsilon_t = 2\xi + X_{t-2} + \varepsilon_{t-1} + \varepsilon_t = 3\xi + X_{t-3} + \varepsilon_{t-2} + \varepsilon_{t-1} + \varepsilon_t = \dots = t\xi + \sum_{i=0}^{t-1} B^i\varepsilon_t$

$E(X_t) = \mu_t = t\xi$

$V(X_t) = t\sigma_\varepsilon^2$

It's possible to demonstrate that, with $X_0 = 0$: $\gamma_{k,t} \cong t\sigma_\varepsilon^2$ and $\rho_k \cong 1$ for all $k$.

Thus, the ACF does not exhibit a decreasing trend.

Generally speaking, if the ACF (SACF) does not exhibit a decreasing trend (or decreases very slowly), the process is not stationary.
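A minimal sketch (assuming Python with numpy and statsmodels; simulated data) of a random walk with drift and its slowly decaying SACF:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(3)
xi, n = 0.1, 500
x = np.cumsum(xi + rng.normal(size=n))     # X_t = xi + X_{t-1} + eps_t, with X_0 = 0

print(np.round(acf(x, nlags=10), 2))       # values stay close to 1: no real decay
```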
Example: Dow Jones

(dow.dat)

Regression Analysis: dow versus dow_1


The regression equation is
dow = 75.7 + 0.967 dow_1
199 cases used 1 cases contain missing values
Predictor Coef SE Coef T P
Constant 75.73 38.07 1.99 0.048
dow_1 0.96703 0.01772 54.56 0.000
S = 74.52 R-Sq = 93.8% R-Sq(adj) = 93.8%
AR(2):

$X_t = \xi + \phi_1 X_{t-1} + \phi_2 X_{t-2} + \varepsilon_t$

Stationarity conditions (from Jury's test), demonstrate that:

$|\phi_2| < 1 \qquad \phi_1 + \phi_2 < 1 \qquad \phi_2 - \phi_1 < 1$

$\mu = \dfrac{\xi}{1 - \phi_1 - \phi_2}$

$\rho_k = \dfrac{\gamma_k}{\gamma_0} = \phi_1\rho_{k-1} + \phi_2\rho_{k-2} \qquad k = 1, 2, \dots$

$\sigma_X^2 = \dfrac{\sigma_\varepsilon^2}{1 - \phi_1\rho_1 - \phi_2\rho_2}$

$\begin{cases}\rho_1 = \phi_1 + \phi_2\rho_1\\ \rho_2 = \phi_1\rho_1 + \phi_2\end{cases} \;\Rightarrow\; \begin{cases}\phi_1 = \dfrac{\rho_1(1-\rho_2)}{1-\rho_1^2}\\[6pt] \phi_2 = \dfrac{\rho_2-\rho_1^2}{1-\rho_1^2}\end{cases}$

Remark: point estimates of the AR(2) process parameters can be obtained from the estimated autocorrelation coefficients at lags 1 and 2 (see the sketch below); the approach extends to AR(p) processes.
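A minimal sketch of the remark (assuming Python with numpy and statsmodels; the parameter values are illustrative): estimate $\phi_1$ and $\phi_2$ from the sample autocorrelations at lags 1 and 2 with the formulas above:

```python
import numpy as np
from statsmodels.tsa.stattools import acf
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(4)
phi1, phi2 = 0.7, -0.4                                   # a stationary AR(2)
x = ArmaProcess(ar=[1, -phi1, -phi2], ma=[1]).generate_sample(nsample=2000, burnin=100)

r1, r2 = acf(x, nlags=2)[1:]
phi1_hat = r1 * (1 - r2) / (1 - r1**2)                   # formulas from this slide
phi2_hat = (r2 - r1**2) / (1 - r1**2)
print(round(phi1_hat, 2), round(phi2_hat, 2))            # close to 0.7 and -0.4
```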
Moving Average models: MA(q)

When the process can be represented by a weighted average of random shocks $\varepsilon_t$, the process is referred to as a moving average process.

$\mathrm{MA}(q): \quad X_t = \mu - \theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} - \dots - \theta_q\varepsilon_{t-q} + \varepsilon_t$

$\tilde X_t = X_t - \mu = -\theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} - \dots - \theta_q\varepsilon_{t-q} + \varepsilon_t$

"Moving average": the observation at time t can be estimated as an average of the random shock at time t ($\varepsilon_t$) and the previous q shocks (even though the term is improper, as the sum of the weights $\theta_i$ is not necessarily equal to 1).

With the backshift operator: $\tilde X_t = (1 - \theta_1 B - \theta_2 B^2 - \dots - \theta_q B^q)\varepsilon_t \equiv C(B)\varepsilon_t$

MA(q) processes are always stationary.


Invertibility of a MA(q) process – (briefly)

Note: the invertibility concept for MA(q) processes is analogous to the stationarity
concept for AR(p) processes

A time series process is invertible if it can be expressed as an AR process as follows:


$\tilde X_t = \pi_1\tilde X_{t-1} + \pi_2\tilde X_{t-2} + \dots + \varepsilon_t$

where the sum is allowed to have an infinite number of terms, but it must converge to a finite value.

Example, MA(1): $\tilde X_t = -\theta_1\varepsilon_{t-1} + \varepsilon_t = (1 - \theta_1 B)\varepsilon_t$

$\varepsilon_t = \dfrac{1}{1 - \theta_1 B}\tilde X_t = \sum_{i=0}^{\infty}\theta_1^i B^i \tilde X_t$

Invertibility condition for a MA(1) process: $|\theta_1| < 1$

Moments of a MA(q) process

Mean: $E(X_t) = E(\mu - \theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} - \dots - \theta_q\varepsilon_{t-q} + \varepsilon_t) = \mu$

Variance: $\gamma_0 = \sigma_X^2 = V(\mu - \theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} - \dots - \theta_q\varepsilon_{t-q} + \varepsilon_t) = (1 + \theta_1^2 + \theta_2^2 + \dots + \theta_q^2)\,\sigma_\varepsilon^2$

Autocovariance:

$\gamma_k = \mathrm{Cov}(X_t, X_{t-k}) = E[(X_t - \mu)(X_{t-k} - \mu)] = E[\tilde X_t\tilde X_{t-k}] =$

$= E[(\varepsilon_t - \theta_1\varepsilon_{t-1} - \dots - \theta_q\varepsilon_{t-q})(\varepsilon_{t-k} - \theta_1\varepsilon_{t-k-1} - \dots - \theta_q\varepsilon_{t-k-q})]$

Using $E(\varepsilon_i\varepsilon_j) = 0$ for $i \ne j$ and $E(\varepsilon_i\varepsilon_j) = \sigma_\varepsilon^2$ for $i = j$, for $k \le q$ only the cross-products involving the same shock survive, so:

$\gamma_k = \begin{cases}(-\theta_k + \theta_1\theta_{k+1} + \theta_2\theta_{k+2} + \dots + \theta_{q-k}\theta_q)\,\sigma_\varepsilon^2 & k = 1, 2, \dots, q\\ 0 & k > q\end{cases}$

Autocorrelation:

$\rho_k = \dfrac{\gamma_k}{\gamma_0} = \begin{cases}\dfrac{-\theta_k + \theta_1\theta_{k+1} + \theta_2\theta_{k+2} + \dots + \theta_{q-k}\theta_q}{1 + \theta_1^2 + \theta_2^2 + \dots + \theta_q^2} & k = 1, 2, \dots, q\\[6pt] 0 & k > q\end{cases}$

By means of the SACF one can identify a MA process and its order q, depending on the number of lags such that $\rho_k \ne 0$.

Special cases:

MA(1): $X_t = \mu - \theta_1\varepsilon_{t-1} + \varepsilon_t, \quad E(X_t) = \mu, \quad \gamma_0 = (1 + \theta_1^2)\,\sigma_\varepsilon^2, \quad \rho_k = \begin{cases}\dfrac{-\theta_1}{1 + \theta_1^2} & k = 1\\ 0 & k > 1\end{cases}$

MA(2): $X_t = \mu - \theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} + \varepsilon_t, \quad E(X_t) = \mu, \quad \gamma_0 = (1 + \theta_1^2 + \theta_2^2)\,\sigma_\varepsilon^2, \quad \rho_k = \begin{cases}\dfrac{-\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2} & k = 1\\[6pt] \dfrac{-\theta_2}{1 + \theta_1^2 + \theta_2^2} & k = 2\\ 0 & k > 2\end{cases}$
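A minimal sketch (assuming Python with numpy and statsmodels; $\theta_1 = 0.6$ is illustrative) checking the MA(1) formulas; note that statsmodels writes the MA polynomial with a plus sign, so the slides' $-\theta_1$ is passed as the lag-1 coefficient:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

theta1 = 0.6
proc = ArmaProcess(ar=[1], ma=[1, -theta1])   # MA(1): X_t = eps_t - theta1 * eps_{t-1}

print(np.round(proc.acf(lags=4), 3))          # [1, rho_1, 0, 0]: cut-off after lag 1
print(round(-theta1 / (1 + theta1**2), 3))    # closed-form rho_1 from this slide
```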
Initial example: it was a MA(1) process

MA(1)

Example: 500 consecutive observations from a production process

ARMA(p,q) models

A model that includes both the AR(p) and the MA(q) terms:

$X_t = \xi + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} - \theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} - \dots - \theta_q\varepsilon_{t-q} + \varepsilon_t$

$\tilde X_t = X_t - \mu = \phi_1\tilde X_{t-1} + \dots + \phi_p\tilde X_{t-p} - \theta_1\varepsilon_{t-1} - \theta_2\varepsilon_{t-2} - \dots - \theta_q\varepsilon_{t-q} + \varepsilon_t$

$(1 - \phi_1 B - \dots - \phi_p B^p)\tilde X_t = (1 - \theta_1 B - \dots - \theta_q B^q)\varepsilon_t$

$A(B)\tilde X_t = C(B)\varepsilon_t$

An ARMA process is stationary if its AR term is stationary (it is invertible if its MA term is invertible).

Moments of an ARMA(p,q) process

Mean: $E(X_t) = \mu = \dfrac{\xi}{1 - \sum_{i=1}^{p}\phi_i}$
Autocovariances and autocorrelations (obtained by multiplying by $\tilde X_{t-k}$ and taking expectations):

$\gamma_k = \phi_1\gamma_{k-1} + \phi_2\gamma_{k-2} + \dots + \phi_p\gamma_{k-p} \qquad k \ge q + 1$

$\rho_k = \dfrac{\gamma_k}{\gamma_0} = \phi_1\rho_{k-1} + \phi_2\rho_{k-2} + \dots + \phi_p\rho_{k-p} \qquad k \ge q + 1$

Apart from the first q lags, the ACF "resembles" the ACF of an AR(p) process.

Generally speaking, one tries to fit an ARMA(p,q) model after having tried, unsuccessfully (based on residual diagnostics), to fit a 'pure' AR and/or MA model.
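A minimal sketch (assuming Python with statsmodels; simulated data with illustrative values $\phi_1 = 0.8$, $\theta_1 = 0.5$) of fitting an ARMA(1,1), i.e., an ARIMA(1,0,1); statsmodels uses the plus convention for the MA polynomial, so the reported ma.L1 is the opposite of the slides' $\theta_1$:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(5)
phi1, theta1 = 0.8, 0.5
x = ArmaProcess(ar=[1, -phi1], ma=[1, -theta1]).generate_sample(nsample=1000, burnin=100)

res = ARIMA(x, order=(1, 0, 1)).fit()     # ARMA(1,1) = ARIMA(1,0,1)
print(res.params)                         # [const, ar.L1 ~ 0.8, ma.L1 ~ -0.5, sigma2]
```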
ARIMA (p,d,q) models

Most industrial processes are not stationary: when no control action is applied, the process mean tends to depart from the target.

A non-stationary ARIMA model exhibits a stationary/non-stationary behaviour that depends on the AR term of the model (roots of the A(B) polynomial). In particular:

1. Roots lie strictly outside the unit circle in the complex plane: stationarity
2. Roots lie strictly inside the unit circle in the complex plane: 'explosive' non-stationarity
3. Roots lie on the unit circle in the complex plane: 'homogeneous' non-stationarity

Case 2 is pretty rare in production processes.

$A(B)\tilde X_t = A_{p'}(B)(1 - B)^d\tilde X_t = C_q(B)\varepsilon_t$

where:
• $A_{p'}(B)$ is the degree-p polynomial (AR term) with all roots falling outside the unit circle;
• d is the degree of the I term (integrated) within the ARIMA model;
• $C_q(B)$ is the degree-q polynomial of the MA term.
In order to deal with ARIMA(p,d,q) processes, one has to transform the process into a stationary one by applying the difference operator (nabla):

$\nabla X_t \equiv X_t - X_{t-1} = (1 - B)X_t$

By inverting the $\nabla$ operator one gets the sum operator:

$S X_t \equiv \sum_{i=-\infty}^{t} X_i = (1 + B + B^2 + B^3 + \dots)X_t = \dfrac{X_t}{1 - B} = \dfrac{1}{\nabla}X_t = \nabla^{-1}X_t$

under the assumption of an infinite number of elements. In the 'continuous' domain, integrals replace sums: this is where the name 'Integrated' in the ARIMA model originates from.

Example 1: random walk

$X_t = X_{t-1} + \varepsilon_t$

$\nabla X_t = X_t - X_{t-1} = \varepsilon_t$: the differenced process becomes a white noise (stationary)

Example 2: ARIMA(0,1,1) = IMA(1,1)

A process that often occurs in industrial applications is the IMA(1,1):

$(1 - B)X_t = -\theta_1\varepsilon_{t-1} + \varepsilon_t$
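A minimal sketch (assuming Python with numpy) of Example 1: differencing a simulated random walk returns the underlying white noise:

```python
import numpy as np

rng = np.random.default_rng(6)
eps = rng.normal(size=500)
x = np.cumsum(eps)                  # random walk: X_t = X_{t-1} + eps_t

dx = np.diff(x)                     # nabla X_t = X_t - X_{t-1}
print(np.allclose(dx, eps[1:]))     # True: the differenced series is the white noise
```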
Development of an ARIMA(p,d,q) model

Iterative procedure:

1. Identification: finding the values of p, d, q
2. Coefficient estimation: $(\hat\phi_1, \dots, \hat\phi_p, \hat\theta_1, \dots, \hat\theta_q)$
3. Diagnostic check on the residuals: if the check fails, go back to the identification step; if it passes, STOP

Identification of an ARIMA(p,d,q) process

Goal: finding the values of p, d, q from a given time series $\{x_t\}$.
One has to use the SACF (Sample AutoCorrelation Function) and the SPACF (Sample Partial AutoCorrelation Function).
Identification: d parameter

1. Non-stationarity (I term): inferred when the SACF does not exhibit an exponential decay (e.g., it decays linearly)
2. If the process seems to be non-stationary: apply the difference operator. If the resulting time series still exhibits a non-stationary behaviour, apply the difference operator again (iteratively)

Pay attention to "overdifferencing":

A simple way to detect "overdifferencing" consists of computing

$\mathrm{Var}(x_t), \mathrm{Var}(\nabla x_t), \dots, \mathrm{Var}(\nabla^n x_t)$

and choosing d such that the variance of the series $\nabla^d x_t$ is minimized.
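A minimal sketch of the overdifferencing check (assuming Python with numpy; the helper `best_d` is illustrative): compute the variance of the series for d = 0, 1, 2, ... and keep the d giving the smallest one:

```python
import numpy as np

def best_d(x, max_d=3):
    """Differencing order d that minimizes the variance of nabla^d x."""
    variances = [float(np.var(np.diff(x, n=d))) for d in range(max_d + 1)]
    return int(np.argmin(variances)), variances

rng = np.random.default_rng(7)
x = np.cumsum(rng.normal(size=500))      # random walk: d = 1 expected
print(best_d(x))
```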

Identification of other parameters (p,q)

1. Check the SACF and SPACF patterns, reminding that:

• AR(p): the SACF shows "exponential decay", whereas the SPACF is used to choose the order p
• MA(q): the SACF is used to choose the order q, whereas the SPACF shows "exponential decay"
• ARMA(p,q): the SACF resembles that of an AR(p) after the first q lags

Attention, fundamental principle (Box-Jenkins): "PARSIMONY"

Most processes can be represented by time series models with low values of p and q, which are easier to interpret.
Values of p or q equal to (or larger than) 3 are rarely observed in practice (see the sketch of the identification plots below).
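A minimal sketch of the identification step (assuming Python with statsmodels and matplotlib; the AR(2) series is simulated for illustration): inspect the SACF and SPACF of the, possibly differenced, series:

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

np.random.seed(8)
x = ArmaProcess(ar=[1, -0.7, 0.4], ma=[1]).generate_sample(nsample=300, burnin=100)  # AR(2)

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(x, lags=25, ax=axes[0])     # exponential decay -> AR behaviour
plot_pacf(x, lags=25, ax=axes[1])    # cut-off after lag 2 -> choose p = 2
plt.show()
```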

After the identification step (p,d,q): coefficient estimation

Maximum likelihood method (not covered in this course), implemented in Minitab.

Note: we noticed that regression (least squares method) can be used for AR(p) models too; when n is large enough, the two methods converge to the same results.

After coefficient estimation: residual computation and diagnostics

1. LBQ (Ljung-Box Q) test
2. Normality test (if required for the residual chart, SCC, presented in the next slides)
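A minimal sketch of the LBQ check (assuming Python with statsmodels; `resid` is a stand-in for the residuals of whatever model was fitted):

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

np.random.seed(9)
resid = np.random.normal(size=100)        # stand-in for the residuals of a fitted model

# set model_df = p + q of the fitted ARMA terms, as in Minitab's DF = lag - p - q
print(acorr_ljungbox(resid, lags=[12, 24, 36, 48], model_df=0))
```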
To summarize (Box-Jenkins approach), starting from the time series $x_t$:

1. Compute the SACF; set d = 0
2. Is the series stationary? If not, set d := d + 1, apply $\nabla^d$ and go back to step 1
3. From the SACF and SPACF, choose the ARIMA model to test
4. Estimate the coefficients and compute the residuals
5. Are the residuals IID (NID)? If yes, STOP; if not, find a model for the residuals* and repeat the check
* One can try to identify and fit an ARMA($p_e$,$q_e$) model on the residuals:

$\hat A_{p_e}(B)\,e_t = \hat C_{q_e}(B)\,a_t$

If the model is correct ($a_t$ = IID), such a model can be combined with the original ARIMA(p,d,q) model, as follows:

$\hat A(B)\nabla^d\tilde X_t = \hat C(B)\,e_t \qquad e_t = \dfrac{\hat C_{q_e}(B)}{\hat A_{p_e}(B)}\,a_t$

$\hat A(B)\nabla^d\tilde X_t = \hat C(B)\,\dfrac{\hat C_{q_e}(B)}{\hat A_{p_e}(B)}\,a_t$

$\hat A(B)\hat A_{p_e}(B)\nabla^d\tilde X_t = \hat C(B)\hat C_{q_e}(B)\,a_t$

The result is an ARIMA($p+p_e$, d, $q+q_e$): pay attention to the parsimony principle.
Example: viscosity data

Viscosity of a chemical product
(source: Montgomery D.C., Johnson, L.A., Gardiner, J.S., Forecasting and Time Series Analysis, McGraw-Hill)

[Time series plot of the 100 viscosity observations, values roughly between 15 and 35]

Stationarity seems ok

Runs Test: viscosity

K = 28.5696
The observed number of runs = 37
The expected number of runs = 50.0200
57 Observations above K 43 below
The test is significant at 0.0076

Autocorrelation Function for viscosity
[SACF bar plot for viscosity, lags 1 to 25]

Lag Corr T LBQ Lag Corr T LBQ Lag Corr T LBQ Lag Corr T LBQ

1 0.49 4.94 25.13 8 -0.01 -0.08 51.19 15 -0.12 -0.87 57.41 22 -0.04 -0.30 68.37
2 -0.05 -0.41 25.39 9 -0.09 -0.66 52.16 16 -0.11 -0.78 58.95 23 -0.17 -1.12 72.13
3 -0.26 -2.16 32.73 10 -0.11 -0.75 53.44 17 -0.09 -0.62 59.96 24 -0.19 -1.22 76.74
4 -0.28 -2.22 41.25 11 -0.07 -0.53 54.08 18 0.00 0.03 59.96 25 -0.06 -0.41 77.27
5 -0.07 -0.54 41.82 12 0.03 0.22 54.19 19 0.16 1.08 63.06
6 0.22 1.63 46.99 13 0.01 0.09 54.21 20 0.18 1.24 67.29
7 0.20 1.42 51.18 14 -0.11 -0.74 55.54 21 0.08 0.53 68.12

• Stationarity seems ok
• SACF seems to exhibit an exponential decay
Partial Autocorrelation Function for viscosity
[SPACF bar plot for viscosity, lags 1 to 25]

Lag PAC T Lag PAC T Lag PAC T Lag PAC T

1 0.49 4.94 8 -0.06 -0.62 15 -0.02 -0.22 22 -0.03 -0.27


2 -0.39 -3.88 9 0.06 0.64 16 -0.10 -1.00 23 -0.07 -0.65
3 -0.06 -0.60 10 -0.02 -0.18 17 -0.05 -0.52 24 -0.04 -0.42
4 -0.15 -1.50 11 -0.03 -0.34 18 -0.00 -0.05 25 -0.05 -0.46
5 0.13 1.33 12 -0.00 -0.03 19 0.16 1.64
6 0.16 1.62 13 -0.11 -1.05 20 0.05 0.53
7 -0.13 -1.26 14 -0.07 -0.71 21 -0.04 -0.36

Let’s try with an AR(2)


ARIMA Model: viscosity
ARIMA model for viscosity
Final Estimates of Parameters
Type Coef SE Coef T P
AR 1 0.7187 0.0923 7.79 0.000
AR 2 -0.4344 0.0922 -4.71 0.000
Constant 20.5061 0.3279 62.53 0.000
Mean 28.6516 0.4582
Number of observations: 100
Residuals: SS = 1042.78 (backforecasts excluded)
MS = 10.75 DF = 97
Modified Box-Pierce (Ljung-Box) Chi-Square statistic
Lag 12 24 36 48
Chi-Square 12.2 21.1 28.3 38.0
DF 9 21 33 45
P-Value 0.202 0.451 0.701 0.759

Estimated model: $\hat X_t = 20.506 + 0.7187 X_{t-1} - 0.4344 X_{t-2}$
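A minimal sketch of an equivalent fit (assuming Python with statsmodels). The original viscosity series is not distributed with these slides, so a stand-in series is simulated from the reported model, for illustration only:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

# Stand-in data simulated from the reported model (mean ~ 28.65, residual MS ~ 10.75)
np.random.seed(10)
viscosity = 28.65 + ArmaProcess(ar=[1, -0.7187, 0.4344], ma=[1]).generate_sample(
    nsample=100, scale=np.sqrt(10.75), burnin=100)

res = ARIMA(viscosity, order=(2, 0, 0)).fit()   # AR(2): no differencing, no MA term
print(res.params)   # [const, ar.L1, ar.L2, sigma2]; compare ar.L1, ar.L2 with 0.7187 and -0.4344
```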


Estimated model: $\hat X_t = 20.506 + 0.7187 X_{t-1} - 0.4344 X_{t-2}$

Regression Analysis: viscosity versus viscosity_1, viscosity_2


The regression equation is
viscosity = 20.1 + 0.707 viscosity_1 - 0.406 viscosity_2
98 cases used 2 cases contain missing values

Predictor Coef SE Coef T P


Constant 20.081 2.613 7.68 0.000
Viscos_t-1 0.70672 0.09112 7.76 0.000
Viscos_t-2 -0.40594 0.09119 -4.45 0.000

S = 3.223 R-Sq = 38.9% R-Sq(adj) = 37.6%

Autocorrelation Function for RESI1
[SACF bar plot of the residuals RESI1, lags 1 to 25]

Lag Corr T LBQ Lag Corr T LBQ Lag Corr T LBQ Lag Corr T LBQ

1 -0.02 -0.25 0.06 8 -0.12 -1.09 9.89 15 -0.05 -0.45 15.67 22 0.00 0.04 18.58
2 -0.02 -0.22 0.11 9 -0.01 -0.11 9.90 16 -0.05 -0.44 15.97 23 -0.06 -0.49 19.00
3 0.11 1.08 1.33 10 0.01 0.10 9.92 17 -0.03 -0.26 16.07 24 -0.13 -1.09 21.13
4 -0.09 -0.91 2.25 11 -0.14 -1.25 12.03 18 -0.04 -0.35 16.27 25 -0.08 -0.71 22.08
5 -0.13 -1.29 4.12 12 0.04 0.34 12.20 19 0.11 0.96 17.76
6 0.17 1.68 7.43 13 0.03 0.24 12.28 20 0.08 0.69 18.56
7 0.09 0.86 8.36 14 -0.16 -1.46 15.36 21 -0.01 -0.09 18.58

Normal Probability Plot of RESI1
Average: -0.0396777   StDev: 3.24524
Anderson-Darling Normality Test: A-Squared = 0.966, N = 100, P-Value = 0.014
[SCC and FVC charts]

Example

100 data points: weekly demand of plastic containers from plastic injection moulding (drug production sector)
(source: Montgomery D.C., Johnson, L.A., Gardiner, J.S., Forecasting and Time Series Analysis, McGraw-Hill)

[Time series plot of the weekly demand, values roughly between 4500 and 7500]

Autocorrelation Function for weekly demand

[SACF bar plot, lags 1 to 25]

It looks like a non-stationary process

Lag Corr T LBQ Lag Corr T LBQ Lag Corr T LBQ Lag Corr T LBQ

1 0.96 9.61 95.13 8 0.48 1.58 466.81 15 0.08 0.23 520.48 22 -0.13 -0.38 524.39
2 0.90 5.32 179.07 9 0.41 1.32 486.06 16 0.05 0.14 520.75 23 -0.16 -0.47 527.61
3 0.83 3.93 251.61 10 0.34 1.08 499.54 17 0.02 0.05 520.78 24 -0.17 -0.53 531.68
4 0.76 3.14 312.62 11 0.28 0.87 508.60 18 -0.01 -0.02 520.79 25 -0.18 -0.54 535.97
5 0.69 2.61 363.79 12 0.23 0.70 514.59 19 -0.03 -0.08 520.87
6 0.63 2.23 406.78 13 0.17 0.53 518.10 20 -0.05 -0.17 521.26
7 0.56 1.89 440.93 14 0.12 0.36 519.78 21 -0.09 -0.28 522.34
Let's apply the $\nabla$ operator:

[Time series plot of the differenced demand, values roughly between -500 and 500]

After differencing: MA(1)

Thus, model IMA(1,1):

$A(B)\tilde X_t = A_{p'}(B)(1 - B)^d\tilde X_t = C_q(B)\varepsilon_t$
ARIMA Model: weekly demand
Final Estimates of Parameters
Type Coef SE Coef T P
MA 1 -0.7331 0.0688 -10.66 0.000

Differencing: 1 regular difference


Number of observations: Original series 100, after differencing 99
Residuals: SS = 2405478 (backforecasts excluded)
MS = 24546 DF = 98

Modified Box-Pierce (Ljung-Box) Chi-Square statistic


Lag 12 24 36 48
Chi-Square 21.6 41.1 67.8 89.7
DF 11 23 35 47
P-Value 0.028 0.012 0.001 0.000
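A minimal sketch of an equivalent fit (assuming Python with statsmodels). The original weekly-demand series is not distributed with these slides, so a stand-in IMA(1,1) series is simulated, for illustration only. With the slides' sign convention, Minitab's MA 1 = -0.7331 means a weight of +0.7331 on $\varepsilon_{t-1}$; statsmodels uses the plus convention, so ma.L1 should come out near +0.73:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Stand-in data: IMA(1,1) with weight +0.7331 on eps_{t-1}; residual MS ~ 24546;
# the starting level 5000 is purely illustrative
np.random.seed(11)
eps = np.random.normal(scale=np.sqrt(24546), size=100)
x = 5000 + np.cumsum(eps[1:] + 0.7331 * eps[:-1])        # (1-B)X_t = eps_t + 0.7331*eps_{t-1}

res = ARIMA(x, order=(0, 1, 1)).fit()                    # IMA(1,1) = ARIMA(0,1,1)
print(res.params)                                        # [ma.L1 ~ +0.73, sigma2]
# LBQ check on the residuals, skipping the first one affected by start-up
print(acorr_ljungbox(res.resid[1:], lags=[12, 24, 36, 48], model_df=1))
```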
[Time series plot of the IMA(1,1) residuals RESI3, values roughly between -400 and 400]

Normal Probability Plot of RESI3
Average: 2.31596   StDev: 156.653
Anderson-Darling Normality Test: A-Squared = 0.383, N = 99, P-Value = 0.390

[SCC and FVC charts]

