2. STATIONARY PROCESSES
We shall describe the fundamental concepts in the theory of time series models and introduce the ideas of stochastic processes, mean and covariance functions, stationary processes, and autocorrelation functions.

2.1 Stochastic Processes
The sequence of random variables $\{Y_t : t = 0, \pm 1, \pm 2, \ldots\}$ is a stochastic process, and the observed series $y_1, y_2, \ldots, y_n$ is regarded as a realisation of the underlying process, where the $Y$'s will have some joint distribution.

Information in these joint distributions can be described in terms of means, variances, and covariances. We focus on the first and second moments.

2.2 Means, Variances, and Covariances
(i) For the process $\{Y_t : t = 0, \pm 1, \pm 2, \ldots\}$, the mean function is defined by

$\mu_t = E(Y_t)$ for $t = 0, \pm 1, \pm 2, \ldots$  (2.2.1)

$\mu_t$ is the expected value of the process at time $t$. In general, $\mu_t$ can be different at each time point $t$.

(ii) The autocovariance function, $\gamma_{t,s}$, is defined as

$\gamma_{t,s} = Cov(Y_t, Y_s)$ for $t, s = 0, \pm 1, \pm 2, \ldots$  (2.2.2)

where $Cov(Y_t, Y_s) = E\big[(Y_t - \mu_t)(Y_s - \mu_s)\big] = E(Y_t Y_s) - \mu_t \mu_s$.

(iii) The autocorrelation function, $\rho_{t,s}$, is given by

$\rho_{t,s} = Corr(Y_t, Y_s)$  (2.2.3)

where

$Corr(Y_t, Y_s) = \dfrac{Cov(Y_t, Y_s)}{\sqrt{Var(Y_t)\,Var(Y_s)}} = \dfrac{\gamma_{t,s}}{\sqrt{\gamma_{t,t}\,\gamma_{s,s}}}$  (2.2.4)

(iv) Recall that both covariance and correlation are measures of the linear
dependence between random variables but that the unitless correlation is
somewhat easier to interpret.

(v) The following properties follow from known results and our definitions:

$\gamma_{t,t} = Var(Y_t)$,  $\rho_{t,t} = 1$
$\gamma_{t,s} = \gamma_{s,t}$,  $\rho_{t,s} = \rho_{s,t}$
$|\gamma_{t,s}| \le \sqrt{\gamma_{t,t}\,\gamma_{s,s}}$,  $|\rho_{t,s}| \le 1$  (2.2.5)

(vi) If $c_1, c_2, \ldots, c_m$ and $d_1, d_2, \ldots, d_n$ are constants, and $t_1, t_2, \ldots, t_m$ and $s_1, s_2, \ldots, s_n$ are time points, then

$Cov\left(\sum_{i=1}^{m} c_i Y_{t_i},\ \sum_{j=1}^{n} d_j Y_{s_j}\right) = \sum_{i=1}^{m} \sum_{j=1}^{n} c_i d_j \, Cov\left(Y_{t_i}, Y_{s_j}\right)$  (2.2.6)


(vii) $Var\left(\sum_{i=1}^{m} c_i Y_{t_i}\right) = \sum_{i=1}^{m} c_i^2 \, Var(Y_{t_i}) + 2 \sum_{i=2}^{m} \sum_{j=1}^{i-1} c_i c_j \, Cov\left(Y_{t_i}, Y_{t_j}\right)$  (2.2.7)
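The linearity results (2.2.6) and (2.2.7) lend themselves to a quick numerical check. Below is a minimal Monte Carlo sketch, assuming NumPy is available; the correlated pair used here is a hypothetical construction chosen only for illustration, not something defined in the notes.

```python
# Monte Carlo check of eqn (2.2.7) with m = 2 (illustrative pair, not from the text).
import numpy as np

rng = np.random.default_rng(0)
n_rep = 200_000
c1, c2 = 2.0, -3.0

y1 = rng.normal(size=n_rep)             # plays the role of Y_{t1}
y2 = 0.5 * y1 + rng.normal(size=n_rep)  # Y_{t2}, correlated with Y_{t1}

lhs = np.var(c1 * y1 + c2 * y2)         # Var(c1*Y_{t1} + c2*Y_{t2})
rhs = (c1**2 * np.var(y1) + c2**2 * np.var(y2)
       + 2 * c1 * c2 * np.cov(y1, y2)[0, 1])  # right-hand side of (2.2.7)
print(lhs, rhs)                         # the two agree up to simulation error
```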












2.3 Stationarity
(a) A process $\{Y_t\}$ is said to be strongly stationary or strictly stationary if the joint distribution of $Y_{t_1}, Y_{t_2}, \ldots, Y_{t_n}$ is the same as the joint distribution of $Y_{t_1-k}, Y_{t_2-k}, \ldots, Y_{t_n-k}$ for all sets of time points $t_1, t_2, \ldots, t_n$ and all lags $k$.

(b) If a process is strictly stationary and has finite variance, then the covariance
function must depend only on the time lag

(c) The distribution of $Y_t$ is the same as that of $Y_{t-k}$ for all $t, k$; i.e. the $Y$'s are marginally identically distributed. Hence

$E(Y_t) = E(Y_{t-k})$ for all $t, k$: the mean function is constant for all time;
$Var(Y_t) = Var(Y_{t-k})$ for all $t, k$: the variance is constant over time.


(d) The bivariate distribution of $Y_t$ and $Y_s$ is the same as that of $Y_{t-k}$ and $Y_{s-k}$, so

$Cov(Y_t, Y_s) = Cov(Y_{t-k}, Y_{s-k})$ for all $t, s, k$

Putting $k = s$ and then $k = t$, we get:

$\gamma_{t,s} = Cov(Y_{t-s}, Y_0)$
$\quad\;\; = Cov(Y_0, Y_{s-t})$
$\quad\;\; = Cov(Y_0, Y_{|t-s|})$
$\quad\;\; = \gamma_{0,|t-s|}$

The covariance between $Y_t$ and $Y_s$ depends on time only through the time difference $|t - s|$ and not on the actual times $t$ and $s$.

(e) Thus, for a stationary process:

$\gamma_k = Cov(Y_t, Y_{t-k})$

$\rho_k = Corr(Y_t, Y_{t-k}) = \dfrac{\gamma_k}{\gamma_0}$

The general properties as stated in eqn (2.2.5) now become:

$\gamma_0 = Var(Y_t)$,  $\rho_0 = 1$
$\gamma_k = \gamma_{-k}$,  $\rho_k = \rho_{-k}$
$|\gamma_k| \le \gamma_0$,  $|\rho_k| \le 1$  (2.2.8)

A process $\{Y_t\}$ is weakly stationary or second-order stationary if
(i). $E(Y_t) = \mu$ for all $t$;
(ii). $Cov(Y_t, Y_{t-k}) = \gamma_k$ for all $t$ and $k$.

A strictly stationary process is weakly stationary.
The sequence $\{\gamma_k\}$, $k = 0, 1, 2, \ldots$, is called the autocovariance function.
The sequence $\{\rho_k\}$, $k = 0, 1, 2, \ldots$, is called the autocorrelation function (ACF).
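In practice, $\gamma_k$ and $\rho_k$ must be estimated from a single realisation. The following is a minimal sketch, assuming NumPy; the divide-by-$n$ moment estimator used here is the conventional choice but is not defined in these notes.

```python
# Sample autocovariance and autocorrelation functions (conventional estimator).
import numpy as np

def sample_acf(y, max_lag):
    y = np.asarray(y, dtype=float)
    n = len(y)
    y_c = y - y.mean()                  # centre the series
    gamma = np.array([np.sum(y_c[:n - k] * y_c[k:]) / n
                      for k in range(max_lag + 1)])
    return gamma, gamma / gamma[0]      # (gamma_hat_k, rho_hat_k)

rng = np.random.default_rng(1)
z = rng.normal(size=2000)               # white noise: rho_k should be near 0 for k >= 1
gamma_hat, rho_hat = sample_acf(z, 5)
print(rho_hat)
```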











Illustration 1: Random Walk
Let $e_1, e_2, \ldots$ be a sequence of independent, identically distributed random variables, each with mean zero and variance $\sigma_e^2$. The observed time series $\{Y_t : t = 1, 2, \ldots\}$ is constructed as follows:

$Y_1 = e_1$
$Y_2 = e_1 + e_2$
$\;\;\vdots$
$Y_t = e_1 + e_2 + \cdots + e_t$

Find: (a) the mean function, (b) the variance, (c) the autocovariance function, (d) the autocorrelation function.
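Before working through (a)-(d), it can help to see the process empirically. A minimal simulation sketch, assuming NumPy, with $\sigma_e^2 = 1$ chosen for convenience:

```python
# Replicated random-walk paths: watch how the sample moments behave in t.
import numpy as np

rng = np.random.default_rng(2)
n_rep, n_time = 50_000, 100
e = rng.normal(size=(n_rep, n_time))   # iid e_t with mean 0, variance 1
y = np.cumsum(e, axis=1)               # Y_t = e_1 + e_2 + ... + e_t

print(y.mean(axis=0)[[9, 49, 99]])     # sample means at t = 10, 50, 100
print(y.var(axis=0)[[9, 49, 99]])      # sample variances: note the growth in t
```

Comparing the printed variances with the answer to (b) is a useful check.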






Illustration 2: Moving Average
Suppose that $Y_t = \dfrac{e_t + e_{t-1}}{2}$, where the $e$'s are iid with zero mean and variance $\sigma_e^2$.
Find: (a) the mean function, (b) the variance, (c) the autocovariance function, (d) the autocorrelation function.
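Again, a minimal simulation sketch (assuming NumPy, with $\sigma_e^2 = 1$) whose output can be checked against the answers to (c) and (d):

```python
# Simulate Y_t = (e_t + e_{t-1}) / 2 and estimate its first few autocorrelations.
import numpy as np

rng = np.random.default_rng(3)
e = rng.normal(size=100_001)
y = (e[1:] + e[:-1]) / 2               # Y_t = (e_t + e_{t-1}) / 2

y_c = y - y.mean()
n = len(y)
rho = [np.sum(y_c[:n - k] * y_c[k:]) / np.sum(y_c**2) for k in range(4)]
print(rho)                             # compare with your answer to (d)
```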





























Example 2.3.1
Consider the series $Y_t = \phi Y_{t-1} + Z_t$, where $\{Z_t\}$ is a white noise process. This series is known as a Markov process or an autoregressive series of order 1. When $\phi = 1$, we have a random walk.
Find the autocovariance function and the autocorrelation function.
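A minimal simulation sketch, assuming NumPy and an illustrative value $\phi = 0.5$ (any $|\phi| < 1$ behaves similarly):

```python
# Simulate Y_t = phi * Y_{t-1} + Z_t and contrast |phi| < 1 with phi = 1.
import numpy as np

def simulate_ar1(phi, n, rng):
    z = rng.normal(size=n)
    y = np.empty(n)
    y[0] = z[0]
    for t in range(1, n):
        y[t] = phi * y[t - 1] + z[t]   # Y_t = phi * Y_{t-1} + Z_t
    return y

rng = np.random.default_rng(4)
print(simulate_ar1(0.5, 1000, rng).std())  # fluctuates about a stable level
print(simulate_ar1(1.0, 1000, rng).std())  # phi = 1: the random walk case
```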



Example 2.3.2
By considering first and second moments (i.e. means, variances, covariances), investigate whether each of the following processes is stationary (second-order). Here $\{Z_t\}$ is a white noise process. (A simulation sketch contrasting (i)-(iii) appears after the list.)

(i). $Y_t = Y_{t-1} + Z_t$ (what is this process usually known as?)

(ii). $Y_t = Y_{t-1} + \alpha + Z_t$, $\alpha \ne 0$ (what kind of process is this?)

(iii). $Y_t = \phi Y_{t-1} + Z_t$, $|\phi| < 1$ (what kind of process is this?)
(iv). $Y_t = Z_t Y_{t-1} + Z_{t-2}$, where $\sigma_z^2 = 1$
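The promised sketch contrasting (i)-(iii), assuming NumPy; $\alpha = 0.5$ and $\phi = 0.8$ are hypothetical values chosen only to make the behaviour visible:

```python
# Replicated paths of processes (i)-(iii): compare sample moments across time.
import numpy as np

rng = np.random.default_rng(5)
n_rep, n_time = 20_000, 200
z = rng.normal(size=(n_rep, n_time))

y1 = np.cumsum(z, axis=1)              # (i)   Y_t = Y_{t-1} + Z_t
y2 = np.cumsum(z + 0.5, axis=1)        # (ii)  drift alpha = 0.5
y3 = np.zeros_like(z)                  # (iii) phi = 0.8
for t in range(1, n_time):
    y3[:, t] = 0.8 * y3[:, t - 1] + z[:, t]

for name, y in [("(i)", y1), ("(ii)", y2), ("(iii)", y3)]:
    # mean at t = 50, then variance at t = 50 and at t = 200
    print(name, y[:, 49].mean(), y[:, 49].var(), y[:, -1].var())
```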




Example 2.3.3
Let $\{e_t\}$ be a zero-mean white noise process. Suppose that the observed process is given by $Y_t = e_t + \theta e_{t-1}$, where $\theta$ is either 3 or $\tfrac{1}{3}$.
Find the autocorrelation function (acf) for $Y_t$ when $\theta = 3$ and when $\theta = \tfrac{1}{3}$.
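A minimal numerical sketch, assuming NumPy, that previews why this exercise is interesting:

```python
# Lag-1 sample autocorrelation of Y_t = e_t + theta * e_{t-1}, theta = 3 and 1/3.
import numpy as np

def lag1_acf(theta, n=200_000, seed=6):
    e = np.random.default_rng(seed).normal(size=n + 1)
    y = e[1:] + theta * e[:-1]         # Y_t = e_t + theta * e_{t-1}
    y_c = y - y.mean()
    return np.sum(y_c[:-1] * y_c[1:]) / np.sum(y_c**2)

print(lag1_acf(3.0), lag1_acf(1/3))    # the two values are essentially equal
```

The printed values agreeing for both choices of $\theta$ is worth reconciling with your algebraic answer.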


Example 2.3.4
Suppose $Y_t = 5 + 2t + Z_t$, where $\{Z_t\}$ is a zero-mean stationary series with autocovariance function $\gamma_k$.
(a) Find the mean function for $\{Y_t\}$.
(b) Find the autocovariance function for $\{Y_t\}$.
(c) Is $\{Y_t\}$ stationary? Why or why not?
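A minimal simulation sketch, assuming NumPy and, purely for illustration, taking $\{Z_t\}$ to be Gaussian white noise (any zero-mean stationary choice would do):

```python
# Simulate Y_t = 5 + 2t + Z_t and inspect the sample mean and variance over t.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(1, 101)
z = rng.normal(size=(50_000, 100))     # an illustrative stationary Z_t
y = 5 + 2 * t + z                      # broadcasting adds the trend to each path

print(y.mean(axis=0)[[0, 49, 99]])     # tracks 5 + 2t: compare with part (a)
print(y.var(axis=0)[[0, 49, 99]])      # inherited from Z_t: compare with (b), (c)
```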



Example 2.3.5
Suppose that $\{Y_t\}$ is stationary with autocovariance function $\gamma_k$.
(i) Show that $W_t = \nabla Y_t = Y_t - Y_{t-1}$ is stationary by finding the mean and autocovariance function for $W_t$.
(ii) Show that $U_t = \nabla W_t$ is stationary.
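A minimal sketch for checking (i) and (ii) empirically, assuming NumPy and, purely for illustration, a stationary AR(1) standing in for the hypothetical $\{Y_t\}$:

```python
# Difference a stationary series and verify the moments of W_t and U_t look stable.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal()   # an illustrative stationary {Y_t}

w = np.diff(y)                         # W_t = Y_t - Y_{t-1}
u = np.diff(w)                         # U_t = W_t - W_{t-1}
print(w.mean(), w[:n // 2].var(), w[n // 2:].var())  # stable across halves
print(u.mean(), u.var())
```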



2.4 General Linear Process
Let $\{Y_t\}$ denote the observed time series, and let $\{Z_t\}$ represent an unobserved white noise series, that is, a sequence of identically distributed, zero-mean, independent random variables. The assumption of independence could be replaced by the weaker assumption that the $\{Z_t\}$ are uncorrelated random variables, but we will not pursue that slight generality.

A general linear process, $\{Y_t\}$, is one that can be represented as a weighted linear combination of present and past white noise terms as

$Y_t = Z_t + \psi_1 Z_{t-1} + \psi_2 Z_{t-2} + \cdots$  (2.4.1)

Since this is an infinite series, certain conditions must be placed on the $\psi$-weights for it to be meaningful mathematically. We assume that

$\sum_{i=1}^{\infty} \psi_i^2 < \infty$  (2.4.2)

Since $\{Z_t\}$ is unobservable, we may assume, without loss of generality (wlog) of equation (2.4.2), that the coefficient on $Z_t$ is 1; hence $\psi_0 = 1$.

Now suppose the $\psi$'s form an exponentially decaying sequence:

$\psi_j = \phi^j$, where $-1 < \phi < 1$

So:

$Y_t = Z_t + \phi Z_{t-1} + \phi^2 Z_{t-2} + \cdots$

For example,

$E(Y_t) = E(Z_t + \phi Z_{t-1} + \phi^2 Z_{t-2} + \cdots) = 0$

Thus, $\{Y_t\}$ has a constant mean of zero.


Likewise,

$Cov(Y_t, Y_{t-k}) = \dfrac{\phi^k \sigma_e^2}{1 - \phi^2}$ and $Corr(Y_t, Y_{t-k}) = \phi^k$

The process is stationary: the autocovariance structure depends only on the time lag.

For a general linear process, $Y_t = Z_t + \psi_1 Z_{t-1} + \psi_2 Z_{t-2} + \cdots$, similar calculations yield the following results:

$E(Y_t) = 0$

$\gamma_k = Cov(Y_t, Y_{t-k}) = \sigma_e^2 \sum_{i=0}^{\infty} \psi_i \psi_{i+k}$ for $k \ge 0$, with $\psi_0 = 1$
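These results can be checked numerically for the exponential-weights special case above. A minimal sketch, assuming NumPy and truncating the infinite sum at a large lag (with $\sigma_e^2 = 1$):

```python
# Truncated general linear process with psi_j = phi^j: compare sample and
# theoretical autocovariances phi^k / (1 - phi^2).
import numpy as np

rng = np.random.default_rng(9)
phi, n, trunc = 0.6, 200_000, 60
z = rng.normal(size=n + trunc)
weights = phi ** np.arange(trunc)            # psi_j = phi^j, j = 0, ..., trunc-1
y = np.convolve(z, weights, mode="valid")    # Y_t = sum_j psi_j Z_{t-j} (truncated)

for k in range(3):
    cov = np.cov(y[k:], y[:len(y) - k])[0, 1]
    print(k, cov, phi**k / (1 - phi**2))     # sample vs theoretical gamma_k
```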

A process with a nonzero mean, $\mu$, may be obtained by adding $\mu$ to the RHS of equation (2.4.1). Since the mean does not affect the covariance properties of a process, we assume a zero mean until we begin fitting models to data.
