Introduction to Stationary Process

Yung-Sheng Lu

Mar 7, 2017

@NCKU-CSIE

Outline

  • Time Series

  • Stationarity

  • Stationary Process

  • References

Time Series

Definition

  • A time series is a variable X indexed by the time t: X_t

  • Examples:

    • The time t can be annual, monthly, daily...

      • Let X_t be the annual GDP.

      • Let Y_t be the monthly temperature.

      • Let Z_t be the daily stock prices.

Some Distinctions

  • The time series can:

    • Take discrete or continuous values.

    • Be measured at discrete or continuous time.

    • Be measured at regular or irregular intervals.

Description

  • To describe a time series, we will concentrate on:

    • Data Generating Process (DGP)

    • The joint distribution of its elements

    • Its "moments":

      • Expected value: \mathrm{E}[X_t] \equiv \mu _t

      • Variance: \mathrm{var}[X_t] \equiv \sigma ^ 2 \equiv \gamma _0

      • Autocovariance or autocorrelation of order k of X_t: \mathrm{cov}[X_t , X_{t-k}] \equiv \gamma _k

Description (cont.)

  • Definition of variance \gamma _0 (t):

    \mathrm{var}[X_t] \equiv \mathrm{E}[(X_t - \mathrm{E}[X_t]) ^ 2]

  • Definition of autocovariance \gamma _k (t):

    \mathrm{cov}[X_t , X_{t-k}] \equiv \mathrm{E}[(X_t - \mathrm{E}[X_t])(X_{t-k} - \mathrm{E}[X_{t-k}])]

  • Definition of autocorrelation of order k:

    \mathrm{corr}[X_t , X_{t-k}] \equiv \dfrac{\mathrm{cov}[X_t , X_{t-k}]}{\sqrt{\mathrm{var}[X_t]} \sqrt{\mathrm{var}[X_{t-k}]}}
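The moment definitions above can be turned into sample estimators directly. A minimal Python sketch (the toy series and the divide-by-n estimator convention are illustrative assumptions; some textbooks divide by n - k instead):

```python
def mean(xs):
    return sum(xs) / len(xs)

def autocovariance(xs, k):
    """Sample gamma_k: covariance between X_t and X_{t-k}."""
    mu = mean(xs)
    n = len(xs)
    return sum((xs[t] - mu) * (xs[t - k] - mu) for t in range(k, n)) / n

def autocorrelation(xs, k):
    """rho_k = gamma_k / gamma_0, so rho_0 is always 1."""
    return autocovariance(xs, k) / autocovariance(xs, 0)

series = [1.0, 2.0, 1.5, 3.0, 2.5, 3.5, 3.0, 4.0]  # a toy time series
rho0 = autocorrelation(series, 0)  # 1.0 by definition
rho1 = autocorrelation(series, 1)  # lag-1 autocorrelation
print(rho0, rho1)
```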

Stationarity

Stationary Series

  • The mean of the series should not be a function of time; it should be a constant.

  • The variance of the series should not be a function of time.

Stationary Series (cont.)

  • The covariance of the i-th term and the (i + m)-th term should not be a function of time.

Stationary Series (cont.)

  • Example
    Imagine a girl moving randomly on a giant chessboard. Her next position depends only on her last position.

Random Walk

  • Questions

    • How to predict the position with time?

    • How accurate will the prediction be?

Random Walk (cont.)

  • The randomness Er(t) enters at every point in time:

    X_t = X_{t-1} + Er(t)

  • Recursively substituting for all the X terms:

    X_n = X_0 + \sum \limits_{t = 1} ^n {Er(t)}

Random Walk (cont.)

  • Is the mean constant?

    \mathrm{E}[X_n] = \mathrm{E}[X_0] + \sum \limits_{t = 1} ^ n {\mathrm{E}[Er(t)]}

    • The expectation of each error term is 0, as it is pure randomness.

    \Rightarrow \mathrm{E}[X_n] = \mathrm{E}[X_0] \in \mathbf {R}

Random Walk (cont.)

  • Is the variance constant?

    \mathrm{var}[X_n] = \mathrm{var}[X_0] + \sum \limits_{t = 1} ^ n {\mathrm{var}[Er(t)]}

    \Rightarrow \mathrm{var}[X_n] = n \times \mathrm{var}[Er(t)]

    • The random walk is not a stationary process, as its variance changes with time.

    • The covariance is also dependent on time.
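The mean and variance derivations above can be checked empirically. A sketch assuming Er(t) ~ N(0, 1) (the Gaussian noise distribution is an illustrative assumption; any zero-mean i.i.d. noise behaves the same way):

```python
import random

random.seed(0)

def simulate_walk(n_steps):
    """One realization of X_t = X_{t-1} + Er(t), starting from X_0 = 0."""
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += random.gauss(0, 1)  # Er(t), i.i.d. with variance 1
        path.append(x)
    return path

# Many independent walks, so we can look at the distribution at a fixed t.
paths = [simulate_walk(100) for _ in range(5000)]

def moments_at(t):
    """Cross-sectional mean and variance of X_t over all realizations."""
    vals = [p[t] for p in paths]
    mu = sum(vals) / len(vals)
    var = sum((v - mu) ** 2 for v in vals) / len(vals)
    return mu, var

mean_10, var_10 = moments_at(10)     # mean ≈ 0, variance ≈ 10
mean_100, var_100 = moments_at(100)  # mean ≈ 0, variance ≈ 100
print(mean_10, var_10, mean_100, var_100)
```

The mean stays near E[X_0] = 0 while the variance grows in proportion to t, confirming that the random walk is not stationary.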


Stationary Process

  • Introduce a coefficient R:

    X_t = R \times X_{t-1} + Er(t)

  • Stationary series with R = 0.

Coefficient R

    X_t = R \times X_{t-1} + Er(t)

  • Stationary series with R = 0.5.

Coefficient R (cont.)

    X_t = R \times X_{t-1} + Er(t)

  • Stationary series with R = 0.9.

Coefficient R (cont.)

    X_t = R \times X_{t-1} + Er(t)

    \Rightarrow \mathrm{E}[X_t] = R \cdot \mathrm{E}[X_{t-1}]

  • With R = 1.0 the series becomes the random walk again, which is not stationary.
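A sketch of how the coefficient R controls stationarity in X_t = R × X_{t-1} + Er(t), again assuming Er(t) ~ N(0, 1) for illustration. For |R| < 1 the variance settles near 1 / (1 - R²); for R = 1 (the random walk) it keeps growing:

```python
import random

random.seed(1)

def simulate(r, n_steps):
    """One realization of X_t = r * X_{t-1} + Er(t), starting from 0."""
    x, path = 0.0, []
    for _ in range(n_steps):
        x = r * x + random.gauss(0, 1)
        path.append(x)
    return path

tail_var = {}
for r in (0.0, 0.5, 0.9, 1.0):
    tail = simulate(r, 2000)[1000:]  # discard burn-in
    # Second moment about 0 as a rough variance proxy (the mean is ~0).
    tail_var[r] = sum(v * v for v in tail) / len(tail)
    print(f"R = {r}: tail variance ~ {tail_var[r]:.2f}")
```

For R = 0 the series is just the noise itself (variance ≈ 1); the closer R gets to 1, the larger the stationary variance, and at R = 1 it diverges.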

Coefficient R (cont.)

  • A signal X is called a stationary process if, for all t_0, t, and k, the joint distribution of X_{t_0} and X_t depends only on the time difference between t_0 and t, and on no other parameter:

    p(X_{t_0 + k}, {t_0 + k}, X_{t + k}, {t + k}) = p(X_{t_0}, t_0, X_t, t)

Definition

  • A stationary random process must satisfy the following condition:

    p(X_{t + k}, {t + k}) = p(X_t, t)

    • The probability density function (PDF) is the same at every time point t.
    • It is a time-independent function.

Definition (cont.)

  • White noise

    • A random process whose power spectral density is constant.

    • A continuous-time random process X_t, where t is a real number, is white noise if and only if it satisfies the following conditions:

      \mu_x = \mathrm{E}[x(t)] = 0

      R_{xx}(t_0, t) = \mathrm{E}[x(t_0)x(t)] = \dfrac{N_0}{2} \delta(t_0 - t)

      \Rightarrow S_{xx} = \dfrac{N_0}{2}

Example
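A discrete-time sketch of the white-noise conditions above: i.i.d. Gaussian samples (an assumed distribution for illustration) have mean ≈ 0 and an autocorrelation that is exactly 1 at lag 0 and ≈ 0 at every nonzero lag, the discrete analogue of the delta function:

```python
import random

random.seed(2)
n = 20000
noise = [random.gauss(0, 1) for _ in range(n)]  # i.i.d. samples x(t)

noise_mean = sum(noise) / n  # should be ~ mu_x = 0

def autocorr(xs, k):
    """Normalized sample autocorrelation at lag k."""
    m = sum(xs) / len(xs)
    num = sum((xs[t] - m) * (xs[t - k] - m) for t in range(k, len(xs)))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

rho0 = autocorr(noise, 0)  # exactly 1.0 by construction
rho5 = autocorr(noise, 5)  # ~ 0: samples at different times are uncorrelated
print(noise_mean, rho0, rho5)
```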

References
