0:00

In this lecture, we will talk about invertibility of stochastic processes.

This is an introduction to invertibility.

The objective is to learn about invertibility of a stochastic process.

So let's look at these two MA(1), Moving Average of order one, models.

Model 1 is Xt = Zt + 2Zt-1.

Model 2, instead of 2, will have one half: Xt = Zt + (1/2)Zt-1.

So what we want to do first is work out the autocovariance function of Model 1.

0:33

What is the definition of the autocovariance function?

Gamma(k) is basically the covariance of Xt+k and Xt, right?

So gamma(k) depends on the difference between the time steps, not on Xt or Xt+k themselves, but on k, the number of lags between those two random variables.

Now we put the definition of Xt+k for Model 1 into it, which is Zt+k + 2Zt+k-1, and likewise Xt, which is Zt + 2Zt-1, and then we expand the covariance.

The expansion gives the covariance of Zt+k with Zt, of Zt+k with 2Zt-1, of 2Zt+k-1 with Zt, and of 2Zt+k-1 with 2Zt-1.

Now realize that if k is greater than 1, then t + k - 1 is greater than t; in other words, all of these Z terms are at different times, so they are actually uncorrelated.

The covariance between them is 0.

In other words, if k is greater than 1, gamma(k) is 0.

This is the theoretical autocovariance function.

Now if k is exactly 0, that means we are looking at gamma(0), right?

And gamma(0) is basically the variance of this expression.

We can see that the covariance of Zt with Zt contributes, while the covariance of Zt with 2Zt-1 is going to be 0 because they are uncorrelated.

Likewise 2Zt-1 with Zt is 0, but the 2Zt-1 term with itself gives another variance contribution.

The first covariance gives one sigma squared, and the 2Zt-1 term gives another 4 sigma squared, so altogether we get 5 sigma squared.

Here sigma is the standard deviation of the disturbance.

For k = 1, we write the definition of Xt+1, which is Zt+1 + 2Zt, and expand it; the only correlated terms are 2Zt and Zt, and that gives us 2 sigma squared.

If k is negative, of course, gamma(k), being a covariance, is an even function of k, so gamma(k) is the same as gamma(-k). So now I can write the autocovariance function.

The autocovariance function looks like this: starting from lag 2, in other words for lags greater than 1, we have no autocovariance.

At k = 1, in other words, the covariance at lag one is 2 sigma squared.

And the covariance of the process with itself, at lag 0, is 5 sigma squared.

And for negative lags we just have the symmetric values: this is gamma(-k) = gamma(k).

And if I want to find the autocorrelation function, in other words the ACF, all I have to do is divide gamma(k) by gamma(0); that's the definition of the autocorrelation function.

So I divide everything by 5 sigma squared: beyond lag 1 the ACF becomes 0, at lag 1 it becomes 2/5, at lag 0 it is 1, and the values at -k follow by symmetry.

So this is our auto correlation function.
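As a quick numerical check (a sketch, not part of the lecture; the function name is mine), the theoretical ACF of an MA(1) can be coded directly from gamma(k):

```python
# Theoretical ACF of an MA(1) process X_t = Z_t + beta * Z_{t-1}:
#   gamma(0) = (1 + beta^2) * sigma^2,  gamma(1) = beta * sigma^2,
#   gamma(k) = 0 for |k| > 1, so rho(k) = gamma(k) / gamma(0).
def ma1_acf(beta, k):
    """Autocorrelation rho(k) of an MA(1) process with coefficient beta."""
    k = abs(k)  # the ACF is an even function of the lag
    if k == 0:
        return 1.0
    if k == 1:
        return beta / (1 + beta ** 2)
    return 0.0

# Model 1 (beta = 2): rho(0) = 1, rho(1) = 2/5, zero beyond lag 1.
print([ma1_acf(2, k) for k in range(-2, 3)])  # [0.0, 0.4, 1.0, 0.4, 0.0]
```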

So if I graph this, we have autocorrelation 1 at lag 0,

3:43

since Xt has correlation 1 with itself, of course.

And then we have some autocorrelation at lag 1, and then it cuts off.

Right? This is basically the famous picture of the ACF of an MA model.

If you have an MA(1) model, then the ACF has to cut off after lag 1.

Okay, now if I look at the autocorrelation of Model 2, I'm going to skip a few steps here.

If we concentrate on rho(1), the autocorrelation at lag one, this is gamma(1) over gamma(0); instead of 2s we now have halves.

If we calculate this, it again becomes 2 over 5.

So if we write the ACF of Model 2, we get exactly the same ACF as Model 1.

So we now have two different models, Model 1 and Model 2; they differ only in this coefficient, 2 versus one half, but both of them have an ACF which is exactly the same.
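This coincidence is not an accident: for the MA(1), the coefficients beta and 1/beta always give the same lag-1 autocorrelation. A quick check (a sketch, not part of the lecture; the function name is mine):

```python
def ma1_rho1(beta):
    """Lag-1 autocorrelation of X_t = Z_t + beta * Z_{t-1}."""
    return beta / (1 + beta ** 2)

# Model 1 (beta = 2) and Model 2 (beta = 1/2) give the same rho(1):
print(ma1_rho1(2.0), ma1_rho1(0.5))  # 0.4 0.4

# In general, replacing beta by 1/beta leaves rho(1) unchanged:
beta = 3.7
print(abs(ma1_rho1(beta) - ma1_rho1(1 / beta)) < 1e-12)  # True
```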

Now, if you are trying to model a time series, and if you believe that the ACF cuts off at some point, then you try to model it with MA(q) models, right? MA(1), MA(2), and so on.

But if two or more MA(q) processes have the same ACF, that would be problematic for modeling.

So we would like to somehow eliminate one of these models in some way.

So the ACFs are the same; Model 1 and Model 2 have exactly the same ACF.

Okay.

So let's pause that thought a little bit.

So we're going to come back and eliminate one of those models.

But how are we going to eliminate one?

So let's talk about this invertibility, inverting a process.

5:22

So what are we going to do? We are going to take a general MA(1) process now: Xt = Zt + beta Zt-1, where in our models beta was either one half or two, okay?

What we're going to do is solve for Zt from here, so Zt is basically Xt - beta Zt-1.

All right, but then for Zt-1 we will use the same relation, this recursive relation.

5:50

Put t-1 instead of t: that gives Zt-1 = Xt-1 - beta Zt-2, and we replace Zt-1 with that expression.

And expand it.

We get Zt = Xt - beta Xt-1 + beta squared Zt-2.

In other words, Zt is now expressed in terms of Xt and Xt-1, plus some remainder term in Zt-2.

And then we can continue: instead of Zt-2, we can again substitute Xt-2 - beta Zt-3, and so forth.

We can continue with this, in theory, to infinity.

What do we obtain?

We obtain Zt = Xt - beta Xt-1 + beta squared Xt-2 - beta cubed Xt-3 ...

And it goes on and on and on.
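We can sanity-check this expansion numerically: simulate an MA(1) with |beta| < 1, then recover Zt from the truncated series (a sketch, not part of the lecture; the function names are mine):

```python
import random

def simulate_ma1(beta, n, seed=0):
    """Simulate X_t = Z_t + beta * Z_{t-1} with standard normal noise."""
    rng = random.Random(seed)
    z = [rng.gauss(0, 1) for _ in range(n + 1)]
    x = [z[t] + beta * z[t - 1] for t in range(1, n + 1)]
    return x, z[1:]  # aligned so x[i] goes with z[1:][i]

beta = 0.5
x, z = simulate_ma1(beta, 500)
t = len(x) - 1

# Truncated inversion: Z_t ~ X_t - beta X_{t-1} + beta^2 X_{t-2} - ...
z_hat = sum((-beta) ** k * x[t - k] for k in range(50))
print(abs(z_hat - z[t]))  # tiny: the series converges fast since |beta| < 1
```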

6:43

Right? So what does this mean?

Now if you solve for Xt here, Xt = Zt plus a linear combination of past values; but this is a series, it's not a finite linear combination anymore.

It is an infinite series, and Xt depends on every past value of X, right?

So you can think of AR processes, autoregressive processes; but here the order, well, the order is infinity: this is an AR(infinity) process.

Process.

So what did we do?

Well, we inverted the MA(1) process into an AR process of order infinity.

Here's one thing you have to keep in mind: there is an infinite series here.

And whenever we see an infinite series, we have to keep in mind that it might be convergent or divergent; you have to make sure that it is actually convergent in some sense.

Okay, now I'm going to repeat the same thing using the backward shift operator; let's utilize the backward shift operator.

For the MA(1) process, Xt can be written as Beta(B)Zt. And what is my Beta(B)? Beta(B) is basically 1 + Beta B.

That's our polynomial operator acting on Zt, and that gives us the MA(1) process.

8:02

Now, I would like to somehow write Zt in terms of Xt.

So what am I going to do? I'm going to find the inverse operator. If Beta(B) is our polynomial operator, and somehow I can find its inverse operator and apply it to the left-hand side, then Zt is expressed as some inverse operator acting on Xt.

In other words, Zt will be expressed in terms of the Xt's.

8:30

Okay.

So what is Beta(B) inverse?

Well, Beta(B) is 1 plus Beta B, and we can write its inverse as 1 over (1 plus Beta B).

This is not a fraction; this is basically the inverse operator of 1 plus Beta B.

And what we're going to do is the following: we're going to act as if this Beta B is a complex number for a minute.

Assume that it's not actually an operator; it's a complex number, think of z, and then expand this.

This is the usual geometric expansion, the geometric series: 1 over (1 - r) is 1 + r + r squared + r cubed and so on, where here r is negative Beta B.

But then, since r is negative Beta B, the signs will alternate: 1 - Beta B + Beta squared B squared - Beta cubed B cubed and so on.

So this is our inverse operator, and if we apply this inverse operator to Xt, then we will obtain, well, this first term is not really 1; there is a typo here.

This operator is acting on Xt, so 1 acting on Xt gives Xt, then -Beta B acting on Xt gives -Beta Xt-1, and so forth.
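The geometric expansion of the inverse operator can be verified by multiplying the polynomial 1 + beta B by the truncated inverse series, treating operator composition as ordinary polynomial multiplication in B (a sketch, not part of the lecture):

```python
beta = 0.5
K = 20
inv = [(-beta) ** k for k in range(K)]  # 1 - beta B + beta^2 B^2 - ...
poly = [1.0, beta]                      # coefficients of 1 + beta B

# Multiply the two polynomials in B, keeping the first K coefficients:
prod = [0.0] * K
for i, a in enumerate(poly):
    for j, b in enumerate(inv):
        if i + j < K:
            prod[i + j] += a * b

print(prod[0])                        # 1.0: the product is the identity operator
print(max(abs(c) for c in prod[1:]))  # 0.0: all higher coefficients cancel
```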

Either way, we invert the MA(1) process into an AR(infinity), and we express Zt as an infinite sum of the Xt's with some weights in front of them, right?

So how do we make sure that this series is actually a convergent series?

10:03

Well, it turns out that that series is convergent in some sense. These are random variables we are adding up, so we have to talk about convergence of random variables.

This is not a usual sequence or series; there are random variables involved here.

There's this notion called mean-square convergence, which is the appropriate mode of convergence here; we won't go into the details of what mean-square convergence is, and we won't prove this result.

For now, let's use this as a black box: this series is convergent, in the mean-square sense, if and only if the absolute value of beta is less than 1.

Okay, let's give the definition of invertibility.

In general, take Xt as a stochastic process; it doesn't have to be MA(1).

Xt is a stochastic process, and think of Zt as the random disturbance, the white noise.

We say Xt is invertible if Zt can be expressed as an infinite series.

11:11

Zt = sum of pi_k Xt-k, where k goes from zero to infinity.

This is like an AR(infinity) process, where the pi_k's are such that the sum of the absolute values of the pi_k's is convergent; in other words, the series of pi_k's is absolutely convergent.

This is the definition of invertibility of a stochastic process.

Now let's go back to our Model 1 and Model 2.

Remember, those are the two models we have; they have the same ACF, and we want to eliminate one of them.

If you look at Model 1, since beta is 2, the sum of the absolute values of the pi_k's is going to be the sum of 2 to the k, and this series really diverges.

But if you look at Model 2, the pi_k's in absolute value are 1 over 2 to the k.

If you look at the sum of 1 over 2 to the k, we talked about this: it is a geometric series, and it is a convergent series.
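The contrast between the two series can be seen from their partial sums (a sketch, not part of the lecture; the function name is mine):

```python
def partial_sum_abs_pi(beta, n):
    """Partial sum of sum_k |pi_k| = sum_k |beta|**k, for k = 0..n-1."""
    return sum(abs(beta) ** k for k in range(n))

# Model 1 (beta = 2): partial sums blow up, so the series diverges.
print(partial_sum_abs_pi(2.0, 30))  # 1073741823.0 (= 2**30 - 1)

# Model 2 (beta = 1/2): partial sums approach 1 / (1 - 1/2) = 2; convergent.
print(partial_sum_abs_pi(0.5, 30))
```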

So what are we going to do?

We're going to eliminate Model 1 and go with Model 2.

From now on, we will assume that we would like our models to be invertible.