Welcome back to practical time series analysis.

We've studied moving average models and we've studied autoregressive models.

We know how to estimate coefficients and draw inferences,

for instance with the ACF and the PACF;

we've done a bit with both of them.

At this point we bring them together into a mixed model, the so-called ARMA model.

We'll see how to describe physical processes with these models,

how to do simulations,

and also for theoretical reasons and

analytical reasons how to take a mixed model and pull it

back to either an AR model or a MA model.

So, after this lecture,

you'll be able to build more useful models of your time series.

We'll learn how to do simulations, and we'll do some of

these technical swaps between the mixed, the AR, and the MA models.

As I said, it's a bit more mathy in this lecture,

a little bit more theoretical,

but if you hang in there,

you'll understand how to do these things in some simple cases.

Here's our basic formula.

If you want to bring together a moving average and an autoregressive part,

you can think of it as your time series,

the element at the T-th position is going to be built on some noise, as usual,

plus an autoregressive part where we bring in terms of

the time series and take a weighted average of

previous terms plus the moving average part,

where our time series is subject to noise at each one of its stages and we will build

our new value out here as a weighted average of noise at the previous locations.
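In symbols, a sketch of the model just described (using the standard convention: Z for the noise, phi's for the autoregressive weights, theta's for the moving average weights):

```latex
X_t = Z_t
    + \underbrace{\phi_1 X_{t-1} + \cdots + \phi_p X_{t-p}}_{\text{autoregressive part}}
    + \underbrace{\theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q}}_{\text{moving average part}}
```

This is what is usually called an ARMA(p, q) process.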

We found that using the backshift operator,

and in fact building polynomials from it,

was a very useful thing to do.

Not only does this help us to express our equations rather succinctly,

rather compactly but also we develop an algebra with these operators

that allow us to come up with some rather simple theoretical results.

So, for instance, if we look at the polynomial operator here,

the polynomial in the backshift operator acting on X of T,

then we will recover X of T because of the one.

We will subtract off a constant times

the backshift operator right here, bringing us back to X of T minus one.

This would bring us back to X of T minus two,

all the way back through X of T minus P and

analogously with the backshift polynomial on the moving average side.
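Written out, a sketch of this notation (with B the backshift operator, so B X_t = X_{t-1}):

```latex
\phi(B)\,X_t = \bigl(1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p\bigr)\,X_t
             = X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p}

\theta(B)\,Z_t = \bigl(1 + \theta_1 B + \cdots + \theta_q B^q\bigr)\,Z_t
              = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q}
```

The minus signs in phi(B) come from moving the autoregressive terms over to the left-hand side.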

When we present the ARMA, the mixed process,

we'll take this as an operator polynomial acting on a noise term, set equal

to another operator polynomial acting on our values.
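In that notation the whole mixed model collapses to a single line, with the moving average polynomial on the noise and the autoregressive polynomial on the series values:

```latex
\theta(B)\,Z_t = \phi(B)\,X_t
```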

So, a quick goal here will be,

if you have a mixed process,

can you bring it back to an infinite order moving average process?

Because we have results of moving average processes, for instance,

we can express the auto correlation function of a moving average process.

The operator notation makes this really quite simple.

Treat these as polynomials and so we have the

polynomial acting on noise equal to the polynomial acting on series values.

If you would like to obtain X of T as a moving average,

then you can divide both sides by phi of B,

but then the interesting question is,

how in the world do you handle a term like that?

How do you take a ratio of these two polynomials?
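One standard way to handle that ratio, sketched here for the simplest first-order case (this assumes |phi| < 1, with theta(B) = 1 + theta B), is to expand 1/phi(B) as a geometric series in B:

```latex
X_t = \frac{\theta(B)}{\phi(B)}\,Z_t,
\qquad
\frac{1}{1-\phi B} = 1 + \phi B + \phi^2 B^2 + \cdots

X_t = (1+\theta B)\bigl(1 + \phi B + \phi^2 B^2 + \cdots\bigr)\,Z_t
    = Z_t + (\phi+\theta)\sum_{k=1}^{\infty} \phi^{\,k-1}\, Z_{t-k}
```

which is exactly an infinite order moving average representation.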

Same thing would be true if you have a mixed process and you want to

express it as an infinite order autoregressive process.

The method is going to again be to work with these polynomials.

There is a lot going on here,

but the notation really makes it quite simple and elegant to work with.

We'll say that noise at time T looks like this operator acting on X of T,

and so now we have a, perhaps infinite order, autoregressive process.

Again, how are we going to deal with that ratio of polynomials?
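The same geometric series trick handles this ratio too, sketched again for the first-order case; here the condition |theta| < 1 (invertibility) is what makes the expansion valid:

```latex
Z_t = \frac{\phi(B)}{\theta(B)}\,X_t,
\qquad
\frac{1}{1+\theta B} = 1 - \theta B + \theta^2 B^2 - \cdots

Z_t = (1-\phi B)\bigl(1 - \theta B + \theta^2 B^2 - \cdots\bigr)\,X_t
    = X_t - (\phi+\theta)\sum_{k=1}^{\infty} (-\theta)^{\,k-1}\, X_{t-k}
```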

I think the best way to get at this is with an example.

I have X of T over here on the left, looking like 0.7 times the state of the series one

time ago, plus noise at the current time, plus 0.2 times noise one period ago.

So I've got a moving average part and an autoregressive part.

I'd like to simulate that,

so I'll set a seed and I'll use the routine arima.sim.

This is going to allow us to get both the autoregressive and the moving average parts.

I need to tell it the order of the process and I also need to feed in these coefficients.

So I'll pop in 0.7 and 0.2 here for the AR and MA parts.

I'm taking quite a long series so that when we do our estimation they'll be spot on.
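The lecture's simulation uses R's arima.sim; as a rough cross-check, here is a minimal sketch in Python (the helper names simulate_arma11 and acf are mine, not part of the lecture) that simulates the same ARMA(1,1) model by direct recursion and compares the sample lag-one autocorrelation with its theoretical value:

```python
import numpy as np

def simulate_arma11(phi, theta, n, burn=500, seed=2017):
    """Simulate X_t = phi*X_{t-1} + Z_t + theta*Z_{t-1} with N(0,1) noise."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = phi * x[t - 1] + z[t] + theta * z[t - 1]
    return x[burn:]  # drop the burn-in so the series is effectively stationary

def acf(x, lag):
    """Sample autocorrelation at a given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

x = simulate_arma11(0.7, 0.2, 10_000)

# Theoretical lag-1 ACF of an ARMA(1,1):
# rho(1) = (1 + phi*theta)*(phi + theta) / (1 + 2*phi*theta + theta**2)
rho1 = (1 + 0.7 * 0.2) * (0.7 + 0.2) / (1 + 2 * 0.7 * 0.2 + 0.2 ** 2)
print(round(rho1, 3), round(acf(x, 1), 3))
```

With a series this long, the sample autocorrelation lands close to the theoretical value near 0.78, which is the "spot on" behavior mentioned above.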

So a little bit of code,

we'll do the usual sort of plotting:

we'll plot the time series, or at least the first 400 or so terms of the series,

we'll look at the ACF and we'll look at the PACF.

When we do this, you can see that there is

structure there; this is certainly not just noise.

The autocorrelation function trails off here, consistent with, well,

either a trend or an autoregressive process.

The PACF seems to have three significant pieces right here before it drops down to noise.

Let's work with these polynomials.

If you bring the X of T terms over to the left and the noise terms to the right,

we wind up with this simple polynomial expression

and here we're just defining our notation again.
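For this example, a sketch of what that rearrangement looks like: moving the 0.7 term to the left gives

```latex
X_t - 0.7\,X_{t-1} = Z_t + 0.2\,Z_{t-1}
\quad\Longleftrightarrow\quad
(1 - 0.7B)\,X_t = (1 + 0.2B)\,Z_t
```

so here phi(B) = 1 - 0.7B and theta(B) = 1 + 0.2B.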