>> We are now going to discuss the multivariate normal distribution.

The multivariate normal distribution is a very important distribution in finance.

It crops up in many different applications including, for example, mean-variance

analysis and asset allocation, as well as geometric Brownian motion and the

Black-Scholes model. So we say an n-dimensional vector, X, is

multivariate normal with mean vector mu and covariance matrix Sigma if the PDF of

X is given to us by this quantity here. Okay, so the PDF is equal to 1 over the quantity (2 pi)

to the power of n over 2 times the determinant of the covariance matrix raised

to the power of a half, times the exponential of this quantity up here.

And we write that X is multivariate normal, MVN_n(mu, Sigma).

The little subscript n here denotes the dimensionality of the vector X.
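
For reference, the density just described (with the standard quadratic form in the exponent) can be written out as

```latex
f_X(x) \;=\; \frac{1}{(2\pi)^{n/2}\,|\Sigma|^{1/2}}
\exp\!\Big(-\tfrac{1}{2}\,(x-\mu)^\top \Sigma^{-1} (x-\mu)\Big),
\qquad X \sim \mathrm{MVN}_n(\mu,\Sigma).
```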

The standard multivariate normal has mean vector mu equal to 0 and variance-

covariance matrix equal to the n-by-n identity matrix.

And in this case, the X_i's are independent.

We can actually see that because, in this case, we can write the joint PDF of X as

being equal to the product, i equals 1 to n, of

1 over the square root of 2 pi times e to the minus a half x_i squared.

And that follows just from this line here, because mu equals zero, so this term

disappears, and Sigma is just the identity.

So, in fact, you just end up with the sum of the x_i squared divided by 2 inside the exponential.
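
Spelled out, the reduction just described is

```latex
f_X(x) \;=\; \frac{1}{(2\pi)^{n/2}}\,
\exp\!\Big(-\tfrac{1}{2}\sum_{i=1}^{n} x_i^2\Big)
\;=\; \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}}\, e^{-x_i^2/2},
```

which is exactly the product of n standard normal marginal PDFs.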

So, as we saw in an earlier module on multivariate distributions,

if the joint PDF factorizes into a product of marginal PDFs, then the random

variables are independent. Okay.

The moment generating function of X is given to us by this quantity here.

So phi subscript X of s is actually a function of s,

okay, this vector s. And it's the expected value of e to the s

transpose X. Okay, and this is equal to e to the s

transpose mu plus a half s transpose Sigma s.
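
Written out, the moment generating function being described is

```latex
\phi_X(s) \;=\; \mathbb{E}\big[e^{s^\top X}\big]
\;=\; \exp\!\Big(s^\top \mu + \tfrac{1}{2}\, s^\top \Sigma\, s\Big).
```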

Now, you're probably familiar with this in the 1-dimensional case, which we'll just recover

here. Suppose X is really just a scalar random

variable; then the moment generating function of X is equal to the expected

value of e to the sX, and it's equal to e to the s mu plus a half sigma squared s

squared. And this is the case where X is normal

with mean mu and variance sigma squared. So this is the moment generating function

of a scalar normal random variable.
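
In symbols, this one-dimensional case is

```latex
\phi_X(s) \;=\; \mathbb{E}\big[e^{sX}\big]
\;=\; \exp\!\Big(s\mu + \tfrac{1}{2}\,\sigma^2 s^2\Big),
\qquad X \sim \mathrm{N}(\mu,\sigma^2).
```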

And the earlier expression, e to the s transpose mu plus a half s transpose Sigma s, is its generalization to a multivariate normal random vector, X.

Okay. So, recall the partition notation we saw in an

earlier module. We can break X into two blocks of vectors,

X_1 and X_2, as such. We can extend this notation

naturally. So we can write mu equal to (mu_1, mu_2) and Sigma equal

to the block matrix with blocks Sigma_11, Sigma_12, Sigma_21, Sigma_22, and these are the mean vector and

covariance matrix of (X_1, X_2).
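
In symbols, the partition being described is

```latex
X = \begin{pmatrix} X_1 \\ X_2 \end{pmatrix}, \qquad
\mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad
\Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix},
```

where X_1 has the first k components of X and X_2 has the remaining n minus k components.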

So we have the following results on the marginal and conditional distributions of X. The marginal distribution of a

multivariate normal random vector is itself normal.

In particular, the marginal distribution of X_i is multivariate normal

with mean vector mu_i and variance-covariance matrix Sigma_ii.

So, for example, X_1 is multivariate normal; in fact it has k components, with mean vector mu_1 and covariance matrix

Sigma_11. And similarly, X_2 is multivariate normal with

mean vector mu_2 and covariance matrix Sigma_22, and this has n minus k components.
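
In symbols, these two marginals are

```latex
X_1 \sim \mathrm{MVN}_k(\mu_1, \Sigma_{11}),
\qquad
X_2 \sim \mathrm{MVN}_{n-k}(\mu_2, \Sigma_{22}).
```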

And we have here an example of the bivariate normal density function, where

the correlation between X_1 and X_2 is 80%. If we rotate the surface, you can see the

correlation of 80 percent: the large values of X_1 are associated with large values of X_2,

and the small values of X_1 are associated with small values of X_2.
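
As a rough sketch of how such a surface can be generated in Python (the zero means and unit variances here are illustrative assumptions; only the 80% correlation comes from the example being discussed):

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3d projection on older matplotlib)
from scipy.stats import multivariate_normal

# Illustrative parameters: zero means and unit variances are assumptions;
# the 0.8 correlation matches the example discussed above.
mu = np.array([0.0, 0.0])
rho = 0.8
Sigma = np.array([[1.0, rho],
                  [rho, 1.0]])

# Evaluate the bivariate normal PDF on a grid of (x1, x2) points.
x1 = np.linspace(-3, 3, 100)
x2 = np.linspace(-3, 3, 100)
X1, X2 = np.meshgrid(x1, x2)
grid = np.dstack((X1, X2))
Z = multivariate_normal(mean=mu, cov=Sigma).pdf(grid)

# Plot the density surface; rotating it shows large x1 paired with large x2.
fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")
ax.plot_surface(X1, X2, Z, cmap="viridis")
ax.set_xlabel("x1")
ax.set_ylabel("x2")
ax.set_zlabel("density")
plt.show()
```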

So we can also talk about the conditional distribution, assuming Sigma is positive

definite. The conditional distribution of the

multivariate normal distribution is also multivariate normal.

In particular, X_2, given that X_1 equals little x_1, is multivariate normal with

mean vector mu_2.1 and variance-covariance matrix Sigma_2.1,

where mu_2.1 is given to us by this

expression here, and Sigma_2.1 is given to us by this expression here.

And we can get some intuition for this result by just imagining the following

situation: so we've got X_1 down here, we have X_2 over here, and imagine we

plot some points from X_1 and X_2; if you like, we generate X_1 and X_2 from

some distribution, from the bivariate normal distribution in particular.

So the mean of X_1 is, let's say, mu_1, and the mean of X_2 is mu_2.

Okay. Now, what if I tell you that we observe

that X_1 was equal to this little value x_1?

Well, if that's the case, then you can come up here and you'll see that X_2 is more

likely than not to be in this region, as I'll circle right here.

So, in fact, you would expect the conditional mean, given X_1 equals little x_1,

to be maybe somewhere around here. And this would be mu_2.1, okay?

Likewise, you can see, just from this figure again, that the variance of X_2 would have

shrunk, because knowing something about X_1 would

give us information about X_2, and that would decrease our uncertainty about the

location of X_2. And in fact, this expression here tells us

how to actually do this mathematically.

So those are the conditional distributions: the conditional distribution of a

multivariate normal is again multivariate normal.

We also mention that the linear combination, AX plus a, of a multivariate

normal random vector X is normally distributed with mean A times the

expected value of X plus little a, and covariance matrix A times the covariance

of X times A transpose.
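
In symbols, the result just stated is

```latex
AX + a \;\sim\; \mathrm{MVN}\big(A\,\mathbb{E}[X] + a,\; A\,\mathrm{Cov}(X)\,A^\top\big).
```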