
We're now going to review the concepts of conditional expectation and conditional variance. We'll see the conditional expectation identity as well as the conditional variance identity, and we'll see an example where we put them to work. This material is useful because we will use it later in the course when we discuss credit derivatives.

Let X and Y be two random variables. Then the conditional expectation identity states that the expected value of X can be calculated as follows: we first compute the expected value of X conditional on Y, and then we compute the expected value of that quantity.

Likewise, the conditional variance identity states that the variance of X is the sum of two quantities: first, the variance of the expected value of X given Y; and second, the expected value of the variance of X given Y.
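Written in symbols, the two identities are:

E[X] = E[ E[X | Y] ]
Var(X) = Var( E[X | Y] ) + E[ Var(X | Y) ]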

One thing I want to emphasize here is that the expected value of X given Y and the variance of X given Y are both functions of Y, and are therefore random variables themselves. So, for example, I could write the expected value of X given Y as g(Y), say, and the variance of X given Y as h(Y). Since g(Y) and h(Y) are random variables, I can write the expected value of X as the expected value of the random variable g(Y), and the variance of X as the variance of the random variable g(Y) plus the expected value of the random variable h(Y).

So those are the conditional expectation and conditional variance identities, and they can be very useful in many applications. We'll see one application here.
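As a quick numerical illustration (my own addition, not from the lecture), here is a small Python sketch that checks both identities by simulation on a simple made-up two-stage model: Y is a fair coin flip, and X given Y is normal with mean 2Y and standard deviation 1 + Y, so that g(Y) = E[X | Y] = 2Y and h(Y) = Var(X | Y) = (1 + Y)^2.

```python
import random
import statistics

random.seed(0)
N_SIM = 200_000

# Two-stage model (illustrative assumption): Y is a fair coin flip,
# and X given Y is Normal with mean 2*Y and standard deviation 1 + Y.
ys = [random.randint(0, 1) for _ in range(N_SIM)]
xs = [random.gauss(2 * y, 1 + y) for y in ys]

# Direct estimates of E[X] and Var(X).
mean_x = statistics.fmean(xs)
var_x = statistics.pvariance(xs)

# Via the identities, using g(Y) = E[X|Y] = 2Y and h(Y) = Var(X|Y) = (1 + Y)^2.
g = [2 * y for y in ys]
h = [(1 + y) ** 2 for y in ys]
mean_via_identity = statistics.fmean(g)                            # E[g(Y)]
var_via_identity = statistics.pvariance(g) + statistics.fmean(h)   # Var(g(Y)) + E[h(Y)]

print(mean_x, mean_via_identity)   # both close to 1.0
print(var_x, var_via_identity)     # both close to 3.5
```

For this model the exact answers are E[X] = 1 and Var(X) = Var(2Y) + E[(1 + Y)^2] = 1 + 2.5 = 3.5, and both routes to each answer should agree up to simulation noise.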

We want to compute a random sum of random variables. In particular, we're going to let W = X_1 + X_2 + ... + X_N, where the X_i's are IID with mean mu_x and variance sigma_x squared, but where N is also a random variable, assumed to be independent of the X_i's. So the question that arises is the following: what is the expected value of W?

Well, I can compute the expected value of W using the conditional expectation identity: the expected value of W is equal to the expected value of the expected value of W given N. Now think about the inner quantity, the expected value of W given N. It is the expected value, given N, of the sum from i = 1 to N of the X_i's. Because N is a constant once we condition on it, this equals the sum from i = 1 to N of the expected values of the X_i's. Each of those expected values is mu_x, and there are N terms, so the expected value of W given N is N times mu_x. Finally, mu_x is a constant, so we can take it outside the outer expectation, and we're left with mu_x times the expected value of N. So that's how we compute the expected value of W. How about the variance of W?
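To recap the mean calculation in symbols before turning to the variance:

E[W | N] = E[ X_1 + ... + X_N | N ] = N * mu_x
E[W] = E[ E[W | N] ] = E[ N * mu_x ] = mu_x * E[N]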

Well, we can compute the variance of W using the conditional variance identity: the variance of W is the variance of the expected value of W given N, plus the expected value of the variance of W given N. We've already calculated the expected value of W given N; it's equal to N times mu_x. As for the variance of W given N: given N, the sum consists of N IID random variables, and the variance of a sum of N IID random variables is simply N times the variance of one of them, which is sigma_x squared. So the variance of W given N is N times sigma_x squared. Now, in the variance of mu_x times N, mu_x is a constant, so it comes outside the variance as a square, leaving mu_x squared times the variance of N. And in the expectation of N times sigma_x squared, sigma_x squared is a constant, so it comes outside the expectation, leaving sigma_x squared times the expected value of N. So that's how we compute the variance of W.

So here's an example with chickens and

eggs. A hen lays N eggs, where N is Poisson with parameter lambda. Each egg hatches and yields a chicken with probability p, independently of the other eggs.

Let K be the number of chickens. So the first question I want to ask is: what is the expected value of K given N? One of the reasons I want to ask this question is that I want to introduce indicator functions, which are often very useful in probability, and in fact we'll use them later in the course. We'll be using indicator functions later in the course to describe the event of companies defaulting on their bonds, so we'll use them to compute the expected number of defaults in a basket of bonds, for example.

So right now is a good place to introduce these indicator functions. What we're going to do is write the total number of chickens, K, as the sum from i = 1 to N of 1 subscript H_i, where H_i is the event that the ith egg hatches. In particular, 1 subscript H_i is the indicator function of H_i, and it takes on two possible values: it takes on the value 1 if the ith egg hatches, and it takes on the value 0 otherwise. That's how an indicator function works in general: it takes on two values, 1 and 0, namely 1 if the event in question occurs, and 0 otherwise.
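In symbols, for a general event H:

1_H = 1 if H occurs, and 0 otherwise
E[ 1_H ] = 1 * P(H) + 0 * P(H does not occur) = P(H)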

In this particular example, the event in question is H_i, the event that the ith egg hatches. So we've written K as the sum from i = 1 to N of these indicator functions. It's also clear that the expected value of one of these indicator functions is easily computed: it takes on the value 1 with probability p and the value 0 with probability 1 minus p, so the expected value of the indicator function 1 subscript H_i is equal to p. Now, the expected value of K given N is the expected value of this sum given N, and N is a constant at this point because we have conditioned on its value. So we can take the expectation inside the summation, and we know the expected value of each indicator function is p. There are N of these terms, so we get Np.
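Collecting these steps in symbols:

E[K | N] = E[ 1_{H_1} + ... + 1_{H_N} | N ] = sum from i = 1 to N of E[ 1_{H_i} ] = N * p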

So therefore the expected value of K given N equals N times p. Which of course is what you'd expect: if you've got N eggs and each of them hatches with probability p, you would expect the total number of chickens to be N times p.

So now we can use the conditional expectation identity to compute the expected value of K. The expected value of K is equal to the expected value of the expected value of K given N. But we have just calculated that quantity: it's Np. So the expected value of K is the expected value of Np. Now p is a constant, so it can come outside the expectation, leaving p times the expected value of N. And the expected value of N? Well, N is Poisson, we're told, and if we recall, the expected value of a Poisson random variable is equal to lambda, so that's where the lambda comes from. And so we see that the expected number of chickens is equal to lambda times p.
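As a final check (again my own addition, with lambda = 3 and p = 0.4 chosen arbitrarily for illustration), here is a standard-library Python simulation of the hen-and-eggs example. The lecture computes E[K] = lambda times p; the conditional variance identity additionally gives Var(K) = p^2 * Var(N) + p(1 - p) * E[N] = lambda * p, and the simulation is consistent with both.

```python
import math
import random
import statistics

random.seed(42)
LAM, P = 3.0, 0.4   # lambda and hatching probability (illustrative choices)
N_SIM = 200_000

def sample_poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below exp(-lam).
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

ks = []
for _ in range(N_SIM):
    n_eggs = sample_poisson(LAM)                       # N ~ Poisson(lambda)
    # K is the sum of the indicator functions 1_{H_i}, one per egg.
    k = sum(random.random() < P for _ in range(n_eggs))
    ks.append(k)

print(statistics.fmean(ks), LAM * P)      # E[K] should be close to lambda * p = 1.2
print(statistics.pvariance(ks), LAM * P)  # Var(K) should also be close to lambda * p
```

That both the mean and the variance come out as lambda times p is no accident: a Poisson number of eggs, thinned independently with probability p, yields a Poisson number of chickens with parameter lambda times p.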
