0:05

Welcome back to Intuitive Introduction to Probability.

In the last lecture, I gave you some intuitive definitions of how we can define probabilities: the classical definition, the empirical probability definition, and the subjective probability definition. In this lecture, I want to be a little more precise and give correct definitions of what a probability exactly is. And we will return to those three definitions.

0:37

So, we need a little language to describe the probability setting. First, there is a random experiment. That sounds kind of strange, as if we were in a lab. But in probability theory, anything that has an uncertain outcome is called a random experiment. That could be rolling a die, pulling a card, or a spin at a roulette table. But it also applies to the real world: the weather next Monday, or, if you think about the stock price of your favorite company, the stock price next Tuesday. We don't know those. Those are uncertain outcomes, and therefore, in our language, those are random experiments.

1:19

Now, what can happen in a random experiment? The possibilities are called basic outcomes. For a die it's very easy: one, two, three, four, five, six are the basic outcomes. If you think about the exact weather or the exact stock price, this is already a little trickier, and we will see examples as we go along. The collection of all basic outcomes is called the sample space. And I apologize, we need a little notation, our first in this class: S for the sample space. Now, when we want to define probabilities, we need subsets, that is, some elements of the sample space, perhaps not all of them. Those are called events, and again we need a little notation: events are typically denoted by capital letters A, B, C, and so on.

2:10

Now, having said this, we can define probability: the chance or likelihood that an uncertain event will occur. It is a number between 0 and 1. Some people prefer to use percentages, 0% to 100%. Here, be careful: a probability cannot be larger than 100%, it cannot be 120%. It also cannot be negative, say minus 10%. Sometimes people confuse growth rates, which can be negative or larger than 100%, with probabilities. So please be careful: probabilities are numbers between 0 and 1, that is, 0% and 100%.

And now we can give precise definitions for the three probability concepts. The classical probability concept rests on an important assumption: all basic outcomes are equally likely. For a fair die, for example, that is given; every number, one, two, three, four, five, six, is equally likely. Then the probability of an event is the number of basic outcomes in the event divided by the total number of basic outcomes in the sample space. We'll see examples soon. In addition to a fair die, the assumption holds in a lottery, at a roulette table, and so on. But this key assumption, that all basic outcomes are equally likely, is often not satisfied in real-world applications. Therefore this beautiful definition, while helpful when you play a card game with your buddies or a dice game with some kids, is essentially useless for real-world applications. That's why we need more definitions.

4:07

If you have data and can derive proportions from historical data, we get to the empirical probability definition, also called the relative frequency probability. The idea here is that we have repeated trials of an experiment. For example, I may look at the weather on a particular day over the last 20 years and ask whether I can use some average there to make a prediction: what's the probability it will rain, what's the probability it will be sunny? Or, thinking of the stock market, people in finance love to look at the last 250 trading days and ask how many days the stock went up and how many days it went down, and based on that derive empirical probabilities. The definition says: the number of times an event occurred in a series of trials, divided by the number of trials. That is the empirical probability definition, also used in medicine and the pharmaceutical industry whenever we do drug testing.
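As a quick sketch (my own Python, not part of the lecture), the relative frequency definition is just a count of occurrences divided by the number of trials; the daily stock moves below are made-up illustration data:

```python
# Empirical (relative frequency) probability: how often an event
# occurred in a series of trials, divided by the number of trials.
# The daily stock moves are invented data for illustration only.
daily_moves = ["up", "down", "up", "up", "down",
               "up", "down", "up", "up", "down"]

trials = len(daily_moves)                              # number of trials
up_days = sum(1 for move in daily_moves if move == "up")  # event count

p_up = up_days / trials
print(p_up)  # 6 up-days out of 10 trials -> 0.6
```

With 250 real trading days instead of these 10 made-up ones, the same ratio would give the kind of empirical probability the lecture describes.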

5:16

Sometimes things get even worse: you don't have data, you have no clue, and at that point we go to the subjective probability definition. This is very important, for example, in new product development: you bring out a brand-new product, a really disruptive innovation, and you have no idea whether your customers will like it or not. Maybe you have some of what managers like to call gut feeling, or experience. And based on that experience, you say: I think there's a 75% chance this will be a successful product.

5:51

In that case we talk about subjective probabilities, which are very important in managerial decision making and everyday decision making.

Now, probabilities have to satisfy some rules, and I have to give you three of them. They look a little mathematical, but there is nothing to prove here; they are simply postulates, also called axioms. After the Russian mathematician who was the first to write them down, in 1933, they are also called Kolmogorov's axioms.

6:34

Rule number one: the probability of the entire sample space, P(S), must be 1. Something must happen when we do our random experiment, so P(S) = 1. The second rule is very intuitive: any probability is a number between 0 and 1. It can't be larger than 1 and cannot be negative; please keep that in mind. And finally the third, slightly more complicated rule: if you have two events A and B that have no elements in common, also called disjoint by mathematicians, then the probability that A or B happens, that is, the probability of A union B, equals the sum of the individual probabilities, P(A) + P(B).
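The three axioms can be checked mechanically. Here is a minimal sketch in Python (my own code, not from the lecture), using the fair die as the probability model:

```python
# A minimal check of Kolmogorov's three axioms for a fair die.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                      # sample space
p = {outcome: Fraction(1, 6) for outcome in S}  # equally likely outcomes

def prob(event):
    """P(event) = sum of the probabilities of its basic outcomes."""
    return sum(p[outcome] for outcome in event)

# Axiom 1: something must happen, P(S) = 1.
assert prob(S) == 1
# Axiom 2: every probability lies between 0 and 1.
assert all(0 <= p[outcome] <= 1 for outcome in S)
# Axiom 3: for disjoint events, P(A ∪ B) = P(A) + P(B).
A, B = {2, 4, 6}, {1, 5}
assert A.isdisjoint(B)
assert prob(A | B) == prob(A) + prob(B)
print(prob(A | B))  # 5/6
```

Using exact fractions instead of floats keeps the equality checks in the axioms exact.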

If that looks a little tricky to you, let's look at an example and go back to a fair die.

7:25

What's the random experiment? I'm rolling a fair die; I don't know what's going to happen. That's my experiment. What are the possible outcomes? One, two, three, four, five, six, and together they build the sample space.

7:38

Now, I told you about events. Events are subsets of S, and here I pick an event A, the even numbers {2, 4, 6}, and an event B, {1, 5}. Now let's look at A and B and their probabilities and see how this works.

8:10

Since we believe it's a fair die, we can use the classical definition. All six numbers are equally likely, and so I'm allowed to just divide. Clearly P(S), the probability of getting some number one through six, is equal to one. Seven cannot happen, zero cannot happen, pi cannot happen. Now let's look at the probabilities of the two events. A has three elements, 2, 4, 6; 3 out of 6 is 1/2 = 0.5 = 50%. B only has two elements, so the probability of B is 2/6 = 1/3.
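The classical definition, count of favorable outcomes over count of all outcomes, can be written out directly. A sketch in Python (my own illustration of the formula, not the lecture's code):

```python
# Classical probability: P(event) = |event| / |S|, valid only
# because all basic outcomes of a fair die are equally likely.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space of a fair die
A = {2, 4, 6}            # event: an even number
B = {1, 5}               # event: a one or a five

def classical_prob(event, sample_space):
    return Fraction(len(event), len(sample_space))

print(classical_prob(S, S))  # 1
print(classical_prob(A, S))  # 1/2
print(classical_prob(B, S))  # 1/3
```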

8:53

Now, A union B. Hopefully you remember from your middle school math classes: if I have two sets and build the union, I take all the elements together. So if I take A, {2, 4, 6}, union {1, 5}, I get {1, 2, 4, 5, 6}. The probability of A union B, of either A or B happening, is 5 elements divided by 6, so 5/6. Notice that those two events have no elements in common; there is no number that is in both A and B. So they are disjoint, their intersection is the empty set, and therefore I can use my probability rule: the probability of A union B is the sum of P(A) and P(B). 3/6 plus 2/6 is 5/6, and guess what, that's exactly the answer we saw before.

Those were the three axioms, the fundamental rules. From them we can derive further rules that are very, very helpful. First, the complement rule.

10:06

What's the complement of a set? The complement of an event A consists of all the elements of S that are not in A. And not surprisingly, the complement rule says that the probability that the opposite of A happens is just one minus the probability that A happens.

10:26

And then we have the addition rule, the general addition rule, which always holds even when A and B are not disjoint, when there is something in the intersection. In that case the rule gets a little more complicated: the probability of A union B is P(A) + P(B), but now I need to subtract the probability of the intersection, because otherwise there would be some double counting. Let's look again at our little example.

10:55

What's the opposite of an even number, 2, 4, 6? You see it: the odd numbers. The complement consists of the odd numbers 1, 3, and 5. The probability of an odd number is 1 minus the probability of the even numbers: 1 - 1/2, bingo, is 1/2 again.
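The complement calculation can be sketched the same way (again my own Python, with the die model from before):

```python
# Complement rule for the fair die: P(not A) = 1 - P(A),
# with A = {2, 4, 6} the even numbers.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}

def prob(event):
    return Fraction(len(event), len(S))

complement_A = S - A               # {1, 3, 5}, the odd numbers
print(prob(complement_A))          # 1/2, counted directly
print(1 - prob(A))                 # 1/2, via the complement rule
```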

Now let's look at an event C with the elements 1, 2, 3, 4. A union C is {1, 2, 3, 4, 6}: five numbers, five out of six, so the probability should be 5/6. But if I naively use the rule P(A union C) = P(A) + P(C) and add A's probability of 3/6 to C's probability of 4/6, I get 7/6. That's not a valid probability, because I'm double counting the numbers 2 and 4, which are in both A and C. Therefore I need to subtract them out, and that's why we have this general rule. And bingo, I again get the right answer of 5/6.
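The double-counting argument above can be traced in a short sketch (my own Python, same die model):

```python
# General addition rule for the die events A = {2, 4, 6} and
# C = {1, 2, 3, 4}: P(A ∪ C) = P(A) + P(C) - P(A ∩ C).
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A, C = {2, 4, 6}, {1, 2, 3, 4}

def prob(event):
    return Fraction(len(event), len(S))

naive = prob(A) + prob(C)                  # 7/6: double counts 2 and 4
correct = prob(A) + prob(C) - prob(A & C)  # subtract the overlap {2, 4}
print(naive)    # 7/6, not a valid probability
print(correct)  # 5/6, matches prob(A | C) counted directly
```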

To summarize this lecture, I gave you formal definitions of the three probability concepts that we have. Very important: familiarize yourself with them. It's not always the classical probability that we learn as kids as soon as we play a dice game or a card game; there are definitions that matter more for real-world decision making, the empirical probability definition and the subjective probability definition. I showed you the fundamental rules, also called the axioms of probability, and finally two derived rules, which are very helpful in applications, and we will see them in action in the next couple of lectures. Thanks for your attention. I look forward to seeing you back in the next lecture.
