0:13
In this lecture we will learn about Bayes Rule,
a rather complicated looking formula that has many applications.
But don't worry, you don't actually have to use a complicated formula or
memorize it.
I will show you how we can easily do the calculations in probability tables and
how it's actually rather easy to do the relevant calculations.
Before we look at an ugly formula, let's look at a small,
simple example to introduce some ideas.
1:04
Good parts have a probability of 90%; the opposite,
bad parts, have a probability of 10%.
The company looked at some recent data to see whether
there is any difference between its suppliers, S, T, and U.
And here's the data: among the good parts, 60% came from S,
25% from T, and the remaining 15% from supplier U.
Among the bad parts, 40% came from supplier S, and 30% each from T and U.
1:41
And now our manufacturer is asking the following question.
Which of my suppliers delivers the best parts,
the fewest defective parts, the smallest proportion of bad parts?
And which is my worst supplier?
So in the language of probability, the manufacturer is asking the following questions.
What's the probability of a good part given that it comes from supplier S?
What's the probability of a good part given that it came from T?
What's the probability of a good part given that it came from U?
We can't answer these questions yet; let's look at our data.
Our data set tells us something different.
2:28
We learned the probability of good, 0.9, and the probability of bad, 0.1.
And then the other probabilities are conditional probabilities, but
they go the wrong way.
We have the probability of S given good because we were told,
among the good parts, 60% came from S.
And so on; at the bottom of the slide, you see the other numbers.
But what we are interested in is the probability of good given S.
So what should we do?
Let's create our probability table with the data that's given.
Notice we have good and bad parts on the one hand, and
we have three suppliers, S, T, and U.
For good and bad we were given these numbers, 0.9 and 0.1.
So we can fill in the right margin of our little table, and
the sum definitely should be one.
3:25
In the interior, we can now use the general multiplication rule for
dependent events to create the intersection probability.
For example, the probability of a part being good and
coming from S is 0.6 x 0.9 = 0.54, and so on, all the way to
the probability of bad and U, which is 0.3 x 0.1 = 0.03.
So we can fill in the probabilities in the interior,
add up every column, and what do we get?
Voila, there's our complete probability table.
So we see 58% of all parts are from supplier S,
25.5% of all parts are from supplier T, and
the remaining 16.5% are from supplier U.
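If you like to check such calculations on a computer, here is a minimal sketch in Python of the same table calculation; the variable names (p_quality, joint, and so on) are my own, but the numbers are exactly the ones from the example.

    # Given: marginals for part quality and conditionals for supplier given quality
    p_quality = {"good": 0.9, "bad": 0.1}
    p_supplier_given_quality = {
        "good": {"S": 0.60, "T": 0.25, "U": 0.15},
        "bad":  {"S": 0.40, "T": 0.30, "U": 0.30},
    }

    # Interior of the table: joint probabilities via the general multiplication rule
    joint = {
        (q, s): p_supplier_given_quality[q][s] * p_quality[q]
        for q in p_quality
        for s in ("S", "T", "U")
    }

    # Bottom margin: add up each column to get the supplier probabilities
    p_supplier = {s: joint[("good", s)] + joint[("bad", s)] for s in ("S", "T", "U")}
    print(p_supplier)  # roughly {'S': 0.58, 'T': 0.255, 'U': 0.165}, up to floating-point rounding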
Now we can calculate the probabilities we really care about, here we go.
Probability of good given S, remember how we do this.
We take the probability from the interior, in our case 0.54,
the joint probability, and divide it by the marginal probability of supplier S,
0.58, and what do we learn?
93.1% of the parts that come from supplier S are good, or
more formally, in the language of probability,
the probability of good given S is 0.931.
You see all the calculations, and what do we learn?
We see that the proportion of good parts is the largest for supplier S.
Put differently, the proportion of bad parts is the smallest for
supplier S, and supplier U is actually the worst in this little case.
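And a small follow-up sketch, again with my own variable names, that reads the joint probabilities and column totals off the completed table and flips the condition:

    # P(good and supplier), from the table interior
    joint_good = {"S": 0.54, "T": 0.225, "U": 0.135}
    # Column totals from the bottom margin
    p_supplier = {"S": 0.58, "T": 0.255, "U": 0.165}

    # Flip the condition: P(good | supplier) = P(good and supplier) / P(supplier)
    p_good_given = {s: joint_good[s] / p_supplier[s] for s in ("S", "T", "U")}
    print(p_good_given)  # roughly {'S': 0.931, 'T': 0.882, 'U': 0.818}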
5:27
We flipped the conditional probabilities. This casual language
of flipping probabilities around is actually very popular among people in
probability theory, and that's why I use this everyday language.
So we were given the probabilities of S given good, U given bad, and so on.
And in the end, we calculated the reverse conditional probabilities of
good given S, good given T, all the way to bad given U.
6:01
What we did in this little toy example is actually a very general approach.
There's a general rule which does exactly what we just did,
and that is the famous Bayes Rule.
Let me derive it for you.
Recall the General Multiplication Rule that we saw before for
conditional probabilities, which says that the intersection probability
equals the conditional probability times the probability of the condition.
If we use this rule
once with event B as the condition, and
once with event A as the condition, and then set these two right-hand sides equal,
6:44
we get the formula at the bottom of the slide.
So the probability of A given B equals the probability of
B given A, times the probability of A, divided by the probability of B.
Notice on the left we have the probability of A given B, and
on the right we have the probability of B given A. So,
if you give me B given A and the marginal probabilities,
I can flip the condition around using this formula.
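Written out in symbols, the derivation just described is: by the general multiplication rule,

    P(A and B) = P(A | B) · P(B)   and   P(A and B) = P(B | A) · P(A).

Setting the two right-hand sides equal and dividing both sides by P(B) gives

    P(A | B) = P(B | A) · P(A) / P(B).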
7:19
Now, we can go a step further and say that sometimes you may not
have the probability of B.
That also happened in our little calculation.
We can calculate it by adding up the elements in a probability table; for
completeness, I gave you the formula at the top of the slide.
If we now take this formula and put it into the flipping formula,
we get the formula at the bottom with the yellow background.
And that's the famous Bayes Rule for two events.
In our case, we just had good and bad;
in general language, A and A complement.
And what this Bayes Rule says is that, knowing the conditional probabilities of
B given A and of B given the complement of A, we can flip the conditionals around and
calculate the probability of A given B, and
we could also calculate the probability of the complement of A given B.
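In symbols, for the two events A and its complement Aᶜ, the rule just described is

    P(B) = P(B | A) · P(A) + P(B | Aᶜ) · P(Aᶜ),

    P(A | B) = P(B | A) · P(A) / [ P(B | A) · P(A) + P(B | Aᶜ) · P(Aᶜ) ].

In our example, with A = good and B = the part came from S, this gives
P(good | S) = (0.6 · 0.9) / (0.6 · 0.9 + 0.4 · 0.1) = 0.54 / 0.58 ≈ 0.931,
exactly the number we read off the probability table.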
8:21
There's nothing special about our example,
good and bad, or A and A complement; this scales up.
Remember, our probability tables can be as large as we want, and
the same is true for Bayes Rule.
So if you have not just two but
a larger number m of mutually exclusive and collectively exhaustive events,
then here you see how the formula scales up, and
as you see, it looks kind of ugly.
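In standard notation, the scaled-up formula is

    P(A_i | B) = P(B | A_i) · P(A_i) / [ P(B | A_1) · P(A_1) + ... + P(B | A_m) · P(A_m) ],   for i = 1, ..., m,

where A_1, ..., A_m are the m mutually exclusive and collectively exhaustive events.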
And so this formula usually scares my students.
So I learned over the years to de-emphasize the actual rule and
this nasty looking formula, and to teach it via simple examples.
As in the example I showed you at the beginning, the best way is to fill in the probability
table and then ask yourself: which conditional probability do I really need?
Calculate it using the definition of conditional probability, and
that's essentially applying this complicated looking formula.
9:47
This concludes our module on conditional probabilities.
As I said in the beginning of the module,
this is not an easy concept; I showed you a few examples.
In the next module I will focus very much on the concepts of
conditional probability, dependence, and independence in the context of some
real-world problems, I could even say real-world disasters that happened.
And maybe you still think these concepts are very abstract,
and you wonder: do I really need this in everyday life?
The answer is yes.
Yes, and I will show you some cool applications, so please come back for
our next module.
Thank you very much, and
enjoy the session with the TA on calculating some probabilities.