Conditional probability & Bayes rule

How does one compute a conditional probability? First, consider the case of equally likely outcomes. In view of the new information, the occurrence of the condition B, only the outcomes contained in B still have a non-zero chance to occur. Counting only such outcomes, the unconditional probability of A,

P {A} = (number of outcomes in A) / (number of outcomes in Ω),

is now replaced by the conditional probability of A given B,


P {A | B} = (number of outcomes in A ∩ B) / (number of outcomes in B) = P {A ∩ B} / P {B}.

This ratio motivates the general definition of conditional probability.

Conditional probability:        P {A | B} = P {A ∩ B} / P {B}

Intersection, general case:     P {A ∩ B} = P {B} P {A | B}
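For equally likely outcomes, the counting definition and the ratio P {A ∩ B} / P {B} agree. A minimal sketch, using a hypothetical two-dice sample space (the events chosen here are my own illustration):

```python
from itertools import product
from fractions import Fraction

# Hypothetical example: roll two fair dice -> 36 equally likely outcomes.
omega = list(product(range(1, 7), repeat=2))

# Illustrative events: A = {sum is 8}, B = {first die is even}.
A = {w for w in omega if w[0] + w[1] == 8}
B = {w for w in omega if w[0] % 2 == 0}

# Conditional probability by counting: P(A | B) = |A ∩ B| / |B|.
p_A_given_B = Fraction(len(A & B), len(B))

# Equivalently, P(A | B) = P(A ∩ B) / P(B).
p_AB = Fraction(len(A & B), len(omega))
p_B = Fraction(len(B), len(omega))
assert p_A_given_B == p_AB / p_B

print(p_A_given_B)  # → 1/6
```

Both routes give the same answer, which is exactly why the ratio P {A ∩ B} / P {B} is taken as the general definition.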


Example. Ninety percent of flights depart on time. Eighty percent of flights arrive

on time. Seventy-five percent of flights depart on time and arrive on time.

(a) You are meeting a flight that departed on time. What is the probability that it will

arrive on time?

(b) You have met a flight, and it arrived on time. What is the probability that it departed

on time?

(c) Are the events, departing on time and arriving on time, independent?

Solution. Denote the events,

A = {arriving on time} ,

D = {departing on time} .

We have:

P {A} = 0.8, P {D} = 0.9, P {A ∩ D} = 0.75.

(a) P {A | D} = P {A ∩ D} / P {D} = 0.75 / 0.9 = 0.8333.

(b) P {D | A} = P {A ∩ D} / P {A} = 0.75 / 0.8 = 0.9375.

(c) Events A and D are independent if and only if P {A ∩ D} = P {A} P {D}. Here P {A} P {D} = (0.8)(0.9) = 0.72, which differs from P {A ∩ D} = 0.75, so the events are dependent. Equivalently, P {A | D} ≠ P {A} and P {D | A} ≠ P {D}; actually, any one of these inequalities is sufficient to prove that A and D are dependent.

Further, we see that P {A | D} > P {A} and P {D | A} > P {D}. In other words, departing on time increases the probability of arriving on time, and vice versa. This perfectly agrees with our intuition.
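The three answers can be checked numerically; a short sketch using only the probabilities stated in the example:

```python
# Probabilities given in the flight example.
p_A = 0.8     # P{arriving on time}
p_D = 0.9     # P{departing on time}
p_AD = 0.75   # P{departing on time AND arriving on time}

# (a) P{A | D} = P{A ∩ D} / P{D}
p_A_given_D = p_AD / p_D
print(round(p_A_given_D, 4))   # → 0.8333

# (b) P{D | A} = P{A ∩ D} / P{A}
p_D_given_A = p_AD / p_A
print(round(p_D_given_A, 4))   # → 0.9375

# (c) Independent if and only if P{A ∩ D} = P{A} P{D}.
print(p_AD, p_A * p_D)         # 0.75 vs 0.72 -> dependent events
```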



Bayes Rule
The last example shows that two conditional probabilities, P {A | B} and P {B | A}, are
not the same, in general. Consider another example.
Example 2.32 (Reliability of a test). There exists a test for a certain viral infection
(including a virus attack on a computer network). It is 95% reliable for infected patients
and 99% reliable for the healthy ones. That is, if a patient has the virus (event V ), the test
shows that (event S) with probability P {S | V } = 0.95, and if the patient does not have
the virus, the test shows that with probability P { S̄ | V̄ } = 0.99, where S̄ and V̄ denote
the complements of S and V.

Consider a patient whose test result is positive (i.e., the test shows that the patient has the
virus). Knowing that sometimes the test is wrong, naturally, the patient is eager to know
the probability that he or she indeed has the virus. However, this conditional probability,
P {V | S}, is not stated among the given characteristics of this test. ♦

This example is applicable to any testing procedure including software and hardware tests,
pregnancy tests, paternity tests, alcohol tests, academic exams, etc. The problem is to
connect the given P {S | V } and the quantity in question, P {V | S}. This was done in the
eighteenth century by English minister Thomas Bayes (1702–1761) in the following way.
Notice that A ∩ B = B ∩ A. Therefore, using (2.8), P {B} P {A | B} = P {A} P {B | A} .
Solve for P {B | A} to obtain the Bayes Rule,

P {B | A} = P {A | B} P {B} / P {A} .


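Applied to the virus-test example, the Bayes Rule gives the patient the probability P {V | S} that they are looking for, provided the prior probability P {V } is known. A hedged sketch: the prevalence P {V } = 0.04 below is a hypothetical assumption of mine, not a number given in the example.

```python
# Bayes Rule sketch for the virus-test example.
# p_V = 0.04 is a HYPOTHETICAL prevalence, not given in the text.
p_V = 0.04                 # assumed prior probability of infection, P{V}
p_S_given_V = 0.95         # test reliability for infected patients, P{S|V}
p_notS_given_notV = 0.99   # test reliability for healthy patients, P{S̄|V̄}

# Law of Total Probability: P{S} = P{S|V}P{V} + P{S|V̄}P{V̄}.
p_S = p_S_given_V * p_V + (1 - p_notS_given_notV) * (1 - p_V)

# Bayes Rule: P{V|S} = P{S|V} P{V} / P{S}.
p_V_given_S = p_S_given_V * p_V / p_S
print(round(p_V_given_S, 4))
```

Even with a 95%/99% reliable test, a low prevalence makes P {V | S} noticeably smaller than the test's stated reliability, which is exactly why the two conditional probabilities must not be confused.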
Example  (Situation on a midterm exam). On a midterm exam, students X, Y ,
and Z forgot to sign their papers. Professor knows that they can write a good exam with
probabilities 0.8, 0.7, and 0.5, respectively. After the grading, he notices that two unsigned
exams are good and one is bad. Given this information, and assuming that students worked
independently of each other, what is the probability that the bad exam belongs to student
Z?
Solution. Denote good and bad exams by G and B. Also, let GGB denote two good and
one bad exams, XG denote the event “student X wrote a good exam,” etc. We need to find
P {ZB | GGB} given that P {G | X} = 0.8, P {G | Y } = 0.7, and P {G | Z} = 0.5.
By the Bayes Rule,

P {ZB | GGB} = P {GGB | ZB} P {ZB} / P {GGB} .

Given ZB, event GGB occurs only when both X and Y write good exams. Thus,
P {GGB | ZB} = (0.8)(0.7).
Event GGB consists of three outcomes depending on the student who wrote the bad exam.

Adding their probabilities, we get
P {GGB}
= P {XG ∩ Y G ∩ ZB} + P {XG ∩ Y B ∩ ZG} + P {XB ∩ Y G ∩ ZG}
= (0.8)(0.7)(0.5) + (0.8)(0.3)(0.5) + (0.2)(0.7)(0.5) = 0.47.
Then

P {ZB | GGB} = (0.8)(0.7)(0.5) / 0.47 = 0.5957.


In the Bayes Rule, the denominator is often computed by the Law of Total Probability.
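A quick numerical check of the midterm example, computing the denominator P {GGB} by exactly this Law of Total Probability (the variable and function names are my own):

```python
# Midterm-exam example: probability of a good exam for students X, Y, Z.
p_good = {"X": 0.8, "Y": 0.7, "Z": 0.5}

def p_outcome(bad_student):
    """Joint probability that `bad_student` wrote the bad exam
    and the other two students wrote good exams (independence assumed)."""
    p = 1.0
    for s, pg in p_good.items():
        p *= (1 - pg) if s == bad_student else pg
    return p

# Law of Total Probability: sum over which student wrote the bad exam.
p_GGB = sum(p_outcome(s) for s in p_good)          # 0.47

# Bayes Rule: P{ZB | GGB} = P{XG ∩ YG ∩ ZB} / P{GGB}.
p_ZB_given_GGB = p_outcome("Z") / p_GGB
print(round(p_GGB, 2), round(p_ZB_given_GGB, 4))   # → 0.47 0.5957
```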







