I was thinking through a probability situation the other day, and I noticed I was making an unspoken assumption, namely that if two events $A$ and $B$ are independent, then their complements $A^c$ and $B^c$ are also independent.
At first I thought that this probably wasn’t true, because I’ve never seen that particular result explicitly stated in any probability or statistics book I’ve ever read (and I’ve read a few over the years), and because human intuition for probability is notoriously unreliable (e.g. even Paul Erdős got the Monty Hall problem wrong when he first heard it).
Well, it turns out to be correct. In fact, a stronger result is true, namely that if $A$ and $B$ are independent, so are $A$ and $B^c$ (which of course immediately implies that if any one of the pairs $(A, B)$, $(A, B^c)$, $(A^c, B)$, $(A^c, B^c)$ is independent, then so are all of the others). The proof is elementary, using the definition that $A$ and $B$ are independent means that $P(A \cap B) = P(A)P(B)$.
To be more formal about it:
Theorem: if $A$ and $B$ are independent, then so are $A$ and $B^c$.
Proof: We will show that $P(A \cap B) = P(A)P(B)$ implies $P(A \cap B^c) = P(A)P(B^c)$.

$A$ is the disjoint union of $A \cap B$ and $A \cap B^c$, so that

$P(A) = P(A \cap B) + P(A \cap B^c)$, and by using the independence of $A$ and $B$ we have

$P(A) = P(A)P(B) + P(A \cap B^c)$, and so

$P(A \cap B^c) = P(A) - P(A)P(B) = P(A)(1 - P(B))$, i.e. $P(A \cap B^c) = P(A)P(B^c)$.
Q.E.D.
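If you like to double-check this kind of result numerically, here is a quick sanity check (my own illustration, with made-up probabilities, not part of the proof): build a product sample space in which $A$ depends only on the first coordinate and $B$ only on the second, so the two events are independent by construction, then verify that $P(A \cap B^c) = P(A)P(B^c)$.

```python
from itertools import product

# Hypothetical marginal probabilities for the two independent events.
p, q = 0.3, 0.7  # P(A) and P(B)

# Outcomes are pairs (a, b) with a, b in {0, 1}; because the probability of
# each outcome is the product of the marginals, A and B are independent
# by construction.
space = {(a, b): (p if a else 1 - p) * (q if b else 1 - q)
         for a, b in product((0, 1), repeat=2)}

def P(event):
    """Probability of an event, given as a predicate on outcomes."""
    return sum(pr for outcome, pr in space.items() if event(outcome))

A = lambda o: o[0] == 1      # A: first coordinate is 1
not_B = lambda o: o[1] == 0  # B^c: second coordinate is 0

# The theorem says P(A ∩ B^c) = P(A) * P(B^c).
lhs = P(lambda o: A(o) and not_B(o))
rhs = P(A) * P(not_B)
print(abs(lhs - rhs) < 1e-12)  # True
```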
Theorem: if $A$ and $B$ are independent, then so are $A^c$ and $B^c$.
Proof: apply the above theorem twice in a row: first to $A$ and $B$ to conclude that $A$ and $B^c$ are independent, then (with the roles of the two events swapped) to $B^c$ and $A$ to conclude that $B^c$ and $A^c$ are independent.
I’ve no idea why these results aren’t explicitly mentioned in textbooks.
And it does make me wonder how many similarly fundamental results (in all fields of mathematics) are not being explicitly pointed out in textbooks and courses.
P.S. Just in case you want to see a more direct, self-contained proof of the $A^c$ and $B^c$ independence, here it is:

$P(A^c \cap B^c) = P((A \cup B)^c) = 1 - P(A \cup B) = 1 - P(A) - P(B) + P(A \cap B)$

$= 1 - P(A) - P(B) + P(A)P(B)$ by the independence assumption

$= (1 - P(A))(1 - P(B)) = P(A^c)P(B^c)$.