Complementary Independence

I was thinking through a probability situation the other day, and I noticed I was making an unspoken assumption, namely that if two events A and B are independent, then their complements \overline{A} and \overline{B} are independent.

At first I thought that this probably wasn't true, both because I've never seen that particular result explicitly stated in any probability or statistics book I've ever read (and I've read a few over the years), and because human intuition for probability is notoriously unreliable (e.g. even Paul Erdős got the Monty Hall problem wrong when he first heard it).

Well, it turns out to be correct. In fact, a stronger result is true, namely that if A and B are independent, so are A and \overline{B} (which of course immediately implies that if any one of the pairs of events (A,B), (\overline{A},B), (A,\overline{B}), (\overline{A},\overline{B}) is independent, then so are all the others). The proof is elementary, using the definition of independence: C and D are independent when P(C \cap D) = P(C)P(D).

To be more formal about it:

Theorem: if A and B are independent, then so are A and \overline{B}.
Proof: We will show that P(A \cap B) = P(A)P(B) implies P(A \cap \overline{B}) = P(A)P(\overline{B}).
A is the disjoint union of A \cap B and A \cap \overline{B}, so that
P(A) = P(A \cap B) + P(A \cap \overline{B}), and by using the independence of A and B we have
P(A) = P(A)P(B) + P(A \cap \overline{B}), and so
P(A \cap \overline{B}) = P(A) - P(A)P(B) = P(A)(1 - P(B)) = P(A)P(\overline{B}), i.e.
P(A \cap \overline{B}) = P(A)P(\overline{B}). Q.E.D.

Theorem: if A and B are independent, then so are \overline{A} and \overline{B}.
Proof: apply the above theorem twice in a row: first to A and B, giving that A and \overline{B} are independent, and then (since independence is symmetric in the two events) to \overline{B} and A, giving that \overline{B} and \overline{A} are independent. Q.E.D.
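
By the way, if you like to see this sort of thing numerically as well as algebraically, here's a minimal simulation sketch in Python (the values P(A) = 0.3 and P(B) = 0.6, the sample size, and the seed are arbitrary choices of mine, not anything canonical). It draws A and B independently and compares the empirical frequency of \overline{A} \cap \overline{B} with the product of the empirical frequencies of \overline{A} and \overline{B}:

```python
import random

# Arbitrary example values: P(A) = 0.3, P(B) = 0.6.
P_A, P_B, N = 0.3, 0.6, 1_000_000
rng = random.Random(0)

count_not_a = count_not_b = count_neither = 0
for _ in range(N):
    a = rng.random() < P_A  # event A occurs
    b = rng.random() < P_B  # event B occurs, drawn independently of A
    count_not_a += not a
    count_not_b += not b
    count_neither += (not a) and (not b)

# If the theorem holds, these two numbers should agree (up to sampling noise).
print(f"P(~A ∩ ~B)  ≈ {count_neither / N:.4f}")
print(f"P(~A) P(~B) ≈ {(count_not_a / N) * (count_not_b / N):.4f}")
```

Both printed numbers should land near 0.7 × 0.4 = 0.28.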

I’ve no idea why these results aren’t explicitly mentioned in textbooks.

And it does make me wonder how many similarly fundamental results (in all fields of mathematics) are never explicitly pointed out in textbooks and courses.

P.S. Just in case you want to see a more direct, self-contained proof of the independence of \overline{A} and \overline{B}, here it is:
P(\overline{A} \cap \overline{B})
= P(\overline{A}) - P(\overline{A} \cap B)
= (1 - P(A)) - (P(B) - P(A \cap B))
= 1 - P(A) - P(B) + P(A \cap B)
= 1 - P(A) - P(B) + P(A)P(B) by the independence assumption
= (1-P(A))(1-P(B))
= P(\overline{A}) P(\overline{B})
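
And if you'd rather have a machine check that last bit of algebra, here's a throwaway sympy sketch (assuming you have sympy installed), with the symbol p standing in for P(A) and q for P(B):

```python
import sympy as sp

p, q = sp.symbols('p q')  # p stands for P(A), q for P(B)

# The chain above ends at 1 - p - q + pq; it should equal
# (1 - p)(1 - q), i.e. P(~A) P(~B).
difference = sp.expand((1 - p) * (1 - q)) - (1 - p - q + p * q)
print(sp.simplify(difference))  # prints 0
```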

Author: Walter Vannini

Hi, I'm Walter Vannini. I'm a computer programmer and I'm based in the San Francisco Bay Area. Before I wrote software, I was a mathematics professor. I think about math, computer science, and related fields all the time, and this blog is one of my outlets. I can be reached via walterv at gbbservices dot com.