The joint probability distribution of two random variables is a function describing the probability of pairs of values occurring. For instance, consider a random variable $X$ that represents the number of heads in a single flip of a fair coin, and a random variable $Y$ that represents the number of heads in a different single flip of a fair coin. Then the joint distribution of $X$ and $Y$ is

$$P(X=x,\, Y=y) = \frac{1}{4} \quad \text{for all } x, y \in \{0, 1\},$$

since each of the four equally likely outcomes of the two flips produces one pair $(x, y)$.
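As a quick sanity check, here is a minimal Python sketch (an illustration, not part of the original text; the variable names and trial count are arbitrary choices) that estimates this joint distribution by simulation:

```python
import random
from collections import Counter

# Estimate the joint distribution of two independent fair coin flips.
# X and Y each count the number of heads (0 or 1) in one flip.
trials = 100_000
counts = Counter()
for _ in range(trials):
    x = random.randint(0, 1)  # heads in the first flip
    y = random.randint(0, 1)  # heads in the second flip
    counts[(x, y)] += 1

for pair in sorted(counts):
    # Each empirical probability should be close to 1/4.
    print(pair, counts[pair] / trials)
```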
In general, the joint distribution of two random variables is given by

$$P(X=x,\, Y=y) = P(X=x \mid Y=y)\, P(Y=y),$$
where $P(X=x \mid Y=y)$ is the probability that $X=x$ given that $Y=y$. When $X$ and $Y$ are independent, this value is equal to $P(X=x)$ (as the fact that $Y=y$ is irrelevant), and

$$P(X=x,\, Y=y) = P(X=x)\, P(Y=y).$$
This also allows the joint distribution to be used as an independence test: if the above relation holds for all $x$ and $y$, then $X$ and $Y$ are independent. Otherwise, they are not.
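To make the test concrete, here is a small Python sketch (an illustration added here, with an assumed joint table and tolerance) that checks the product relation for every pair of values:

```python
from itertools import product

# A joint distribution as a dict mapping (x, y) -> P(X=x, Y=y).
# This particular table is the two-fair-coin example above.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginals: P(X=x) sums over y; P(Y=y) sums over x.
p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Independent iff P(X=x, Y=y) = P(X=x) P(Y=y) for all x, y
# (up to floating-point tolerance).
independent = all(
    abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(xs, ys)
)
print(independent)  # True for the fair-coin table
```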
Notably, this relation immediately rearranges to Bayes' theorem, which is very important in conditional probability.
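Concretely, since the joint probability can be factored in either order,

$$P(X=x \mid Y=y)\, P(Y=y) = P(X=x,\, Y=y) = P(Y=y \mid X=x)\, P(X=x),$$

dividing both sides by $P(Y=y)$ (assuming it is nonzero) yields Bayes' theorem:

$$P(X=x \mid Y=y) = \frac{P(Y=y \mid X=x)\, P(X=x)}{P(Y=y)}.$$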
Suppose that the probability that a random person has a certain disease is $p$. A scientist develops a device which tests a person positive for the disease with probability $s$ when the person really has the disease. However, the same device also tests a person positive for the disease with probability $f$ when the person in fact does not have the disease. What is the probability that a person really has the disease when tested positive by the device?
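A solution using Bayes' theorem (the symbols $D$ for "has the disease" and $+$ for "tests positive" are introduced here for the computation): by Bayes' theorem,

$$P(D \mid +) = \frac{P(+ \mid D)\, P(D)}{P(+)} = \frac{s p}{s p + f (1 - p)},$$

where the denominator expands $P(+)$ by summing the joint probabilities $P(+,\, D)$ and $P(+,\, \text{not } D)$, exactly as in the marginalization identity below.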
If $X$ takes on the value $x$, $Y$ must take on some value, and so

$$P(X=x) = \sum_y P(X=x,\, Y=y),$$
which is the law of total probability and is related to the law of iterated expectation.
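For the two-coin example above, this recovers the marginal distribution of a single fair flip:

$$P(X=0) = P(X=0,\, Y=0) + P(X=0,\, Y=1) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}.$$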