A **random variable** is a variable whose value depends on the outcome of a random process. For example, the result of rolling a standard 6-sided die is a random variable that takes each of the values from 1 to 6 with probability \(\frac{1}{6}.\)

There are two types of random variables: discrete and continuous. The die roll above is an example of a discrete random variable, since the variable can take on only a finite (or countably infinite) number of distinct values. Choosing a real number uniformly at random from the interval \([0,1]\) is an example of a continuous random variable.

A random variable contains a lot of information. We can summarize this information to get an idea of the behaviour of the random variable over the long term. The **expected value** of a random variable is the weighted average of all possible outcomes. For example, if we roll a standard 6-sided die, there are 6 possibilities, each occurring with probability \(\frac{1}{6}\), so the expected value is \(\frac{1}{6}(1) + \frac{1}{6}(2) + \frac{1}{6}(3) + \frac{1}{6}(4) + \frac{1}{6}(5) + \frac{1}{6}(6) = 3.5\). We often denote the expected value of a random variable \(X\) by \(\mu\) or \(E[X].\) More generally, the formula is

\[ E[X] = \begin{cases} \sum_x xP(X=x), & \mbox{ if } X \mbox{ is a discrete random variable} \\ \int_x xf(x) \, dx, & \mbox{ if } X \mbox{ is a continuous random variable}. \end{cases}\]
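The discrete case of this formula is easy to check directly. Here is a minimal sketch computing the die-roll expectation from its probability mass function, using exact fractions to avoid floating-point rounding:

```python
from fractions import Fraction

# Expected value of a discrete random variable: E[X] = sum over x of x * P(X = x).
# Here X is a fair 6-sided die, so each face 1..6 has probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

expected_value = sum(x * p for x, p in pmf.items())
print(expected_value)  # 7/2, i.e. 3.5
```

The same pattern works for any finite probability mass function: replace `pmf` with the values and probabilities of your random variable.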

If we have two random variables \(X\) and \(Y,\) and constants \(a,b,\) then the following properties of expectation hold:

\[\begin{align} E[X + a] & = E[X] + a\\ E[bX] & = bE[X]\\ E[X + Y] & = E[X] + E[Y]\\ \end{align} \]

This is known as **linearity of expectation**, and it holds even when \(X\) and \(Y\) are not independent. Each of these statements follows easily from the definition of expected value, and they will be elaborated on in a later section.
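The claim that linearity holds without independence can be verified by brute-force enumeration. In this sketch, \(X\) is a fair die roll and \(Y\) is the indicator that the roll is even, so \(Y\) is completely determined by \(X\), yet \(E[X+Y] = E[X] + E[Y]\) still holds:

```python
from fractions import Fraction

# X = face value of a fair die; Y = 1 if the roll is even, else 0.
# Y depends entirely on X, so they are certainly not independent.
outcomes = range(1, 7)
p = Fraction(1, 6)  # each outcome is equally likely

E_X = sum(x * p for x in outcomes)
E_Y = sum((1 if x % 2 == 0 else 0) * p for x in outcomes)
E_sum = sum((x + (1 if x % 2 == 0 else 0)) * p for x in outcomes)

assert E_sum == E_X + E_Y  # linearity holds despite the dependence
print(E_X, E_Y, E_sum)  # 7/2 1/2 4
```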

## Worked examples

## 1. There are 2 bags, each containing balls numbered 1 through 5. One ball is removed from each bag. What is the expected value of the sum of the two balls?

Consider the following table, which lists the possible values of the first ball in the top row and the possible values of the second ball in the leftmost column. Each entry of the table is the sum of these two values.

\[ \begin{array} { l | l | l | l | l | l | } & 1 & 2 & 3 & 4 & 5 \\ \hline 1 & 2 & 3 & 4 & 5 & 6\\ \hline 2 & 3 & 4 & 5 & 6 & 7 \\ \hline 3 & 4 & 5 & 6 & 7 & 8 \\ \hline 4 & 5 & 6 & 7 & 8 & 9 \\ \hline 5 & 6 & 7 & 8 & 9 & 10 \\ \hline \end{array} \]

Let \(X\) be the random variable denoting the sum of these values. Then, we can see that the probability distribution of \(X\) is given by

\[\begin{array} { | l | l | l | l | l | l | l | l | l | l | } \hline x & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 \\ \hline P(X=x) & \frac{1}{25} & \frac{2}{25} & \frac{3}{25} & \frac{4}{25} & \frac{5}{25} & \frac{4}{25} & \frac{3}{25} & \frac{2}{25} & \frac{1}{25} \\ \hline \end{array} \]

As such, this allows us to calculate

\[ \begin{align} E[X] & = 2 \times \frac {1}{25} + 3 \times \frac {2}{25} + 4 \times \frac {3}{25} + 5 \times \frac {4}{25} + 6 \times \frac {5}{25} \\ & \qquad + 7 \times \frac {4}{25} + 8 \times \frac {3}{25} + 9 \times \frac {2}{25} + 10 \times \frac {1}{25} = 6. \end{align} \]

(*) How can we use the linearity of expectation to arrive at this result quickly?
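The full calculation above can be replicated by enumerating all 25 equally likely pairs. The sketch below does this with exact fractions, then checks it against the linearity shortcut: writing the sum as \(X = X_1 + X_2\), where \(X_i\) is the value of ball \(i\), gives \(E[X] = E[X_1] + E[X_2] = 3 + 3 = 6\).

```python
from fractions import Fraction
from itertools import product

# Enumerate all 25 equally likely (ball1, ball2) pairs and average their sums.
pairs = list(product(range(1, 6), repeat=2))
E_X = sum(Fraction(a + b, len(pairs)) for a, b in pairs)
print(E_X)  # 6

# Linearity shortcut: E[X] = E[ball1] + E[ball2], and each ball averages 3.
E_ball = Fraction(sum(range(1, 6)), 5)
assert E_X == 2 * E_ball
```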

## 2. \(n\) six-sided dice are rolled. What is the expected number of times \(5\) is rolled?

To determine the expected number of times 5 is rolled, let \(Y\) be the random variable for the number of times a \(5\) is rolled, and let \(Y_i\) be the indicator random variable that die \(i\) rolls a \(5\). It is easy to see that \(E[Y_i] = \frac{1}{6} \times 1 + \frac{5}{6} \times 0 = \frac{1}{6}.\) Since \(Y = \sum\limits_{i=1}^{n} Y_i\), by the linearity of expectation \(E[Y] = \sum\limits_{i=1}^{n} E[Y_i] = \frac{n}{6}\).

Note: We can also answer this question by noting that since the probability of getting each number is equal, the expected number of times we get each number is the same, and the sum of these expectations is \(n\), so for each number the expectation is \(\frac{n}{6}\).

Mathematically speaking, let \( Z_i \) be the random variable for the number of times \(i\) is rolled out of \(n\) throws. By symmetry, \( E[Z_i] \) is the same for every \(i\). Since each throw produces exactly one result, \( n = Z_1 + Z_2 + Z_3 + Z_4 + Z_5 + Z_6 \). This gives us

\[ n = E[ Z_1 + Z_2 + Z_3 + Z_4 + Z_5 + Z_6 ] = E[Z_1] + E[Z_2]+ E[Z_3]+E[Z_4]+E[Z_5]+E[Z_6] = 6 E[Z_i], \]

so \(E[Z_i] = \frac{n}{6}\) for each \(i\).
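The answer \(\frac{n}{6}\) can also be sanity-checked empirically. This is a minimal Monte Carlo sketch (the trial count and seed are arbitrary choices): roll \(n\) dice many times and average the number of 5s observed per trial.

```python
import random
from fractions import Fraction

def simulate_fives(n, trials=200_000, seed=0):
    # Monte Carlo estimate of E[Y]: average count of 5s over many rolls of n dice.
    rng = random.Random(seed)
    total = sum(sum(1 for _ in range(n) if rng.randint(1, 6) == 5)
                for _ in range(trials))
    return total / trials

n = 6
print(simulate_fives(n))  # close to 1.0
print(Fraction(n, 6))     # exact answer n/6 = 1
```

With 200,000 trials the estimate should land well within a few hundredths of the exact value \(\frac{n}{6}\).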
