# Continuous Random Variables - Definition

**Continuous random variables** describe outcomes in probabilistic situations where the possible values some quantity can take form a continuum, which is often (but not always) the entire set of real numbers $\mathbb{R}$. They are the generalization of discrete random variables to uncountably infinite sets of possible outcomes.

Continuous random variables are essential to models of statistical physics, where the large number of degrees of freedom in systems mean that many physical properties cannot be predicted exactly in advance but can be well-modeled by continuous distributions. In particular, quantum mechanical systems often make use of continuous random variables, since physical properties in these cases might not even have definite values.

## Definition of Continuous Random Variables

Recall that a random variable is a quantity which is drawn from a statistical distribution, i.e. it does not have a fixed value. A **continuous random variable** is a random variable whose statistical distribution is continuous. Formally:

A **continuous random variable** is a function $X$ on the outcomes of some probabilistic experiment which takes values in a continuous set $V$.

That is, the possible outcomes lie in a set which is formally continuous (in the sense of real analysis), which can be understood intuitively as having no gaps. The fact that $X$ is technically a function can usually be ignored for practical purposes outside of the formal field of measure theory. In applications, $X$ is treated as some quantity which can fluctuate, e.g. in repeated experiments, and which has statistical properties like mean and variance.
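
This practical view can be sketched in a short simulation: treat $X$ as a quantity that fluctuates across repeated experiments and estimate its mean and variance from many draws. The choice of a uniform distribution on $[0,1]$, the sample size, and the seed below are illustrative assumptions, not part of the definition.

```python
# A minimal sketch: simulate a continuous random variable X as a quantity
# that fluctuates across repeated experiments, then estimate its
# statistical properties from the samples.
# (Uniform-[0,1] distribution, sample size, and seed are illustrative.)
import random
import statistics

random.seed(0)  # for reproducibility

samples = [random.random() for _ in range(100_000)]  # repeated "experiments"

mean = statistics.fmean(samples)      # expect ≈ 1/2 for uniform on [0,1]
var = statistics.pvariance(samples)   # expect ≈ 1/12 ≈ 0.083

print(f"mean ≈ {mean:.3f}, variance ≈ {var:.3f}")
```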

In the next article on continuous probability density functions, the meaning of $X$ will be explored in a more practical setting.

## Which of the following are continuous random variables?

(1) The sum of the numbers on a pair of dice.

(2) The possible sets of outcomes from flipping ten coins.

(3) The possible sets of outcomes from flipping countably infinitely many coins.

(4) The possible values of the temperature outside on any given day.

(5) The possible times that a person arrives at a restaurant.

Solution:

(4) and (5) are the continuous random variables. Going through each case in order:

(1) There are $6 \times 6 = 36$ ordered outcomes for the pair of dice, but the sum takes one of only $11$ possible values ($2$ through $12$). This is sufficient to conclude that the sum is a discrete random variable, since the number of possible values is finite.

(2) Again, the number of possible outcomes is larger (there are $2^{10}$ ordered sequences of heads and tails), but it is still finite, so the same logic applies as in (1).

(3) This case is more interesting because there are infinitely many coins. There are in fact uncountably many possible outcome sequences, but cardinality alone does not make a set continuous: the set of infinite sequences of heads and tails is totally disconnected (topologically, it is homeomorphic to the Cantor set), so it has gaps everywhere rather than none. It is therefore not a continuous random variable.

(4) The temperature outside on any given day could be any real number in a given reasonable range. In particular, on no two days is the temperature *exactly* the same number out to infinite decimal places. Thus, the temperature takes values in a continuous set.

(5) This case is similar to (4): no two people ever arrive at *exactly* the same time out to infinite precision. The precise time a person arrives is a value in the set of real numbers, which is continuous. Note that this implies that the probability of arriving at any one given time is zero, a fact which will be discussed in the next article.
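
This zero-probability fact can be illustrated numerically: two arrival times drawn independently from a continuous distribution essentially never coincide exactly. The uniform draws, trial count, and seed below are illustrative assumptions.

```python
# Illustrative sketch: two independent "arrival times" drawn from a
# continuous distribution essentially never agree exactly, mirroring the
# fact that P(X = t) = 0 for any single fixed time t.
# (Uniform draws, trial count, and seed are assumptions for illustration.)
import random

random.seed(1)
trials = 100_000
coincidences = sum(
    random.random() == random.random()  # two independent arrival times
    for _ in range(trials)
)

print(coincidences)  # count of exact coincidences across all trials
```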

## Examples of Continuous Random Variables

See uniform random variables, normal distribution, and exponential distribution for more details.

**Uniform Random Variables**

A uniform random variable is one for which every value in its range is equally likely. For instance, a random variable that is uniform on the interval $[0,1]$ has the probability density function:

$f(x) = \begin{cases} 1 \quad & x \in [0,1] \\ 0 \quad & \text{ otherwise} \end{cases}.$
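
As a quick sanity check on this density, one can verify numerically that it integrates to $1$ over the real line. The integration range and step count below are arbitrary illustrative choices.

```python
# Numerical sanity check (illustrative): the uniform density on [0, 1]
# should integrate to 1 over the whole real line.
def f(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

# Midpoint Riemann sum over [-1, 2]; the range and step count are arbitrary.
a, b, n = -1.0, 2.0, 300_000
dx = (b - a) / n
total = sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

print(round(total, 3))  # total probability
```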

**Normal Random Variables**

A normal random variable is drawn from the classic "bell curve," the distribution:

$f(x) = \frac{1}{\sqrt{2\pi \sigma^2}} e^{-\frac{(x-\mu)^2}{2\sigma^2}},$

where $\mu$ and $\sigma^2$ are the mean and variance of the distribution, respectively. The peak of the normal distribution is centered at $\mu$ and $\sigma^2$ characterizes the width of the peak.
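
A short simulation can confirm that samples drawn from this distribution have mean $\mu$ and variance $\sigma^2$. The particular values $\mu = 5$, $\sigma = 2$, the sample size, and the seed are illustrative assumptions.

```python
# A minimal sketch: sampling a normal random variable and checking the
# sample mean and variance against mu and sigma^2.
# (mu, sigma, sample size, and seed are illustrative choices.)
import random
import statistics

random.seed(2)
mu, sigma = 5.0, 2.0

samples = [random.gauss(mu, sigma) for _ in range(200_000)]

mean = statistics.fmean(samples)     # expect ≈ mu = 5
var = statistics.pvariance(samples)  # expect ≈ sigma^2 = 4

print(f"mean ≈ {mean:.2f}, variance ≈ {var:.2f}")
```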

**Exponential Random Variables**

An exponential random variable is drawn from the distribution:

$f(x) = \lambda e^{-\lambda x},$

where $\lambda$ is the decay rate. This distribution has mean $\frac{1}{\lambda}$ and variance $\frac{1}{\lambda^2}$. Exponential random variables are often useful in measuring the times between events like radioactive decays. In this case the formula for the mean makes sense: the larger the value of $\lambda$, the faster the decay rate and the less time expected on average for one decay to occur.
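
This relationship between the decay rate and the mean waiting time can be checked by simulation: the sample mean of many exponential draws should approach $\frac{1}{\lambda}$ and the sample variance $\frac{1}{\lambda^2}$. The rate $\lambda = 0.5$, sample size, and seed below are illustrative choices.

```python
# A minimal sketch: waiting times modeled as exponential with rate lam;
# the sample mean should approach 1/lam and the sample variance 1/lam^2.
# (The rate, sample size, and seed are illustrative choices.)
import random
import statistics

random.seed(3)
lam = 0.5  # decay rate

waits = [random.expovariate(lam) for _ in range(200_000)]

mean = statistics.fmean(waits)      # expect ≈ 1/lam = 2
var = statistics.pvariance(waits)   # expect ≈ 1/lam^2 = 4

print(f"mean ≈ {mean:.2f}, variance ≈ {var:.2f}")
```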

**Cite as:** Continuous Random Variables - Definition. *Brilliant.org*. Retrieved from https://brilliant.org/wiki/continuous-random-variables-definition/