Random Variable

A random variable can be thought of as a function that assigns a real number to each outcome in a sample space.

Consider a coin toss. The sample space consists of two possible outcomes: $S = \{H, T\}$. Let $X$ be the number of Heads. This is a random variable with possible values $0$ and $1$, i.e. $X(H) = 1$ and $X(T) = 0$.

The probability of an outcome is described in terms of the probability of a random variable taking a given value. For example, for a fair coin, $P(X = 1) = P(\{H\}) = \frac{1}{2}$.
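As a quick illustration, here is a minimal Python sketch of this coin-toss example, treating the random variable as an ordinary function on the sample space (the names `sample_space` and `X` are only illustrative):

```python
from fractions import Fraction

sample_space = ["H", "T"]   # S = {H, T}

def X(outcome):
    """Number of Heads: X(H) = 1, X(T) = 0."""
    return 1 if outcome == "H" else 0

# For a fair coin, P(X = 1) = (number of outcomes with X = 1) / |S|.
p = Fraction(sum(1 for s in sample_space if X(s) == 1), len(sample_space))
print(p)   # 1/2
```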

Random variables can be either discrete or continuous.

Probability Distribution

A probability distribution is a function that gives the likelihood of the different possible values of a random variable.

Probability Mass Function

The probability mass function (PMF) of a discrete random variable $X$ is the function $p_X$ given by $p_X(x) = P(X = x)$. That is, it is a function that gives the probability that the random variable $X$ takes some value $x$. A PMF must satisfy the following criteria:

  1. $p_X(x) \geq 0$ for all $x$.
  2. $\sum_x p_X(x) = 1$.
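As a small sketch (using a fair six-sided die, an illustrative choice not taken from the text), both criteria can be checked directly in Python:

```python
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}   # p_X(x) = 1/6 for x = 1, ..., 6

assert all(p >= 0 for p in p_X.values())   # criterion 1: p_X(x) >= 0 for all x
assert sum(p_X.values()) == 1              # criterion 2: the PMF sums to 1
print(p_X[3])                              # P(X = 3) = 1/6
```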

Cumulative Distribution Function

The distribution of a random variable can also be defined using the cumulative distribution function (CDF). The CDF is defined for all random variables: discrete as well as continuous.

The cumulative distribution function (CDF) of a random variable $X$ is the function $F_X$ given by $F_X(x) = P(X \leq x)$, with the following properties:

  1. Increasing: If $x_1 \leq x_2$, then $F_X(x_1) \leq F_X(x_2)$.
  2. Right-continuous: Except for possibly having jumps, the CDF is continuous. Wherever there is a jump, the CDF is continuous from the right. That is, $F_X(a) = \lim_{x \to a^{+}} F_X(x)$ for all $a$.
  3. Convergence to 0 and 1 in the limits: $\lim_{x \to -\infty} F_X(x) = 0$ and $\lim_{x \to +\infty} F_X(x) = 1$.
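Continuing the fair-die sketch from above (an illustrative choice), the CDF can be built by summing the PMF, and the first and third properties checked numerically:

```python
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}   # PMF of a fair die

def F_X(x):
    """CDF: F_X(x) = P(X <= x), the PMF summed over all values <= x."""
    return sum(p for v, p in p_X.items() if v <= x)

grid = [-1, 0, 1, 1.5, 2, 3, 6, 10]
values = [F_X(x) for x in grid]
assert all(a <= b for a, b in zip(values, values[1:]))   # non-decreasing in x
print(F_X(-1), F_X(3), F_X(10))                          # 0, 1/2, 1
```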

Probability Density Function

The probability density function (PDF) of a continuous random variable $X$ with CDF $F_X$ is the derivative of the CDF, given by $f_X(x) = F_X'(x)$, and satisfies the following criteria:

  1. $f_X(x) \geq 0$ for all $x$.
  2. $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$.
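A brief numerical sketch, using the Exponential(1) distribution as an illustrative example (and assuming SciPy is available for the integral): the PDF is the derivative of the CDF, and it integrates to one.

```python
import math
from scipy.integrate import quad

def F_X(x):
    """CDF of Exponential(1): F_X(x) = 1 - e^(-x) for x >= 0."""
    return 1 - math.exp(-x) if x >= 0 else 0.0

def f_X(x):
    """PDF of Exponential(1): f_X(x) = e^(-x) for x >= 0."""
    return math.exp(-x) if x >= 0 else 0.0

h = 1e-6
central_diff = (F_X(2 + h) - F_X(2 - h)) / (2 * h)   # numerical estimate of F_X'(2)
assert abs(central_diff - f_X(2)) < 1e-5             # f_X is the derivative of F_X

total, _ = quad(f_X, 0, math.inf)                    # integral of f_X over its support
print(round(total, 6))                               # 1.0
```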

Function of a Random Variable

A function of a random variable is also a random variable.

Consider an experiment with a sample space $S$, a random variable $X$, and a function $g : \mathbb{R} \to \mathbb{R}$. Then, $g(X)$ is a random variable that maps $s$ to $g(X(s))$ for all $s \in S$.

Let $X$ be a discrete random variable and $Y = g(X)$; then the PMF of $Y$ is

$$P(Y = y) = \sum_{x \,:\, g(x) = y} P(X = x).$$
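For example (an illustrative choice, not from the text), take $X$ uniform on $\{-1, 0, 1\}$ and $g(x) = x^2$; the PMF of $Y = g(X)$ is obtained by summing $p_X(x)$ over all $x$ with $g(x) = y$:

```python
from collections import defaultdict
from fractions import Fraction

p_X = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}
g = lambda x: x * x

p_Y = defaultdict(Fraction)          # Fraction() == 0, so missing keys start at 0
for x, p in p_X.items():
    p_Y[g(x)] += p                   # P(Y = y) = sum over {x : g(x) = y} of P(X = x)

print(dict(p_Y))                     # {1: Fraction(2, 3), 0: Fraction(1, 3)}
```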

Let $X$ be a continuous random variable and $Y = g(X)$ be a function of $X$, where $g$ is differentiable and strictly increasing. Then, the CDF of $Y$ is given by

$$F_Y(y) = P(Y \leq y) = P(g(X) \leq y) = P(X \leq g^{-1}(y)) = F_X(g^{-1}(y)),$$

where $F_X$ is the CDF of $X$.

Now, the PDF of $Y$ is given by

$$f_Y(y) = \frac{d}{dy} F_X(g^{-1}(y)) = f_X(g^{-1}(y))\, \frac{d}{dy} g^{-1}(y).$$
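A simulation sketch of the strictly increasing case assumed above, with the illustrative choice $X \sim \text{Exponential}(1)$ and $g(x) = 2x$, so $g^{-1}(y) = y/2$ and $f_Y(y) = f_X(y/2) \cdot \tfrac{1}{2}$:

```python
import math, random

random.seed(0)
f_X = lambda x: math.exp(-x)                 # PDF of Exponential(1)
g_inv = lambda y: y / 2                      # inverse of g(x) = 2x
f_Y = lambda y: f_X(g_inv(y)) * 0.5          # change-of-variables formula

samples = [2 * random.expovariate(1.0) for _ in range(200_000)]   # draws of Y = 2X

# Empirical density of Y on the interval [1, 1.2) vs. the formula's prediction.
empirical = sum(1 for y in samples if 1.0 <= y < 1.2) / (0.2 * len(samples))
print(round(empirical, 2), round(f_Y(1.1), 2))   # both approximately 0.29
```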

Expectation

The distribution of a random variable contains all the information about the probability that the random variable falls into any particular set. However, it is often useful to have a single number that summarizes the average value of the random variable.

The average (or mean) of a random variable is known as its expected value.

The expected value of a discrete random variable $X$ with distinct possible values $x_1, x_2, \ldots$ and PMF $p_X$ is defined as

$$E[X] = \sum_{j=1}^{\infty} x_j\, p_X(x_j).$$

If the possible values are finite, say $x_1, x_2, \ldots, x_n$, then it is defined as

$$E[X] = \sum_{j=1}^{n} x_j\, p_X(x_j).$$
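For instance (a fair six-sided die, chosen only for illustration), the finite sum can be computed directly:

```python
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}   # PMF of a fair die
E_X = sum(x * p for x, p in p_X.items())         # E[X] = sum_j x_j p_X(x_j)
print(E_X)                                       # 7/2
```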

The expected value of a continuous random variable $X$ with PDF $f_X$ is defined as

$$E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx.$$

Thus, the expectation of a random variable is the weighted average of its possible values, weighted by their probability of occurrence.
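As a continuous counterpart (again an illustrative Exponential(1) example, assuming SciPy for the numerical integral):

```python
import math
from scipy.integrate import quad

f_X = lambda x: math.exp(-x)                       # PDF of Exponential(1) on [0, inf)
E_X, _ = quad(lambda x: x * f_X(x), 0, math.inf)   # E[X] = integral of x f_X(x) dx
print(round(E_X, 6))                               # 1.0
```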

Indicator Random Variable

For an event $A$, the indicator random variable of $A$, written $I_A$ or $I(A)$, is $1$ if $A$ occurs, and $0$ otherwise:

$$I_A = \begin{cases} 1 & \text{if } A \text{ occurs}, \\ 0 & \text{otherwise}. \end{cases}$$

The probability of an event $A$ can be expressed as the expectation of the indicator random variable of $A$:

$$P(A) = E[I_A].$$
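A small sketch with an illustrative event: for a fair die, let $A$ be the event that the roll is even; then $E[I_A]$ and $P(A)$ coincide.

```python
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}       # PMF of a fair die
I_A = lambda x: 1 if x % 2 == 0 else 0               # indicator of A = {roll is even}

E_I_A = sum(I_A(x) * p for x, p in p_X.items())      # E[I_A]
P_A = sum(p for x, p in p_X.items() if x % 2 == 0)   # P(A)
assert E_I_A == P_A == Fraction(1, 2)
print(E_I_A)                                         # 1/2
```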

Law of the Unconscious Statistician

The expected value of a function $g$ of a random variable $X$, $E[g(X)]$, can be determined using the Law of the Unconscious Statistician (LOTUS).

For Discrete Random Variable

Let $X$ be a discrete random variable with probability mass function $p_X$; then the expected value of $g(X)$ is

$$E[g(X)] = \sum_{x} g(x)\, p_X(x).$$

Proof

Let $Y = g(X)$ and let $p_Y$ be its PMF. Then,

$$E[Y] = \sum_{y} y\, p_Y(y) = \sum_{y} y \sum_{x \,:\, g(x) = y} p_X(x) = \sum_{y} \sum_{x \,:\, g(x) = y} g(x)\, p_X(x) = \sum_{x} g(x)\, p_X(x),$$

where the last equality holds because grouping the terms of $\sum_x g(x)\, p_X(x)$ by the value of $g(x)$ recovers the double sum.
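The identity can also be checked numerically on an illustrative example: a fair die with $g(x) = x^2$, comparing the LOTUS sum with $E[Y]$ computed from the PMF of $Y = g(X)$.

```python
from collections import defaultdict
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}     # PMF of a fair die
g = lambda x: x * x

lotus = sum(g(x) * p for x, p in p_X.items())      # sum_x g(x) p_X(x)

p_Y = defaultdict(Fraction)
for x, p in p_X.items():
    p_Y[g(x)] += p                                 # PMF of Y = g(X)
direct = sum(y * p for y, p in p_Y.items())        # E[Y] = sum_y y p_Y(y)

assert lotus == direct == Fraction(91, 6)
print(lotus)                                       # 91/6
```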

For Continuous Random Variable

Let $X$ be a continuous random variable with probability density function $f_X$; then the expected value of $g(X)$ is

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx.$$

Proof

Let $Y = g(X)$ with $g$ differentiable and strictly increasing, and let $f_Y$ be the probability density function of $Y$, so that $f_Y(y) = f_X(g^{-1}(y))\, \frac{d}{dy} g^{-1}(y)$. Then,

$$E[Y] = \int_{-\infty}^{\infty} y\, f_Y(y)\, dy = \int_{-\infty}^{\infty} y\, f_X(g^{-1}(y))\, \frac{d}{dy} g^{-1}(y)\, dy = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx,$$

where the last step substitutes $x = g^{-1}(y)$, so that $y = g(x)$ and $dx = \frac{d}{dy} g^{-1}(y)\, dy$.
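Similarly, the continuous version can be sanity-checked on an illustrative case, $X \sim \text{Exponential}(1)$ and $g(x) = x^2$, by comparing the LOTUS integral (via SciPy) with a Monte Carlo estimate of $E[g(X)]$:

```python
import math, random
from scipy.integrate import quad

random.seed(0)
f_X = lambda x: math.exp(-x)                            # PDF of Exponential(1)
g = lambda x: x * x

lotus, _ = quad(lambda x: g(x) * f_X(x), 0, math.inf)   # integral of g(x) f_X(x) dx
mc = sum(g(random.expovariate(1.0)) for _ in range(200_000)) / 200_000

print(round(lotus, 3), round(mc, 2))                    # both approximately 2.0
```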

Conditional Expectation

Law of Total Expectation

