Deriving the Expectation of a Continuous Random Variable Using Cumulative Distribution Function

In this article, we will walk through the process of showing that for a continuous random variable $X$ with cumulative distribution function (CDF) $F$ and finite expectation ($E|X| < \infty$), the expectation of $X$ can be represented as:

\[ E[X] = \int_{0}^{\infty} (1 - F(x))\,dx - \int_{-\infty}^{0} F(x)\,dx \]

This involves breaking the proof into several steps, applying integration by parts, and understanding the properties of the CDF. Let's dive into the details.

Step 1: Expectation of a Continuous Random Variable

For a continuous random variable $X$ with probability density function (PDF) $f$, the expectation $E[X]$ is given by:

\[ E[X] = \int_{-\infty}^{\infty} x f(x)\,dx \]
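As a quick sanity check (not part of the proof), this definition can be approximated numerically. The sketch below uses the exponential distribution with an illustrative rate `lam = 2.0`, whose expectation is known to be $1/\lambda$; the midpoint Riemann sum and the cutoff at $x = 50$ are assumptions made for the example only:

```python
import math

def expectation_from_pdf(f, lo, hi, n=100_000):
    """Approximate E[X] = integral of x * f(x) dx on [lo, hi]
    with a midpoint Riemann sum."""
    h = (hi - lo) / n
    return sum((lo + (i + 0.5) * h) * f(lo + (i + 0.5) * h)
               for i in range(n)) * h

# Exponential distribution with rate lam: f(x) = lam * exp(-lam * x).
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)

# The tail beyond x = 50 is negligible for this rate.
approx = expectation_from_pdf(f, 0.0, 50.0)
print(approx)  # close to 1 / lam = 0.5
```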

Step 2: Splitting the Integral

We can split this integral into two parts, one for positive values and one for negative values:

\[ E[X] = \int_{-\infty}^{0} x f(x)\,dx + \int_{0}^{\infty} x f(x)\,dx \]

Step 3: Expressing the Integrals in Terms of the CDF

Let's start with the positive part:

Integration for Positive Values ($x \geq 0$)

For $x \geq 0$, we use integration by parts. Any antiderivative of $f$ can serve as $v$, so we choose the one that makes the boundary term vanish. Let:

$u = x$, which gives $du = dx$; $dv = f(x)\,dx$, for which we take $v = F(x) - 1$.

Then we have:

\[ \int_{0}^{\infty} x f(x)\,dx = \Big[ x\,(F(x) - 1) \Big]_{0}^{\infty} - \int_{0}^{\infty} (F(x) - 1)\,dx \]

Boundary Term at Infinity: Since $E|X| < \infty$, the tail bound $x\,(1 - F(x)) \leq \int_{x}^{\infty} t f(t)\,dt \to 0$ forces $x\,(F(x) - 1) \to 0$ as $x \to \infty$.

Boundary Term at $x = 0$: At $x = 0$, the factor $x$ is zero, so this term is also $0$.

Thus, we have:

\[ \int_{0}^{\infty} x f(x)\,dx = \int_{0}^{\infty} (1 - F(x))\,dx \]
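The identity for the positive part can be checked numerically. This is a minimal sketch, again assuming an exponential distribution (here with an illustrative rate `lam = 1.5`), for which both $f$ and $F$ have closed forms; the truncation at $x = 60$ is an example choice where the tail is negligible:

```python
import math

# Exp(lam): f(x) = lam * exp(-lam * x), F(x) = 1 - exp(-lam * x).
lam = 1.5
f = lambda x: lam * math.exp(-lam * x)
F = lambda x: 1.0 - math.exp(-lam * x)

def midpoint(g, lo, hi, n=100_000):
    """Midpoint Riemann sum of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

lhs = midpoint(lambda x: x * f(x), 0.0, 60.0)   # integral of x f(x)
rhs = midpoint(lambda x: 1.0 - F(x), 0.0, 60.0)  # integral of 1 - F(x)
print(lhs, rhs)  # both close to E[X] = 1 / lam = 0.6667
```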

Integration for Negative Values ($x < 0$)

Now consider the negative part:

\[ \int_{-\infty}^{0} x f(x)\,dx \]

Using integration by parts again, let:

$u = x$, which gives $du = dx$; $dv = f(x)\,dx$, for which we take $v = F(x)$ this time.

Then we have:

\[ \int_{-\infty}^{0} x f(x)\,dx = \Big[ x F(x) \Big]_{-\infty}^{0} - \int_{-\infty}^{0} F(x)\,dx \]

Boundary Term at $x = 0$: At $x = 0$, the factor $x$ is zero, so this term is $0$.

Boundary Term at $-\infty$: Since $E|X| < \infty$, the tail bound $|x|\, F(x) \leq \int_{-\infty}^{x} |t| f(t)\,dt \to 0$ forces $x F(x) \to 0$ as $x \to -\infty$.

Thus, we have:

\[ \int_{-\infty}^{0} x f(x)\,dx = - \int_{-\infty}^{0} F(x)\,dx \]
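The negative-part identity can be verified numerically as well. A minimal sketch, assuming the standard Laplace distribution as an example of a distribution with mass on the negative axis: there $f(x) = \tfrac{1}{2} e^{-|x|}$ and, for $x \leq 0$, $F(x) = \tfrac{1}{2} e^{x}$, so both integrals should equal $-\tfrac{1}{2}$. The truncation at $x = -60$ is an example choice:

```python
import math

# Standard Laplace: f(x) = 0.5 * exp(-|x|); for x <= 0, F(x) = 0.5 * exp(x).
f = lambda x: 0.5 * math.exp(-abs(x))
F_neg = lambda x: 0.5 * math.exp(x)  # CDF restricted to x <= 0

def midpoint(g, lo, hi, n=100_000):
    """Midpoint Riemann sum of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

lhs = midpoint(lambda x: x * f(x), -60.0, 0.0)  # integral of x f(x)
rhs = -midpoint(F_neg, -60.0, 0.0)              # minus integral of F(x)
print(lhs, rhs)  # both close to -0.5
```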

Step 4: Combining the Results

Now substituting both results back into the split expectation:

\[ E[X] = - \int_{-\infty}^{0} F(x)\,dx + \int_{0}^{\infty} (1 - F(x))\,dx \]

Reordering the two terms gives the desired identity.

Thus, we conclude:

\[ E[X] = \int_{0}^{\infty} (1 - F(x))\,dx - \int_{-\infty}^{0} F(x)\,dx \]
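Finally, the full identity can be checked end to end using only the CDF. This sketch assumes a normal distribution $N(\mu, \sigma^2)$ with illustrative parameters $\mu = 1.3$, $\sigma = 0.7$, whose CDF is expressible via the error function; the result should recover $E[X] = \mu$ without ever touching the PDF. The cutoffs at $\pm 30$ are example choices where both tails are negligible:

```python
import math

# N(mu, sigma^2) CDF via the error function.
mu, sigma = 1.3, 0.7
F = lambda x: 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def midpoint(g, lo, hi, n=100_000):
    """Midpoint Riemann sum of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

upper = midpoint(lambda x: 1.0 - F(x), 0.0, 30.0)  # integral of 1 - F
lower = midpoint(F, -30.0, 0.0)                    # integral of F
print(upper - lower)  # close to E[X] = mu = 1.3
```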

This derivation highlights the close relationship between the expectation of a random variable and its cumulative distribution function, a fundamental result in probability theory and statistics.