How to Find the Expected Value of a Continuous Random Variable

Expected Value of a Continuous Random Variable

Since the density function $f(x)$ is nonnegative, the integral formula for the expectation is really the difference of two integrals with nonnegative integrands (and hence nonnegative value): $$E[X] = \int_{-\infty}^{\infty} xf(x)\mathrm dx = \int_0^{\infty} xf(x)\mathrm dx - \int_{-\infty}^0 \vert x\vert f(x)\mathrm dx. $$ When both integrals are finite, their difference is finite too. If one of the integrals diverges but the other is finite, then some people say $E[X]$ exists but is unbounded while others deny the existence of $E[X]$ and say that $E[X]$ is undefined. (Perhaps this is why many theorems in probability avoid ambiguity by restricting themselves to random variables with finite means instead of random variables whose means exist.) If both integrals diverge, then the integral formula for $E[X]$ gives a result of the form $\infty - \infty$ and everybody agrees that $E[X]$ is undefined.
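The split above can be checked numerically. Below is a minimal sketch (assuming SciPy's `quad` is available) using a standard normal density, chosen purely for illustration: both pieces are finite, so their difference gives $E[X]$.

```python
from math import exp, pi, sqrt
from scipy.integrate import quad

def f(x):
    """Standard normal density (chosen here just for illustration)."""
    return exp(-x * x / 2) / sqrt(2 * pi)

# E[X] = (integral of x f(x) over [0, inf)) - (integral of |x| f(x) over (-inf, 0])
pos_part, _ = quad(lambda x: x * f(x), 0, float("inf"))
neg_part, _ = quad(lambda x: abs(x) * f(x), float("-inf"), 0)

print(pos_part)             # each piece is finite (1/sqrt(2*pi))
print(pos_part - neg_part)  # their difference is E[X] = 0 by symmetry
```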

In summary, if $\int \vert x \vert f(x)\, dx$ is finite, then $\int x f(x)\, dx$ is also finite, and the value of the latter integral is called the expectation (or expected value, or mean) of the random variable $X$ and is denoted $E[X]$; that is, $$E[X] = \int_{-\infty}^{\infty} x f(x)\, dx.$$
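As a quick sanity check of this definition (assuming SciPy, and with an illustrative rate parameter), an Exponential density $\lambda e^{-\lambda x}$ on $x \geq 0$ has $\int |x| f(x)\, dx$ finite, so its expectation is well defined and equals $1/\lambda$.

```python
from math import exp
from scipy.integrate import quad

lam = 2.0  # illustrative rate parameter for the demo

def f(x):
    """Exponential(lam) density: lam * exp(-lam * x) for x >= 0."""
    return lam * exp(-lam * x) if x >= 0 else 0.0

abs_moment, _ = quad(lambda x: abs(x) * f(x), 0, float("inf"))
mean, _ = quad(lambda x: x * f(x), 0, float("inf"))

print(abs_moment)  # finite, so E[X] is well defined
print(mean)        # equals 1/lam
```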

Added Note: To my mind, the difference between saying that "$E[X] = \int xf(x) dx$ if the integral is finite" (as Sami wants to) and "$E[X] = \int xf(x) dx$ if $\int |x|f(x)\mathrm dx$ is finite" is that the second statement reminds the casual reader to check something instead of jumping to unwarranted conclusions. Many students have mistakenly calculated that a Cauchy random variable with density $[\pi(1+x^2)]^{-1}$ has expected value $0$ on the grounds that the integrand $x\cdot[\pi(1+x^2)]^{-1}$ in the integral for $E[X]$ is an odd function, and the integral is over an interval symmetric about the origin. But they would have discovered the error of their ways if they had carefully checked whether $$\int_{-\infty}^{\infty} \vert x \vert \frac{1}{\pi(1+x^2)} dx = 2 \int_0^{\infty} x\frac{1}{\pi(1+x^2)} dx $$ is finite.
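The Cauchy pitfall is easy to see numerically (again assuming SciPy's `quad`): the truncated symmetric integral of $x f(x)$ is always $0$ by oddness, while the truncated integral of $|x| f(x)$ grows without bound, matching the closed form $\log(1+T^2)/\pi$.

```python
from math import log, pi
from scipy.integrate import quad

def cauchy(x):
    """Standard Cauchy density 1 / (pi * (1 + x^2))."""
    return 1.0 / (pi * (1.0 + x * x))

for T in (10.0, 100.0, 1000.0):
    symmetric, _ = quad(lambda x: x * cauchy(x), -T, T)      # 0 by oddness at every cutoff
    absolute, _ = quad(lambda x: abs(x) * cauchy(x), -T, T)  # grows like log(T): no finite mean
    print(T, symmetric, absolute, log(1 + T * T) / pi)       # last column: the closed form
```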

Comments

  • I've been reviewing my probability and statistics book and just got up to continuous distributions. The book defines the expected value of a continuous random variable as:

    $E[H(X)] = \int_{-\infty}^{\infty} H(x)f(x)~dx$

    provided that

    $\int_{-\infty}^{\infty} |H(x)|f(x)~dx$

    is finite.

    While I understand the integral used to calculate the expected value, I'm failing to see why the 'provided' condition takes the absolute value of $H(X)$. Why isn't it enough to just define it as:

    $E[H(X)] = \int_{-\infty}^{\infty} H(x)f(x)~dx$

    provided that it is finite.

    • It's rather a technical thing. Serious probability theory uses the Lebesgue integral (only with it do things become consistent and nice), and the Lebesgue integral requires that the integrand be absolutely integrable. You'll rarely, if ever, find this restriction relevant in practice.

    • @leon, this is not (only) technical and should be taken care of even with integration techniques other than Lebesgue's, as the example of Cauchy random variables (mentioned by @Dilip) shows.

    • @Didier: I said 'technical', not meaning 'merely technical', but rather pointing to its ultimate justification; it's certainly important. See my comment to Dilip.

  • And is the reason for the integrand needing to be nonnegative that it is a requirement of the Lebesgue integral (as leonbloy commented)?

  • @Sami leonbloy said that the integrand must be absolutely integrable, that is, in order for $g(x)$ to be Lebesgue-integrable, $\vert g(x)\vert$ must be Lebesgue-integrable. It is not necessary that $g(x)$ be nonnegative in order for $g(x)$ to be Lebesgue-integrable. I did not mention Lebesgue integral in my answer, though I did split up the integral for $E[X]$ into the difference of two integrals each of which had a nonnegative integrand.

  • The Cauchy example is very relevant. But I don't think I fully agree with the explanation, especially the second paragraph, in the context of the original question. Let $X$ be a standard normal and take $H(x) = \sin(x) \exp(\frac{x^2}{2})/x$. Then $E(H(X))$ would be finite if we used the improper Riemann integral.

  • So, it's not completely true that $E(|H(X)|)$ being finite is a necessary requirement for $E(H(X)) = \int H(x) f(x)\,dx$ to converge (it's only necessary when we choose to use the Lebesgue integral).

  • @leonbloy I changed the second paragraph to say that if $\int |x|f(x) dx$ is finite, then $E[X] = \int xf(x) dx$. I do not understand what point is being made by the statement that "it is not completely true that $E(|H(X)|)$ finite is a necessary requirement to get $\int H(x)f(x)dx$ to converge (it's only necessary when we choose to use the Lebesgue integral)." In a comment on the original question you said that "serious theory of probability uses the Lebesgue integral." Are you implying that for beginning students $E[X^{-1}\sin(X)\exp(X^2/2)]$ exists for $X$ a standard normal random variable because the improper Riemann integral converges to a finite value, but that for more advanced students it does not?
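leonbloy's counterexample can also be sketched numerically (assuming SciPy's `quad`; the `limit` value below is just a generous subdivision budget for the oscillatory integrand). The product $H(x)f(x)$ collapses to $\sin(x)/(x\sqrt{2\pi})$, whose improper Riemann integral converges to $\sqrt{\pi/2}$ via $\int_{-\infty}^{\infty} \frac{\sin x}{x}\,dx = \pi$, while the integral of $|H(x)|f(x)$ diverges.

```python
from math import pi, sin, sqrt
from scipy.integrate import quad

def integrand(x):
    """H(x)*f(x) simplifies to sin(x)/x / sqrt(2*pi); the limit at x = 0 is 1/sqrt(2*pi)."""
    return (sin(x) / x if x != 0.0 else 1.0) / sqrt(2 * pi)

signed_vals, abs_vals = [], []
for T in (10.0, 100.0, 1000.0):
    s, _ = quad(integrand, -T, T, limit=5000)
    a, _ = quad(lambda x: abs(integrand(x)), -T, T, limit=5000)
    signed_vals.append(s)
    abs_vals.append(a)
    print(T, s, a)  # s settles near sqrt(pi/2); a keeps growing (roughly like log T)
```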


Source: https://9to5science.com/expected-value-of-a-continuous-random-variable
