For Independent Continuous Random Variables X and Y, Show E[XY] = E[X]·E[Y]

Two random variables are independent if they convey no information about each other and, as a consequence, receiving information about one of the two does not change our assessment of the probability distribution of the other.

This lecture provides a formal definition of independence and discusses how to verify whether two or more random variables are independent.

Table of contents

  1. Definition

  2. Independence criterion

  3. Independence between discrete random variables

  4. Independence between continuous random variables

  5. More details

    1. Mutually independent random variables

    2. Mutual independence via expectations

    3. Independence and zero covariance

    4. Independent random vectors

    5. Mutually independent random vectors

  6. Solved exercises

    1. Exercise 1

    2. Exercise 2

    3. Exercise 3

Recall (see the lecture entitled Independent events) that two events $A$ and $B$ are independent if and only if $$\Pr(A \cap B) = \Pr(A) \Pr(B)$$

This definition is extended to random variables as follows.

Definition Two random variables X and Y are said to be independent if and only if $$\Pr(X \in A, Y \in B) = \Pr(X \in A) \Pr(Y \in B)$$ for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}$ and $B \subseteq \mathbb{R}$.

In other words, two random variables are independent if and only if the events related to those random variables are independent events.

The independence between two random variables is also called statistical independence.
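The definition can be illustrated with a quick simulation (a sketch that is not part of the original lecture): for two independent Uniform(0, 1) draws, the empirical joint probability of a pair of events should be close to the product of the empirical marginal probabilities. The events A = [0, 0.5] and B = [0.3, 0.9] are arbitrary choices.

```python
import random

# Monte Carlo check of P(X in A, Y in B) = P(X in A) * P(Y in B)
# for two independent Uniform(0, 1) random variables.
# The events A = [0, 0.5] and B = [0.3, 0.9] are arbitrary choices.
random.seed(0)
n = 200_000
in_A = in_B = in_both = 0
for _ in range(n):
    x, y = random.random(), random.random()  # independent draws
    a = x <= 0.5
    b = 0.3 <= y <= 0.9
    in_A += a
    in_B += b
    in_both += a and b

p_A, p_B, p_AB = in_A / n, in_B / n, in_both / n
print(p_AB, p_A * p_B)  # the two numbers should be close
assert abs(p_AB - p_A * p_B) < 0.01
```

With dependent variables (for example, Y = X) the two numbers would differ markedly, which is exactly what the definition rules out.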

Checking the independence of all possible couples of events related to two random variables can be very difficult. This is the reason why the above definition is seldom used to verify whether two random variables are independent. The following criterion is more often used instead.

Proposition (Independence criterion) Two random variables X and Y, with joint distribution function $F_{XY}(x,y)$ and marginal distribution functions $F_X(x)$ and $F_Y(y)$, are independent if and only if $$F_{XY}(x,y) = F_X(x) F_Y(y)$$ for any couple $(x,y)$.

Example Let X and Y be two random variables with marginal distribution functions $$F_X(x) = \begin{cases} 1 - e^{-x} & \text{if } x \geq 0 \\ 0 & \text{otherwise} \end{cases} \qquad F_Y(y) = \begin{cases} 1 - e^{-y} & \text{if } y \geq 0 \\ 0 & \text{otherwise} \end{cases}$$ and joint distribution function $$F_{XY}(x,y) = \begin{cases} (1 - e^{-x})(1 - e^{-y}) & \text{if } x \geq 0 \text{ and } y \geq 0 \\ 0 & \text{otherwise} \end{cases}$$ X and Y are independent if and only if $F_{XY}(x,y) = F_X(x) F_Y(y)$, which is straightforward to verify. When $x < 0$ or $y < 0$, then $F_{XY}(x,y) = 0 = F_X(x) F_Y(y)$. When $x \geq 0$ and $y \geq 0$, then $$F_{XY}(x,y) = (1 - e^{-x})(1 - e^{-y}) = F_X(x) F_Y(y)$$

When the two variables, taken together, form a discrete random vector, independence can also be verified using the following proposition.

Proposition Two discrete random variables X and Y, forming a discrete random vector, are independent if and only if $$p_{XY}(x,y) = p_X(x) p_Y(y)$$ for any couple $(x,y)$, where $p_{XY}$ is their joint probability mass function and $p_X$ and $p_Y$ are their marginal probability mass functions.

The following example illustrates how this criterion can be used.

Example Let [eq22] be a discrete random vector with support [eq23] Let its joint probability mass function be [eq24] In order to verify whether X and Y are independent, we first need to derive the marginal probability mass functions of X and Y . The support of X is [eq25] and the support of Y is [eq26] We need to compute the probability of each element of the support of X : [eq27] Thus, the probability mass function of X is [eq28] We need to compute the probability of each element of the support of Y : [eq29] Thus, the probability mass function of Y is [eq30] The product of the marginal probability mass functions is [eq31] which is obviously different from [eq32] . Therefore, X and Y are not independent.
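The same factorization check can be automated. The sketch below (not part of the lecture) uses a hypothetical joint probability mass function on the support {(0,0), (0,1), (1,1)}, since the table from the example above is not reproduced here; the marginals are obtained by summing the joint pmf, and independence holds only if their product matches the joint pmf everywhere.

```python
from itertools import product

# Independence check for a discrete random vector via pmf factorization.
# The joint pmf below is a hypothetical stand-in (the original table is
# not reproduced here): support {(0,0), (0,1), (1,1)} with equal mass.
joint = {(0, 0): 1/3, (0, 1): 1/3, (1, 1): 1/3}

# Marginal pmfs obtained by summing the joint pmf over the other variable.
p_x, p_y = {}, {}
for (x, y), prob in joint.items():
    p_x[x] = p_x.get(x, 0) + prob
    p_y[y] = p_y.get(y, 0) + prob

# X and Y are independent iff joint(x, y) == p_x(x) * p_y(y) everywhere.
independent = all(
    abs(joint.get((x, y), 0) - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(p_x, p_y)
)
print(independent)  # False: e.g. joint(1, 0) = 0 but p_x(1) * p_y(0) = 1/9
```

Note that a single mismatched couple, such as a point with zero joint probability but nonzero marginal probabilities, is enough to rule out independence.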

When the two variables, taken together, form a continuous random vector, independence can also be verified by means of the following proposition.

Proposition Two continuous random variables X and Y, forming a continuous random vector, are independent if and only if $$f_{XY}(x,y) = f_X(x) f_Y(y)$$ for any couple $(x,y)$, where $f_{XY}$ is their joint probability density function and $f_X$ and $f_Y$ are their marginal probability density functions.

The following example illustrates how this criterion can be used.

Example Let the joint probability density function of X and Y be [eq37] Its marginals are [eq38] and [eq39] Verifying that [eq40] is straightforward. When [eq41] or [eq42] , then [eq43] . When [eq44] and [eq45] , then [eq46]
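For densities, the criterion can be checked pointwise. The sketch below (not part of the lecture) uses the hypothetical joint density $f(x,y) = e^{-(x+y)}$ on the positive quadrant, an assumption since the density in the example is not reproduced here; its marginals are standard exponential densities.

```python
import math

# Pointwise check that a joint pdf factors into its marginals, using the
# hypothetical density f(x, y) = exp(-(x + y)) for x, y >= 0 (zero
# otherwise), whose marginals are standard exponential densities.
def f_joint(x, y):
    return math.exp(-(x + y)) if x >= 0 and y >= 0 else 0.0

def f_x(x):
    return math.exp(-x) if x >= 0 else 0.0

f_y = f_x  # the two marginals are identical here

grid = [-1.0, 0.0, 0.5, 1.0, 2.5]
factors = all(
    abs(f_joint(x, y) - f_x(x) * f_y(y)) < 1e-12
    for x in grid for y in grid
)
print(factors)  # True: the joint density equals the product of marginals
```

The grid includes negative points to exercise the zero branch of the density, mirroring the case split in the example above.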

The following subsections contain more details about statistical independence.

Mutually independent random variables

The definition of mutually independent random variables extends the definition of mutually independent events to random variables.

Definition We say that n random variables X_1 , ..., X_n are mutually independent (or jointly independent) if and only if $$\Pr(X_{i_1} \in A_1, \ldots, X_{i_k} \in A_k) = \prod_{j=1}^{k} \Pr(X_{i_j} \in A_j)$$ for any sub-collection of k random variables $X_{i_1}$ , ..., $X_{i_k}$ (where $k \leq n$) and for any collection of events $\{X_{i_1} \in A_1\}$ , ..., $\{X_{i_k} \in A_k\}$ , where $A_j \subseteq \mathbb{R}$.

In other words, n random variables are mutually independent if the events related to those random variables are mutually independent events.

Denote by X a random vector whose components are X_1 , ..., X_n . The above condition for mutual independence can be replaced:

  1. in general, by a condition on the joint distribution function of X: $$F_X(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdot \ldots \cdot F_{X_n}(x_n)$$

  2. for discrete random variables, by a condition on the joint probability mass function of X: $$p_X(x_1, \ldots, x_n) = p_{X_1}(x_1) \cdot \ldots \cdot p_{X_n}(x_n)$$

  3. for continuous random variables, by a condition on the joint probability density function of X: $$f_X(x_1, \ldots, x_n) = f_{X_1}(x_1) \cdot \ldots \cdot f_{X_n}(x_n)$$

Mutual independence via expectations

It can be proved that n random variables X_1 , ..., X_n are mutually independent if and only if $$\operatorname{E}[g_1(X_1) \cdot \ldots \cdot g_n(X_n)] = \operatorname{E}[g_1(X_1)] \cdot \ldots \cdot \operatorname{E}[g_n(X_n)]$$ for any n functions $g_1$ , ..., $g_n$ such that the above expected values exist and are well-defined.
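Taking n = 2 with $g_1$ and $g_2$ equal to the identity function yields the familiar factorization of the expectation of a product. For independent continuous random variables it can also be derived directly from the factorization of the joint density:

```latex
\operatorname{E}[XY]
  = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} xy\, f_{XY}(x,y)\, dx\, dy
  = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} xy\, f_X(x) f_Y(y)\, dx\, dy
  = \left( \int_{-\infty}^{\infty} x f_X(x)\, dx \right)
    \left( \int_{-\infty}^{\infty} y f_Y(y)\, dy \right)
  = \operatorname{E}[X]\, \operatorname{E}[Y]
```

where the second equality uses independence ($f_{XY} = f_X f_Y$) and the third separates the double integral into a product of single integrals.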

Independence and zero covariance

If two random variables X_1 and X_2 are independent, then their covariance is zero: $$\operatorname{Cov}[X_1, X_2] = 0$$

Proof. By the expectation criterion above (with $g_1$ and $g_2$ equal to the identity function), independence implies $\operatorname{E}[X_1 X_2] = \operatorname{E}[X_1] \operatorname{E}[X_2]$, so that $$\operatorname{Cov}[X_1, X_2] = \operatorname{E}[X_1 X_2] - \operatorname{E}[X_1] \operatorname{E}[X_2] = 0$$

The converse is not true: two random variables that have zero covariance are not necessarily independent.
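A standard counterexample (a sketch, not part of the lecture): let X be uniform on {-1, 0, 1} and Y = X². Then Cov[X, Y] = E[X³] - E[X] E[X²] = 0, yet Y is a deterministic function of X.

```python
from fractions import Fraction

# Counterexample: zero covariance without independence.
# X is uniform on {-1, 0, 1} and Y = X**2, so Y is determined by X.
support = [-1, 0, 1]
p = Fraction(1, 3)

e_x = sum(p * x for x in support)          # E[X]  = 0
e_y = sum(p * x**2 for x in support)       # E[Y]  = 2/3
e_xy = sum(p * x * x**2 for x in support)  # E[XY] = E[X^3] = 0
cov = e_xy - e_x * e_y
print(cov)  # 0: the covariance vanishes

# Yet the joint pmf does not factor: P(X = 1, Y = 0) = 0,
# while P(X = 1) * P(Y = 0) = (1/3) * (1/3) = 1/9.
joint_1_0 = Fraction(0)
print(joint_1_0 == p * p)  # False: zero covariance but not independent
```

Exact rational arithmetic (`fractions.Fraction`) is used so the covariance comes out as an exact zero rather than a floating-point approximation.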

Independent random vectors

The above notions are easily generalized to the case in which X and Y are two random vectors, having dimensions $K_X \times 1$ and $K_Y \times 1$ respectively. Denote their joint distribution functions by $F_X(x)$ and $F_Y(y)$ and the joint distribution function of X and Y together by $F_{XY}(x,y)$. The two random vectors are independent if and only if $$F_{XY}(x,y) = F_X(x) F_Y(y)$$ Also, if the two vectors are discrete or continuous, replace F with p or f to denote the corresponding probability mass or density functions.

Mutually independent random vectors

The definition of mutual independence also extends in a straightforward manner to random vectors.

Definition We say that n random vectors X_1 , ..., X_n are mutually independent (or jointly independent) if and only if $$\Pr(X_{i_1} \in A_1, \ldots, X_{i_k} \in A_k) = \prod_{j=1}^{k} \Pr(X_{i_j} \in A_j)$$ for any sub-collection of k random vectors $X_{i_1}$ , ..., $X_{i_k}$ (where $k \leq n$) and for any collection of events $\{X_{i_1} \in A_1\}$ , ..., $\{X_{i_k} \in A_k\}$.

All the equivalent conditions for the joint independence of a set of random variables (see above) apply with obvious modifications also to random vectors.

Below you can find some exercises with explained solutions.

Exercise 1

Consider two random variables X and Y having marginal distribution functions [eq73] If X and Y are independent, what is their joint distribution function?

Solution

For X and Y to be independent, their joint distribution function must be equal to the product of their marginal distribution functions: $$F_{XY}(x,y) = F_X(x) F_Y(y)$$

Exercise 2

Let [eq75] be a discrete random vector with support: [eq76] Let its joint probability mass function be [eq77] Are X and Y independent?

Solution

In order to verify whether X and Y are independent, we first need to derive the marginal probability mass functions of X and Y . The support of X is [eq78] and the support of Y is [eq79] We need to compute the probability of each element of the support of X : [eq80] Thus, the probability mass function of X is [eq81] We need to compute the probability of each element of the support of Y : [eq82] Thus, the probability mass function of Y is [eq83] The product of the marginal probability mass functions is [eq84] which is equal to [eq32] . Therefore, X and Y are independent.

Exercise 3

Let [eq86] be a continuous random vector with support [eq87] and its joint probability density function be [eq88] Are X and Y independent?

Solution

The support of Y is [eq89] When [eq90] , the marginal probability density function of Y is 0 , while, when [eq91] , the marginal probability density function of Y is [eq92] Thus, summing up, the marginal probability density function of Y is [eq93] The support of X is [eq94] When [eq95] , the marginal probability density function of X is 0 , while, when [eq96] , the marginal probability density function of X is [eq97] Thus, the marginal probability density function of X is [eq98] Verifying that [eq40] is straightforward. When [eq95] or [eq101] , then [eq43] . When [eq103] and [eq104] , then [eq105] Thus, X and Y are independent.
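The two steps of this exercise (integrating out one variable to obtain a marginal density, then checking the factorization) can be sketched numerically. The joint density f(x, y) = 4xy on the unit square is a hypothetical stand-in, since the exercise's density is not reproduced here; its marginals are f_X(x) = 2x and f_Y(y) = 2y, so the factorization holds.

```python
# Deriving marginal densities by numerical integration and checking the
# factorization criterion, using the hypothetical joint density
# f(x, y) = 4*x*y on the unit square (a stand-in for the exercise's
# density, which is not reproduced here).
def f_joint(x, y):
    return 4 * x * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal(f, value, axis, n=10_000):
    # Integrate out the other variable over its support [0, 1]
    # with the midpoint rule.
    h = 1.0 / n
    ts = [(i + 0.5) * h for i in range(n)]
    if axis == "x":  # marginal of X: integrate over y
        return h * sum(f(value, t) for t in ts)
    return h * sum(f(t, value) for t in ts)

x0, y0 = 0.4, 0.7
f_x = marginal(f_joint, x0, "x")  # ~ 2 * 0.4 = 0.8
f_y = marginal(f_joint, y0, "y")  # ~ 2 * 0.7 = 1.4
assert abs(f_x - 2 * x0) < 1e-6
assert abs(f_joint(x0, y0) - f_x * f_y) < 1e-6  # density factorizes
print(round(f_x, 4), round(f_y, 4))
```

In the exercise the same two steps are carried out analytically: integrate the joint density over one variable's support to obtain each marginal, then compare the product of the marginals with the joint density on the whole support.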

Please cite as:

Taboga, Marco (2021). "Independent random variables", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/independent-random-variables.


Source: https://www.statlect.com/fundamentals-of-probability/independent-random-variables
