Independent random variables
Two random variables are independent if they convey no information about each other and, as a consequence, receiving information about one of the two does not change our assessment of the probability distribution of the other.
This lecture provides a formal definition of independence and discusses how to verify whether two or more random variables are independent.
Table of contents

- Definition
- Independence criterion
- Independence between discrete random variables
- Independence between continuous random variables
- More details
  - Mutually independent random variables
  - Mutual independence via expectations
  - Independence and zero covariance
  - Independent random vectors
  - Mutually independent random vectors
- Solved exercises
  - Exercise 1
  - Exercise 2
  - Exercise 3
Definition

Recall (see the lecture entitled Independent events) that two events $A$ and $B$ are independent if and only if
$$P(A \cap B) = P(A) \, P(B).$$
This definition is extended to random variables as follows.
Definition Two random variables $X$ and $Y$ are said to be independent if and only if
$$P(\{X \in A\} \cap \{Y \in B\}) = P(X \in A) \, P(Y \in B)$$
for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}$ and $B \subseteq \mathbb{R}$.
In other words, two random variables are independent if and only if the events related to those random variables are independent events.
The independence between two random variables is also called statistical independence.
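The definition can be illustrated by simulation: for independently generated variables, the empirical frequency of a joint event should match the product of the individual frequencies. Below is a minimal Python sketch; the distributions of $X$ and $Y$ and the events $\{X > 0.5\}$ and $\{Y < 0.3\}$ are arbitrary illustrative choices, not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Two independently generated random variables.
x = rng.standard_normal(n)
y = rng.uniform(size=n)

# Arbitrary events {X in A} and {Y in B}, chosen only for illustration.
in_A = x > 0.5
in_B = y < 0.3

p_joint = np.mean(in_A & in_B)             # estimates P(X in A and Y in B)
p_product = np.mean(in_A) * np.mean(in_B)  # estimates P(X in A) * P(Y in B)
print(f"joint: {p_joint:.4f}, product: {p_product:.4f}")  # nearly equal
```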
Independence criterion

Checking the independence of all possible couples of events related to two random variables can be very difficult. For this reason, the above definition is seldom used to verify whether two random variables are independent. The following criterion is used instead.
Proposition Two random variables $X$ and $Y$ are independent if and only if
$$F_{XY}(x, y) = F_X(x) \, F_Y(y) \quad \text{for all } (x, y) \in \mathbb{R}^2,$$
where $F_{XY}(x,y)$ is their joint distribution function and $F_X(x)$ and $F_Y(y)$ are their marginal distribution functions.

Proof The "only if" part follows from the definition of independence by taking $A = (-\infty, x]$ and $B = (-\infty, y]$, which gives $F_{XY}(x,y) = P(X \leq x, Y \leq y) = P(X \leq x) \, P(Y \leq y) = F_X(x) \, F_Y(y)$. The "if" part follows from the fact that the joint distribution function completely characterizes the joint distribution of $X$ and $Y$: if it factors into the product of the marginals, then the probability of any couple of events $\{X \in A\}$ and $\{Y \in B\}$ factors as well.
Example Let $X$ and $Y$ be two random variables with marginal distribution functions
$$F_X(x) = \begin{cases} 1 - e^{-x} & \text{if } x \geq 0 \\ 0 & \text{if } x < 0 \end{cases} \qquad F_Y(y) = \begin{cases} 1 - e^{-y} & \text{if } y \geq 0 \\ 0 & \text{if } y < 0 \end{cases}$$
and joint distribution function
$$F_{XY}(x, y) = \begin{cases} (1 - e^{-x})(1 - e^{-y}) & \text{if } x \geq 0 \text{ and } y \geq 0 \\ 0 & \text{otherwise.} \end{cases}$$
$X$ and $Y$ are independent if and only if $F_{XY}(x,y) = F_X(x) \, F_Y(y)$, which is straightforward to verify. When $x < 0$ or $y < 0$, then $F_{XY}(x,y) = 0 = F_X(x) \, F_Y(y)$. When $x \geq 0$ and $y \geq 0$, then
$$F_{XY}(x, y) = (1 - e^{-x})(1 - e^{-y}) = F_X(x) \, F_Y(y).$$
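As a numerical sanity check, here is a minimal sketch that evaluates the factorization on a grid of points, assuming the exponential distribution functions from the example above.

```python
import numpy as np

def F_X(x):
    # Marginal CDF of X: exponential with rate 1.
    return np.where(x >= 0, 1 - np.exp(-x), 0.0)

def F_Y(y):
    # Marginal CDF of Y: exponential with rate 1.
    return np.where(y >= 0, 1 - np.exp(-y), 0.0)

def F_XY(x, y):
    # Joint CDF: zero unless both arguments are non-negative.
    return np.where((x >= 0) & (y >= 0),
                    (1 - np.exp(-x)) * (1 - np.exp(-y)), 0.0)

# Check F_XY(x, y) == F_X(x) * F_Y(y) on a grid, including negative points.
xs, ys = np.meshgrid(np.linspace(-2, 5, 71), np.linspace(-2, 5, 71))
assert np.allclose(F_XY(xs, ys), F_X(xs) * F_Y(ys))
print("Factorization holds on the whole grid.")
```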
Independence between discrete random variables

When the two variables, taken together, form a discrete random vector, independence can also be verified using the following proposition:

Proposition Two discrete random variables $X$ and $Y$ are independent if and only if
$$p_{XY}(x, y) = p_X(x) \, p_Y(y) \quad \text{for all } (x, y),$$
where $p_{XY}$ is their joint probability mass function and $p_X$ and $p_Y$ are their marginal probability mass functions.
The following example illustrates how this criterion can be used.
Example Let $(X, Y)$ be a discrete random vector with support
$$R_{XY} = \{(1, 1), (2, 2)\}.$$
Let its joint probability mass function be
$$p_{XY}(x, y) = \begin{cases} 1/2 & \text{if } (x, y) = (1, 1) \\ 1/2 & \text{if } (x, y) = (2, 2) \\ 0 & \text{otherwise.} \end{cases}$$
In order to verify whether $X$ and $Y$ are independent, we first need to derive the marginal probability mass functions of $X$ and $Y$. The support of $X$ is $R_X = \{1, 2\}$ and the support of $Y$ is $R_Y = \{1, 2\}$. We need to compute the probability of each element of the support of $X$:
$$P(X = 1) = p_{XY}(1, 1) + p_{XY}(1, 2) = \frac{1}{2}, \qquad P(X = 2) = p_{XY}(2, 1) + p_{XY}(2, 2) = \frac{1}{2}.$$
Thus, the probability mass function of $X$ is
$$p_X(x) = \begin{cases} 1/2 & \text{if } x \in \{1, 2\} \\ 0 & \text{otherwise.} \end{cases}$$
By the same computation, the probability mass function of $Y$ is identical. The product of the marginal probability mass functions at, for example, $(1, 2)$ is
$$p_X(1) \, p_Y(2) = \frac{1}{4},$$
which is obviously different from $p_{XY}(1, 2) = 0$. Therefore, $X$ and $Y$ are not independent.
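This check is mechanical enough to automate. Below is a minimal sketch, using the joint pmf from the example above, that computes both marginals by summing the joint pmf and compares the product of the marginals with the joint pmf at every point of the product of the supports.

```python
from itertools import product

# Joint pmf from the example: mass 1/2 on (1, 1) and on (2, 2).
p_xy = {(1, 1): 0.5, (2, 2): 0.5}

# Marginals: sum the joint pmf over the other coordinate.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# X and Y are independent iff p_xy(x, y) == p_x(x) * p_y(y) everywhere.
independent = all(
    abs(p_xy.get((x, y), 0.0) - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(p_x, p_y)
)
print("independent?", independent)  # False: e.g. p_xy(1, 2) = 0 != 1/4
```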
Independence between continuous random variables

When the two variables, taken together, form a continuous random vector, independence can also be verified by means of the following proposition.

Proposition Two continuous random variables $X$ and $Y$ are independent if and only if
$$f_{XY}(x, y) = f_X(x) \, f_Y(y) \quad \text{for all } (x, y),$$
where $f_{XY}$ is their joint probability density function and $f_X$ and $f_Y$ are their marginal probability density functions.
The following example illustrates how this criterion can be used.
Example Let the joint probability density function of $X$ and $Y$ be
$$f_{XY}(x, y) = \begin{cases} e^{-x-y} & \text{if } x \geq 0 \text{ and } y \geq 0 \\ 0 & \text{otherwise.} \end{cases}$$
Its marginals are
$$f_X(x) = \begin{cases} e^{-x} & \text{if } x \geq 0 \\ 0 & \text{otherwise} \end{cases}$$
and
$$f_Y(y) = \begin{cases} e^{-y} & \text{if } y \geq 0 \\ 0 & \text{otherwise.} \end{cases}$$
Verifying that $f_{XY}(x, y) = f_X(x) \, f_Y(y)$ is straightforward. When $x < 0$ or $y < 0$, then $f_{XY}(x, y) = 0 = f_X(x) \, f_Y(y)$. When $x \geq 0$ and $y \geq 0$, then
$$f_{XY}(x, y) = e^{-x-y} = e^{-x} e^{-y} = f_X(x) \, f_Y(y).$$
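A symbolic check of this example is sketched below: it integrates the joint density (assumed to be the exponential one above) to recover the marginals and then verifies the factorization on the positive quadrant.

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)

# Joint density on the positive quadrant (zero elsewhere).
f_xy = sp.exp(-x - y)

# Marginals: integrate the joint density over the other variable.
f_x = sp.integrate(f_xy, (y, 0, sp.oo))  # exp(-x)
f_y = sp.integrate(f_xy, (x, 0, sp.oo))  # exp(-y)

# Independence on the quadrant: the joint density factors into the marginals.
assert sp.simplify(f_xy - f_x * f_y) == 0
print("f_X(x) =", f_x, "| f_Y(y) =", f_y, "| factorization verified")
```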
More details

The following subsections contain more details about statistical independence.
Mutually independent random variables
The definition of mutually independent random variables extends the definition of mutually independent events to random variables.
Definition We say that $n$ random variables $X_1$, ..., $X_n$ are mutually independent (or jointly independent) if and only if
$$P(X_{i_1} \in A_1, \ldots, X_{i_k} \in A_k) = P(X_{i_1} \in A_1) \cdots P(X_{i_k} \in A_k)$$
for any sub-collection of $k$ random variables $X_{i_1}$, ..., $X_{i_k}$ (where $k \leq n$) and for any collection of events $\{X_{i_j} \in A_j\}$, where $A_j \subseteq \mathbb{R}$.
In other words, random variables are mutually independent if the events related to those random variables are mutually independent events.
Denote by $X$ a random vector whose components are $X_1$, ..., $X_n$. The above condition for mutual independence can be replaced:

- in general, by a condition on the joint distribution function of $X$:
$$F_X(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n) \quad \text{for all } (x_1, \ldots, x_n);$$

- for discrete random variables, by a condition on the joint probability mass function of $X$:
$$p_X(x_1, \ldots, x_n) = p_{X_1}(x_1) \cdots p_{X_n}(x_n) \quad \text{for all } (x_1, \ldots, x_n);$$

- for continuous random variables, by a condition on the joint probability density function of $X$:
$$f_X(x_1, \ldots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n) \quad \text{for all } (x_1, \ldots, x_n).$$
Mutual independence via expectations
It can be proved that the random variables $X_1$, ..., $X_n$ are mutually independent if and only if
$$E[g_1(X_1) \, g_2(X_2) \cdots g_n(X_n)] = E[g_1(X_1)] \, E[g_2(X_2)] \cdots E[g_n(X_n)]$$
for any functions $g_1$, ..., $g_n$ such that the above expected values exist and are well-defined.
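This characterization can be illustrated with a quick Monte Carlo sketch: for independent draws, the sample average of a product of transformed variables approaches the product of the individual sample averages. The distributions and the transformations $g_1(t) = t^2$ and $g_2(t) = \cos t$ below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent samples of X1 and X2.
x1 = rng.exponential(size=n)
x2 = rng.standard_normal(n)

g1, g2 = lambda t: t**2, np.cos  # arbitrary transformations

lhs = np.mean(g1(x1) * g2(x2))           # estimates E[g1(X1) g2(X2)]
rhs = np.mean(g1(x1)) * np.mean(g2(x2))  # estimates E[g1(X1)] E[g2(X2)]
print(f"lhs = {lhs:.4f}, rhs = {rhs:.4f}")  # close for independent X1, X2
```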
Independence and zero covariance
If two random variables $X$ and $Y$ are independent, then their covariance is zero:
$$\text{Cov}[X, Y] = 0.$$
Proof This is an immediate consequence of the previous characterization: independence implies $E[XY] = E[X] \, E[Y]$ (take $g_1$ and $g_2$ equal to the identity function), so that
$$\text{Cov}[X, Y] = E[XY] - E[X] \, E[Y] = 0.$$
The converse is not true: two random variables that have zero covariance are not necessarily independent. For example, if $X$ has a standard normal distribution and $Y = X^2$, then $\text{Cov}[X, Y] = E[X^3] - E[X] \, E[X^2] = 0$, yet $Y$ is completely determined by $X$.
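The sketch below illustrates this counterexample numerically, using the standard-normal construction just described: the sample covariance of $X$ and $X^2$ is close to zero, while conditional frequencies reveal the dependence.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = x**2  # a deterministic function of x, hence dependent on x

cov = np.cov(x, y)[0, 1]
print(f"sample covariance of X and X^2: {cov:.4f}")  # approximately 0

# Dependence shows up elsewhere: P(Y > 1 | X > 1) = 1 while P(Y > 1) < 1.
print("P(Y > 1)         ~", np.mean(y > 1))
print("P(Y > 1 | X > 1) =", np.mean(y[x > 1] > 1))
```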
Independent random vectors
The above notions are easily generalized to the case in which $X$ and $Y$ are two random vectors, having dimensions $K_X \times 1$ and $K_Y \times 1$ respectively. Denote their joint distribution functions by $F_X(x)$ and $F_Y(y)$, and the joint distribution function of $X$ and $Y$ together by $F_{XY}(x, y)$. Then $X$ and $Y$ are independent if and only if
$$F_{XY}(x, y) = F_X(x) \, F_Y(y) \quad \text{for all } x \text{ and } y.$$
Also, if the two vectors are discrete or continuous, replace $F$ with $p$ or $f$ to obtain the corresponding criteria in terms of probability mass or density functions.
Mutually independent random vectors
The definition of mutual independence also extends in a straightforward manner to random vectors.
Definition We say that $n$ random vectors $X_1$, ..., $X_n$ are mutually independent (or jointly independent) if and only if
$$P(X_{i_1} \in A_1, \ldots, X_{i_k} \in A_k) = P(X_{i_1} \in A_1) \cdots P(X_{i_k} \in A_k)$$
for any sub-collection of $k$ random vectors $X_{i_1}$, ..., $X_{i_k}$ (where $k \leq n$) and for any collection of events $A_1$, ..., $A_k$.
All the equivalent conditions for the joint independence of a set of random variables (see above) apply with obvious modifications also to random vectors.
Solved exercises

Below you can find some exercises with explained solutions.
Exercise 1
Consider two random variables $X$ and $Y$ having marginal distribution functions $F_X(x)$ and $F_Y(y)$. If $X$ and $Y$ are independent, what is their joint distribution function?
Solution
For $X$ and $Y$ to be independent, their joint distribution function must be equal to the product of their marginal distribution functions:
$$F_{XY}(x, y) = F_X(x) \, F_Y(y) \quad \text{for all } (x, y).$$
Exercise 2
Let $(X, Y)$ be a discrete random vector with support
$$R_{XY} = \{1, 2\} \times \{1, 2\}.$$
Let its joint probability mass function be
$$p_{XY}(x, y) = \begin{cases} 1/2 & \text{if } (x, y) = (1, 1) \\ 1/6 & \text{if } (x, y) = (1, 2) \\ 1/4 & \text{if } (x, y) = (2, 1) \\ 1/12 & \text{if } (x, y) = (2, 2) \\ 0 & \text{otherwise.} \end{cases}$$
Are $X$ and $Y$ independent?
Solution
In order to verify whether $X$ and $Y$ are independent, we first need to derive the marginal probability mass functions of $X$ and $Y$. The support of $X$ is $R_X = \{1, 2\}$ and the support of $Y$ is $R_Y = \{1, 2\}$. We need to compute the probability of each element of the support of $X$:
$$P(X = 1) = p_{XY}(1, 1) + p_{XY}(1, 2) = \frac{1}{2} + \frac{1}{6} = \frac{2}{3}, \qquad P(X = 2) = p_{XY}(2, 1) + p_{XY}(2, 2) = \frac{1}{4} + \frac{1}{12} = \frac{1}{3}.$$
Thus, the probability mass function of $X$ is
$$p_X(x) = \begin{cases} 2/3 & \text{if } x = 1 \\ 1/3 & \text{if } x = 2 \\ 0 & \text{otherwise.} \end{cases}$$
We need to compute the probability of each element of the support of $Y$:
$$P(Y = 1) = p_{XY}(1, 1) + p_{XY}(2, 1) = \frac{1}{2} + \frac{1}{4} = \frac{3}{4}, \qquad P(Y = 2) = p_{XY}(1, 2) + p_{XY}(2, 2) = \frac{1}{6} + \frac{1}{12} = \frac{1}{4}.$$
Thus, the probability mass function of $Y$ is
$$p_Y(y) = \begin{cases} 3/4 & \text{if } y = 1 \\ 1/4 & \text{if } y = 2 \\ 0 & \text{otherwise.} \end{cases}$$
The product of the marginal probability mass functions is
$$p_X(x) \, p_Y(y) = \begin{cases} 1/2 & \text{if } (x, y) = (1, 1) \\ 1/6 & \text{if } (x, y) = (1, 2) \\ 1/4 & \text{if } (x, y) = (2, 1) \\ 1/12 & \text{if } (x, y) = (2, 2) \\ 0 & \text{otherwise,} \end{cases}$$
which is equal to $p_{XY}(x, y)$. Therefore, $X$ and $Y$ are independent.
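As with the earlier discrete example, the check can be automated; the sketch below uses the joint pmf assumed in this exercise, with exact rational arithmetic.

```python
from fractions import Fraction as F
from itertools import product

# Joint pmf assumed in the exercise.
p_xy = {(1, 1): F(1, 2), (1, 2): F(1, 6), (2, 1): F(1, 4), (2, 2): F(1, 12)}

# Marginals: sum the joint pmf over the other coordinate.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, F(0)) + p
    p_y[y] = p_y.get(y, F(0)) + p

independent = all(
    p_xy.get((x, y), F(0)) == p_x[x] * p_y[y] for x, y in product(p_x, p_y)
)
print("independent?", independent)  # True
```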
Exercise 3
Let $(X, Y)$ be a continuous random vector with support
$$R_{XY} = [0, 1] \times [0, 1]$$
and let its joint probability density function be
$$f_{XY}(x, y) = \begin{cases} 4xy & \text{if } (x, y) \in [0, 1] \times [0, 1] \\ 0 & \text{otherwise.} \end{cases}$$
Are $X$ and $Y$ independent?
Solution
The support of $X$ is $R_X = [0, 1]$. When $x \in [0, 1]$, the marginal probability density function of $X$ is
$$f_X(x) = \int_0^1 f_{XY}(x, y) \, dy = \int_0^1 4xy \, dy = 4x \left[ \frac{y^2}{2} \right]_0^1 = 2x,$$
while, when $x \notin [0, 1]$, the marginal probability density function of $X$ is $f_X(x) = 0$. Thus, summing up, the marginal probability density function of $X$ is
$$f_X(x) = \begin{cases} 2x & \text{if } x \in [0, 1] \\ 0 & \text{otherwise.} \end{cases}$$
The support of $Y$ is $R_Y = [0, 1]$. When $y \in [0, 1]$, the marginal probability density function of $Y$ is
$$f_Y(y) = \int_0^1 f_{XY}(x, y) \, dx = 2y,$$
while, when $y \notin [0, 1]$, the marginal probability density function of $Y$ is $f_Y(y) = 0$. Thus, the marginal probability density function of $Y$ is
$$f_Y(y) = \begin{cases} 2y & \text{if } y \in [0, 1] \\ 0 & \text{otherwise.} \end{cases}$$
Verifying that $f_{XY}(x, y) = f_X(x) \, f_Y(y)$ is straightforward. When $x \notin [0, 1]$ or $y \notin [0, 1]$, then $f_{XY}(x, y) = 0 = f_X(x) \, f_Y(y)$. When $x \in [0, 1]$ and $y \in [0, 1]$, then
$$f_{XY}(x, y) = 4xy = (2x)(2y) = f_X(x) \, f_Y(y).$$
Thus, $X$ and $Y$ are independent.
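A symbolic verification of this solution, under the joint density assumed above, is sketched below.

```python
import sympy as sp

x, y = sp.symbols("x y")

# Joint density on the unit square (zero elsewhere), as assumed above.
f_xy = 4 * x * y

# Marginals: integrate the joint density over the other variable on [0, 1].
f_x = sp.integrate(f_xy, (y, 0, 1))  # 2*x
f_y = sp.integrate(f_xy, (x, 0, 1))  # 2*y

assert sp.simplify(f_xy - f_x * f_y) == 0  # factorization holds on the square
print("f_X(x) =", f_x, "| f_Y(y) =", f_y, "-> X and Y are independent")
```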
Please cite as:
Taboga, Marco (2021). "Independent random variables", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/independent-random-variables.