Random Variables: Van Nam Tran
University of Technology and Education
If Cov[X, Y ] > 0 then there exists some positive linear relationship between X
and Y so that when X is above (below) average then Y tends to be above
(below) average.
Example
If X is ice cream consumption and Y is temperature, one would expect that Cov[X, Y] > 0, that is, above (below) average ice cream consumption would be associated with above (below) average temperature.
If Cov[X, Y ] < 0, there exists some negative linear relationship between X and
Y so that when X is above (below) average then Y tends to be below (above)
average.
For example, if X is ice cream consumption and Y is rainfall, one would
expect that Cov[X, Y ] < 0, that is above (below) average ice cream
consumption would be associated with below (above) average rainfall.
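The sign behavior described above can be checked numerically. A minimal sketch with made-up (hypothetical) temperature, ice cream, and rainfall figures:

```python
# Sample covariance: Cov[X, Y] ≈ (1/n) * Σ (x_i - x̄)(y_i - ȳ).
def sample_cov(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    return sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n

# Hypothetical data: ice cream rises with temperature, rainfall falls with it.
temp      = [15, 20, 25, 30, 35]   # degrees
ice_cream = [1, 2, 3, 5, 6]        # servings
rain      = [90, 70, 50, 30, 20]   # mm

print(sample_cov(temp, ice_cream) > 0)  # True: positive linear relationship
print(sample_cov(ice_cream, rain) < 0)  # True: negative linear relationship
```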
Van Nam Tran Random variables January, 2018 2/ 15
Definition
The correlation coefficient ρ is
$$\rho = \frac{\mathrm{Cov}[X, Y]}{\sqrt{\mathrm{Var}[X]\,\mathrm{Var}[Y]}}$$
Remark
The sign of ρ is always the same as the sign of Cov[X, Y ].
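The definition can be sketched in pure Python on hypothetical data; ρ always lies in [−1, 1], and an exact positive linear relationship gives ρ = 1:

```python
import math

# Correlation coefficient: ρ = Cov[X, Y] / sqrt(Var[X] * Var[Y]).
def corr(xs, ys):
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    cov = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n
    vx = sum((x - x_bar) ** 2 for x in xs) / n
    vy = sum((y - y_bar) ** 2 for y in ys) / n
    return cov / math.sqrt(vx * vy)

# y = 2x exactly, so the correlation is exactly 1.
rho = corr([1, 2, 3, 4], [2, 4, 6, 8])
print(rho)  # → 1.0
```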
X and Y are called independent if and only if for all functions f(x) and g(y) we have
$$E[f(X)g(Y)] = E[f(X)]\,E[g(Y)].$$
Cov[X, X] = V ar[X]
Let $Y = aX_1 + bX_2 + c$. Then
$$\mathrm{Var}[Y] = a^2\,\mathrm{Var}[X_1] + b^2\,\mathrm{Var}[X_2] + 2ab\,\mathrm{Cov}[X_1, X_2].$$
The matrix
$$\begin{pmatrix} \mathrm{Var}[X_1] & \mathrm{Cov}[X_1, X_2] \\ \mathrm{Cov}[X_1, X_2] & \mathrm{Var}[X_2] \end{pmatrix}$$
is positive semi-definite.
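For a symmetric 2×2 matrix, positive semi-definiteness is equivalent to non-negative diagonal entries and a non-negative determinant. A sketch verifying this for the empirical covariance matrix of two hypothetical samples:

```python
# Hypothetical paired samples.
x1 = [1.0, 2.0, 4.0, 7.0]
x2 = [3.0, 1.0, 5.0, 2.0]

n = len(x1)
m1, m2 = sum(x1) / n, sum(x2) / n
v1 = sum((a - m1) ** 2 for a in x1) / n          # Var[X1]
v2 = sum((b - m2) ** 2 for b in x2) / n          # Var[X2]
c  = sum((a - m1) * (b - m2) for a, b in zip(x1, x2)) / n  # Cov[X1, X2]

# 2x2 symmetric PSD test: diagonal >= 0 and determinant >= 0.
det = v1 * v2 - c * c
print(v1 >= 0, v2 >= 0, det >= 0)  # True True True
```

The determinant condition holds for any sample: it is the Cauchy–Schwarz inequality Cov[X₁, X₂]² ≤ Var[X₁] Var[X₂].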
Definition
Random variables $X_1, X_2, \ldots, X_n$ are called uncorrelated if $\mathrm{Cov}[X_i, X_j] = 0$ for all $i \neq j$. Otherwise we say that $X_1, X_2, \ldots, X_n$ are correlated.
Remark
If X1 , X2 , . . . , Xn are independent then they are uncorrelated.
Theorem
Given that X1 , X2 , . . . , Xn are uncorrelated it follows that if
$$Y = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n,$$
then
$$\mathrm{Var}[Y] = a_1^2\,\mathrm{Var}[X_1] + a_2^2\,\mathrm{Var}[X_2] + \cdots + a_n^2\,\mathrm{Var}[X_n].$$
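The theorem can be checked exactly on a small example. A sketch with two independent (hence uncorrelated) fair coin flips and hypothetical weights a₁ = 3, a₂ = 5:

```python
from itertools import product

def var(values):
    # Population variance over equally likely outcomes.
    n = len(values)
    m = sum(values) / n
    return sum((v - m) ** 2 for v in values) / n

# X1, X2: independent fair coin flips in {0, 1}; Y = 3*X1 + 5*X2.
outcomes = list(product([0, 1], repeat=2))   # 4 equally likely joint outcomes
y_vals = [3 * x1 + 5 * x2 for x1, x2 in outcomes]

lhs = var(y_vals)                             # Var[Y] computed directly
rhs = 9 * var([0, 1]) + 25 * var([0, 1])      # a1^2 Var[X1] + a2^2 Var[X2]
print(lhs, rhs)  # → 8.5 8.5
```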
The sample mean
Definition
An estimator θ̂ of a parameter θ is said to be unbiased if
E[θ̂] = θ
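Unbiasedness of the sample mean can be seen by enumerating every equally likely sample and averaging the resulting estimates. A sketch with a hypothetical four-point population:

```python
from itertools import product

population = [2, 4, 6, 10]                 # hypothetical finite population
mu = sum(population) / len(population)     # true mean θ

# All equally likely samples of size 3, drawn with replacement.
samples = list(product(population, repeat=3))
sample_means = [sum(s) / 3 for s in samples]

# E[θ̂]: average of the estimator over the sampling distribution.
expected_estimate = sum(sample_means) / len(sample_means)
print(expected_estimate, mu)  # the two agree: E[X̄] = μ
```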
The classical linear regression model with a constant and one regressor
satisfies the following assumptions:
A2. E[ei ] = 0
where
$$a_i = \frac{X_i - \bar{X}}{\sum_{j=1}^{n} (X_j - \bar{X})^2}.$$
Theorem
β̂ is an unbiased estimator of β or E[β̂] = β.
$$\mathrm{Var}[\hat\beta] = \frac{\sigma^2}{\sum_{j=1}^{n} (X_j - \bar{X})^2}$$
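A minimal sketch of the slope estimator on hypothetical noiseless data: with every $e_i = 0$, $\hat\beta$ recovers $\beta$ exactly, consistent with the variance formula giving 0 when $\sigma^2 = 0$:

```python
# Hypothetical regressor values and true coefficients (assumptions).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
alpha, beta = 2.0, 3.0
y = [alpha + beta * xi for xi in x]   # noiseless: e_i = 0 for all i

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# OLS slope: β̂ = Σ (x_i - x̄)(y_i - ȳ) / Σ (x_j - x̄)^2.
sxx = sum((xi - x_bar) ** 2 for xi in x)
beta_hat = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
print(beta_hat)  # → 3.0
```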
For a linear combination
$$Y = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n$$
of possibly correlated random variables,
$$E[Y] = \sum_{i=1}^{n} a_i\,E[X_i]$$
and
$$\mathrm{Var}[Y] = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i a_j\,\mathrm{Cov}[X_i, X_j] = \sum_{i=1}^{n} a_i^2\,\mathrm{Var}[X_i] + 2 \sum_{i=1}^{n} \sum_{j=i+1}^{n} a_i a_j\,\mathrm{Cov}[X_i, X_j].$$
Theorem
If $Y = a^T X$ then
$$\mathrm{Var}[a^T X] = a^T\,\mathrm{Var}[X]\,a \ge 0.$$
If $Y = AX$ then
$$\mathrm{Var}[AX] = A\,\mathrm{Var}[X]\,A^T.$$
In matrix form, with $Y$ the $n \times 1$ vector of observations, $X$ the design matrix, and $e$ the $n \times 1$ error vector, it turns out that $\hat\beta = (X^T X)^{-1} X^T Y$. It can then be shown that $\hat\beta = \beta + Ae$, where $A = (X^T X)^{-1} X^T$ is nonrandom by A4, and
$$\mathrm{Var}[\hat\beta] = \sigma^2 (X^T X)^{-1}.$$
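A numerical sketch of the matrix formula with simulated data; the model, sample size, noise level, and seed are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Design matrix with a constant column and one regressor.
x = rng.uniform(0, 10, size=n)
X = np.column_stack([np.ones(n), x])

# True coefficients (assumption) and i.i.d. normal errors.
beta = np.array([2.0, 3.0])
e = rng.normal(0.0, 1.0, size=n)
Y = X @ beta + e

# OLS estimator: β̂ = (X^T X)^{-1} X^T Y.
beta_hat = np.linalg.inv(X.T @ X) @ (X.T @ Y)
print(beta_hat)  # close to [2.0, 3.0]
```

In numerical practice one would prefer `np.linalg.solve(X.T @ X, X.T @ Y)` or `np.linalg.lstsq(X, Y, rcond=None)` over forming the explicit inverse; the inverse is written out here only to mirror the formula.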