-
PDF Facts
P[X = x] >= 0
Sum[f(x)] = 1
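A minimal sketch in Python that checks both facts for a small discrete PMF (the probability values here are made up for illustration):
# Hypothetical PMF for illustration only.
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
# Fact 1: every probability is non-negative.
assert all(p >= 0 for p in pmf.values())
# Fact 2: the probabilities sum to 1 (with a floating-point tolerance).
assert abs(sum(pmf.values()) - 1.0) < 1e-9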
-
Random Variable
A function that assigns a single numerical value to each outcome in the sample space.
-
CDF
F(x) = P[X <= x], the probability that X takes a value less than or equal to x.
-
Given this discrete PDF f(x), what is P[X <= 1.5] (i.e. the CDF at 1.5)?
x f(x)
1 0.1
2 0.2
3 0.3
4 0.4
P[X <= 1.5] = f(1) = 0.1, since x = 1 is the only value in the support with x <= 1.5.
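A quick numerical check of this answer (a sketch; the table above is hard-coded as a Python dict):
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}  # table from the question
# P[X <= 1.5] sums f(x) over every x in the support with x <= 1.5.
p = sum(prob for x, prob in pmf.items() if x <= 1.5)
print(p)  # 0.1, since only x = 1 satisfies x <= 1.5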
-
Joint PDF: f (X, Y) for discrete r.v.
P[X = x, Y = y]
-
Joint PDF: f(x, y) for continuous r.v. over a certain range.
P[a <= X <= b, c <= Y <= d] = double integral of f(x, y) dy dx over that range.
-
Marginal PDF: f (x) for discrete r.v.
f(x) = sum over y of f(x, y)
f(y) = sum over x of f(x, y)
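A small sketch of the marginal computation, assuming a made-up joint PMF stored as a dict keyed by (x, y):
from collections import defaultdict

# Hypothetical joint PMF f(x, y), for illustration only.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

fx = defaultdict(float)  # marginal f(x) = sum over y of f(x, y)
fy = defaultdict(float)  # marginal f(y) = sum over x of f(x, y)
for (x, y), p in joint.items():
    fx[x] += p
    fy[y] += p

print(dict(fx))  # x-marginal, approximately {0: 0.3, 1: 0.7}
print(dict(fy))  # y-marginal, approximately {0: 0.4, 1: 0.6}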
-
Marginal PDF: f(x) for continuous r.v.
f(x) = integral over y of f(x, y) dy
f(y) = integral over x of f(x, y) dx
-
Conditional Probability: P[X = x | Y = y] (i.e. f(X|Y))
= P[X = x, Y = y]/P[Y = y] = f(x, y)/f(y)
The conditional probability that X equals x, given that Y equals y, equals the joint PDF of X and Y divided by the marginal PDF of Y.
-
X, Y are independent IFF
(1) f(x, y) = f(x)f(y) for all x, y.
- Or (derived from (1))
- (2) f(x|y) = f(x),
- (3) f(y|x) = f(y) for all x, y.
-
Given that X, Y are independent show that f(x|y) = f(x) for all x, y.
- We know that
- (i) X,Y are independent if f(x,y) = f(x)f(y) for all x,y.
- (ii) f(x|y) = f(x,y)/f(y)
So by (ii) and then (i), f(x|y) = f(x, y)/f(y) = f(x)f(y)/f(y) = f(x).
-
Expected Value: E[X] for
(1) Discrete: E[X] = sum over x of x*f(x)
(2) Continuous: E[X] = integral of x*f(x) dx
-
What does Expected Value mean for a discrete random variable?
The average value of X over an infinite number of experimental trials.
For example, E[X] for a fair six-sided die is 3.5. Saying the likely value of a die roll is 3.5 does not make sense, but the long-run average does.
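A one-line check of the die example (a sketch; a fair six-sided die puts probability 1/6 on each face):
faces = range(1, 7)
expected = sum(x * (1 / 6) for x in faces)  # E[X] = sum of x * f(x)
print(expected)  # 3.5 (up to floating-point rounding)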
-
Let g(X) be a function of X. What is E[g(X)]?
- Discrete: E[g(X)] = sum over x of g(x)*f(x)
- Continuous: E[g(X)] = integral of g(x)*f(x) dx
 - Remember: even though X is being transformed by g(X), you still use the original PDF f(x).
-
If g(X) equals c, a constant, then E[g(X)] =
E[c] = c
Remember: The Expected Value of a constant is that constant. This allows you to pull constants out of the E[ ] operator.
-
Show that E[c] = c, where c is a constant.
Let g(X) = c -> E[g(X)] = integral[g(x)f(x)dx] = integral[cf(x)dx] = c*integral[f(x)dx] = c(1) = c
-
If g(X) = aX + b, then E[g(X)] =
E[aX + b] = aE[X] + b
-
Show that E[aX + b] = aE[X] + b
Let g(X) = aX + b -> E[g(X)] = E[aX + b] = E[aX] + E[b] = aE[X] + b (using E[b] = b).
-
Let g(X) = g1(X) + g2(X) + . . . + gn(X),
what is E[g(X)]?
- E[g(X)] = E[g1(X)] + E[g2(X)] + . . . + E[gn(X)]
 - Remember: the expected value of the sum is the sum of the expected values.
-
Var[X]
= E[(X - mu)^2] = E[X^2] - mu^2, where mu = E[X].
- Proof:
- E[(X - mu)^2] = E[(X - mu)(X - mu)] = E[X^2 - 2*mu*X + mu^2] = E[X^2] - 2*mu*E[X] + mu^2 = E[X^2] - 2*mu^2 + mu^2 = E[X^2] - mu^2
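A numerical sanity check of the shortcut Var[X] = E[X^2] - mu^2, using a made-up PMF:
pmf = {1: 0.2, 2: 0.5, 3: 0.3}  # hypothetical PMF for illustration
mu = sum(x * p for x, p in pmf.items())                    # E[X]
ex2 = sum(x**2 * p for x, p in pmf.items())                # E[X^2]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())   # E[(X - mu)^2]
var_shortcut = ex2 - mu**2                                 # E[X^2] - mu^2
assert abs(var_def - var_shortcut) < 1e-12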
-
Let Y = a + bX, what is Var[Y]?
= b^2 Var[X]
- Proof:
- Let Y = a + bX -> E[Y] = a + bE[X]. Then Var(Y) = E[(Y - E[Y])^2] = E[(a + bX - a - bE[X])^2] = E[b^2 (X - E[X])^2] = b^2 E[(X - E[X])^2] = b^2 Var(X).
-
Standardized Variable:
z =
- (x - mu)/sig
- where sig = sqrt[Var(X)] (i.e. the standard deviation)
-
Show that E[z] = 0
- E[z] = E[(1/sig)X - mu/sig]
- = (1/sig)E[X] - E[mu/sig]
- = mu/sig - mu/sig = 0
-
Show that Var(z) = 1
- Var(z) = Var[(1/sig)X - mu/sig]
- = (1/sig)^2 Var(X) = sig^2/sig^2 = 1
Remember: to get to line two, Var(aX +- b) = a^2 Var(X).
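A sketch that checks E[z] = 0 and Var(z) = 1 exactly on a small made-up PMF (no simulation needed):
import math

pmf = {0: 0.3, 1: 0.4, 2: 0.3}  # hypothetical PMF for illustration
mu = sum(x * p for x, p in pmf.items())
sig = math.sqrt(sum((x - mu) ** 2 * p for x, p in pmf.items()))

# z = (x - mu)/sig; compute its mean and variance under the same PMF.
ez = sum((x - mu) / sig * p for x, p in pmf.items())
varz = sum(((x - mu) / sig - ez) ** 2 * p for x, p in pmf.items())
print(round(ez, 10), round(varz, 10))  # 0.0 and 1.0 up to rounding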
-
Let g(X1, X2) have joint PDF f(X1, X2).
E[g(X1, X2)] =
- Discrete: sum over x1 and x2 of g(x1, x2)*f(x1, x2)
- Continuous: double integral of g(x1, x2)*f(x1, x2) dx1 dx2
-
cov(X1, X2)
- = E[(X1 - mu1)(X2 - mu2)] = E[X1*X2] - mu1*mu2
- Derivation:
- E[(X1 - mu1)(X2 - mu2)] = E[X1*X2 - mu2*X1 - mu1*X2 + mu1*mu2] = E[X1*X2] - mu2*mu1 - mu1*mu2 + mu1*mu2 = E[X1*X2] - mu1*mu2
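A sketch checking that E[(X1 - mu1)(X2 - mu2)] and E[X1*X2] - mu1*mu2 agree on a made-up joint PMF:
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}  # hypothetical
mu1 = sum(x * p for (x, y), p in joint.items())
mu2 = sum(y * p for (x, y), p in joint.items())

cov_def = sum((x - mu1) * (y - mu2) * p for (x, y), p in joint.items())
cov_short = sum(x * y * p for (x, y), p in joint.items()) - mu1 * mu2
assert abs(cov_def - cov_short) < 1e-12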
-
If cov(X,Y)
1. > 0
2. < 0
3. = 0
- 1. (X, Y) pairs tend to be both greater than their means or both less than their means.
- 2. (X, Y) pairs tend to be mixed about their means (one greater and one less)
- 3. (X, Y) pairs "evenly" spread about their means.
-
Correlation
- p = cov(X, Y)/(sigX*sigY), where sigX and sigY are the standard deviations of X and Y.
If p (for correlation)
1. = 1
2. = -1
3. = 0
- 1. perfect positive relation
- 2. perfect negative relation
- 3. no linear relationship
Remember: absolute_value(p) measures the strength of the linear relationship.
-
When X1 and X2 are independent, E[X1*X2]
= mu1*mu2
- Proof:
- E[X1*X2] = sum over x1 and x2 of x1*x2*f(x1, x2) = sum over x1 and x2 of x1*x2*f(x1)*f(x2) = (sum over x1 of x1*f(x1)) * (sum over x2 of x2*f(x2)) = mu1*mu2
-
If X, Y are independent then cov(X, Y)
= 0
Remember: the converse is not true.
- Proof:
- We know E[X*Y] = muX*muY when X, Y are independent.
cov(X, Y) = E[X*Y] - muX*muY = muX*muY - muX*muY = 0
-
Let c, d be constants
E[cX + dY]
= cE[X] + dE[Y] = c*muX + d*muY
-
Var(c1X1 + c2X2)
- = c1^2 Var(X1) + c2^2 Var(X2) + 2*c1*c2*cov(X1, X2)
- Proof:
- Var(c1X1 + c2X2) = E[(c1X1 + c2X2 - (c1mu1 + c2mu2))^2] = E[(c1(X1 - mu1) + c2(X2 - mu2))^2]
- = c1^2 E[(X1 - mu1)^2] + c2^2 E[(X2 - mu2)^2] + 2*c1*c2*E[(X1 - mu1)(X2 - mu2)] = c1^2 Var(X1) + c2^2 Var(X2) + 2*c1*c2*cov(X1, X2)
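A simulation sketch of this rule (numpy assumed; the constants and the correlated normal draws are arbitrary choices for illustration):
import numpy as np

rng = np.random.default_rng(0)
c1, c2 = 2.0, -3.0
# Correlated draws for X1 and X2, chosen arbitrarily for the check.
x1 = rng.normal(0, 1, 1_000_000)
x2 = 0.5 * x1 + rng.normal(0, 1, 1_000_000)

lhs = np.var(c1 * x1 + c2 * x2)
rhs = (c1**2 * np.var(x1) + c2**2 * np.var(x2)
       + 2 * c1 * c2 * np.cov(x1, x2, ddof=0)[0, 1])
print(lhs, rhs)  # the two numbers match up to floating-point rounding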
-
E[X|Y=y] for continuous
- = integral(x*f(x|y)dx)
- Note: f(x|y) = f(x,y)/f(y)
-
Law of Iterated Expectations
E[Y]
- = E_X[E(Y|X)]
- Proof (continuous case):
- E_X[E(Y|X)] = integral over x of (integral over y of y*f(y|x) dy) * f(x) dx = double integral of y*f(x, y) dy dx = integral over y of y*f(y) dy = E[Y]
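A simulation sketch of E[Y] = E_X[E(Y|X)] (numpy assumed; the particular model Y = X + noise is an arbitrary illustration):
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 1_000_000)
y = x + rng.normal(0, 1, 1_000_000)  # in this model E[Y|X = x] = x

outer = np.mean(x)    # E_X[E(Y|X)], since E[Y|X] = X here
direct = np.mean(y)   # E[Y] estimated directly
print(outer, direct)  # both are close to 0.5, up to sampling noise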
-
If E[Y|X] = E[Y] then cov(X,Y)
- = 0
- Proof:
- cov(X, Y) = E[X*Y] - E[X]E[Y]. By iterated expectations, E[X*Y] = E_X[E(XY|X)] = E_X[X*E(Y|X)] = E_X[X*E[Y]] = E[X]E[Y].
- So cov(X, Y) = E[X]E[Y] - E[X]E[Y] = 0.
-
Var(X|Y)
= E[X^2|Y] - (E[X|Y])^2
-
Standard Normal Distribution
Z ~ N(0, 1), i.e. a normal distribution with mean 0 and variance 1.
-
If X ~ N(mu, sig^2) and Y = aX + b, then Y~
Y ~ N(a*mu + b, a^2*sig^2)
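A simulation sketch of this rule (numpy assumed; mu, sig, a, b are arbitrary values for illustration):
import numpy as np

rng = np.random.default_rng(2)
mu, sig = 1.0, 2.0
a, b = 3.0, -4.0

x = rng.normal(mu, sig, 1_000_000)
y = a * x + b
# The sample mean and variance of Y should be near a*mu + b and a^2*sig^2.
print(y.mean(), a * mu + b)     # both approximately -1.0
print(y.var(), a**2 * sig**2)   # both approximately 36.0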
-
If X, Y are jointly normal then cov(X, Y)
cov(X, Y) = 0 <=> X, Y are independent.