Random Variables

A random variable is a variable whose value is an unknown/random quantity of interest.

${\mathcal X}$: the set of all possible values. Also called a sample space or state space.

Discrete Random Variables

The sample space ${\mathcal X}$ is finite or countably infinite. The probability of the event $X = x$ is denoted $P(X=x)$, or $P(x)$ for short.

  • $X$: the random variable (ex: the outcome of a coin flip)
  • ${\mathcal X}$: the sample space (ex: ${\mathcal X} = \{\text{Heads}, \text{Tails}\}$)
  • $x$ (lowercase): one of the possible values (ex: $x = \text{heads}$ or $x = \text{tails}$)
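
As a minimal sketch (the dictionary representation and names below are my own, not from any particular library), a discrete random variable can be modeled in Python as a mapping from sample-space values to probabilities:

```python
# Discrete random variable over the sample space {heads, tails}:
# map each possible value x to its probability P(X = x).
pmf = {"heads": 0.5, "tails": 0.5}

# Sanity checks: probabilities are non-negative and sum to 1.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-9

print(pmf["heads"])  # P(X = heads) = 0.5
```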

Continuous Random Variables

If a random variable $X \in \mathbb{R}$ is a real-valued quantity, it is called a continuous random variable.

With continuous random variables, $P(X=x) = 0$, since there are uncountably many values that $X$ can take on; probability is instead assigned to intervals.

  • Intuition as to why it's zero: if probability mass were spread evenly over $n$ equally likely values, each value would get $\frac{1}{n}$, and $\lim_{n \to \infty} \frac{1}{n} = 0$. Probabilities of intervals come from integrating a density: $P(a \le X \le b) = \int_a^b p(x)\,dx$.
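
A quick sketch of the interval view, assuming a standard normal as the example density (the choice of scipy.stats.norm is mine, not from the notes):

```python
from scipy.stats import norm

# For a continuous random variable, probability lives on intervals:
# P(a <= X <= b) = integral of the density p(x) from a to b.
a, b = -1.0, 1.0
interval_prob = norm.cdf(b) - norm.cdf(a)  # ~0.683 for the standard normal

# A single point carries zero probability: the interval [x, x] has width 0.
point_prob = norm.cdf(0.0) - norm.cdf(0.0)  # exactly 0

print(interval_prob, point_prob)
```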

Marginal Distribution

Assume we have two random variables $X$ and $Y$, each taking binary values (ex: 0 and 1). Since both have finite cardinality, we can represent their joint probability with a 2-dimensional table.

NOTE: random variables don't have to take on binary values; the table representation is valid as long as $X$ and $Y$ have finite cardinality.

| $P(X,Y)$ | $y = 0$ | $y = 1$ |
| --- | --- | --- |
| $x = 0$ | 0.4 | 0.2 |
| $x = 1$ | 0.3 | 0.1 |
  1. $P(x = 0, y = 0) = 0.4$
  2. $P(x = 0, y = 1) = 0.2$
  3. $P(x = 1, y = 0) = 0.3$
  4. $P(x = 1, y = 1) = 0.1$

Given a joint distribution, the marginal distribution can be computed by summing over the values of the other variable:

$P(x = 0) = P(x=0, y=0) + P(x=0, y=1) = 0.4 + 0.2 = 0.6$

$P(y = 1) = P(x=0, y=1) + P(x=1, y=1) = 0.2 + 0.1 = 0.3$

Formula for $P(X=x)$:

$P(X=x) = \sum_y P(X=x, Y=y)$ (sums the joint probability over all values of $y$)

$P(Y=y) = \sum_x P(X=x, Y=y)$ (sums the joint probability over all values of $x$)

For example:

$P(X=0) = \sum_y P(X=0, Y=y) = P(X=0, Y=0) + P(X=0, Y=1) = 0.4 + 0.2 = 0.6$
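
A minimal sketch of marginalization over the 2×2 joint table above (numpy and the variable names are my choices):

```python
import numpy as np

# Joint table P(X, Y): rows index x, columns index y.
joint = np.array([[0.4, 0.2],   # x = 0
                  [0.3, 0.1]])  # x = 1

# Marginalize by summing out the other variable.
p_x = joint.sum(axis=1)  # P(X=x) = sum over y -> [0.6, 0.4]
p_y = joint.sum(axis=0)  # P(Y=y) = sum over x -> [0.7, 0.3]

print(p_x, p_y)
```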

With Conditional Probability

$P(X | Y) = \frac{P(X \cap Y)}{P(Y)}$

$P(X=x | Y=y) = \frac{P(X=x, Y=y)}{P(Y=y)}$

The numerator is the joint probability and the denominator is the marginal probability, so equivalently:

$P(X=x | Y=y) = \frac{P(X=x, Y=y)}{\sum_{x'} P(X=x', Y=y)}$
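
Continuing with the same table, a sketch of conditioning (the helper function is hypothetical, written for illustration):

```python
import numpy as np

joint = np.array([[0.4, 0.2],   # x = 0
                  [0.3, 0.1]])  # x = 1

def conditional_x_given_y(joint, y):
    """P(X | Y=y): slice the joint at Y=y, then renormalize
    by the marginal P(Y=y) = sum_x P(X=x, Y=y)."""
    column = joint[:, y]
    return column / column.sum()

# P(X=0 | Y=1) = 0.2 / (0.2 + 0.1) = 2/3
print(conditional_x_given_y(joint, 1))  # -> [0.667, 0.333]
```

Conditioning is just "slice the joint, then renormalize."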

Moments of a Distribution

  • Mean (expected value)

For discrete random variables: $\mu = \mathbb{E}[X] = \sum_{x \in {\mathcal X}} x \, P(x)$

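For example, using the marginal $P(X)$ computed from the table above:

$\mathbb{E}[X] = 0 \cdot P(X=0) + 1 \cdot P(X=1) = 0 \cdot 0.6 + 1 \cdot 0.4 = 0.4$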

Variance

${\sigma^2} = \mathbb{E}[(x - \mu)^2]$
$= \mathbb{E}[x^2 - 2x\mu + \mu^2]$
$= \mathbb{E}[x^2] - \mathbb{E}[2x\mu] + \mathbb{E}[\mu^2]$
$= \mathbb{E}[x^2] - 2\mu\mathbb{E}[x] + \mu^2$
$= \mathbb{E}[x^2] - 2\mu^2 + \mu^2$
$= \mathbb{E}[x^2] - \mu^2 = \mathbb{E}[x^2] - (\mathbb{E}[x])^2$
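
As a quick check with the same marginal as before: since $X$ only takes values 0 and 1, $x^2 = x$, so $\mathbb{E}[X^2] = \mathbb{E}[X] = 0.4$ and $\sigma^2 = \mathbb{E}[X^2] - \mu^2 = 0.4 - 0.4^2 = 0.24$.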

Standard Deviation

${\sigma = \sqrt{\sigma^2}}$, i.e., the square root of the variance.

  • The variance and standard deviation calculated above apply to the entire population; when working with a sample from the population, the denominator is $n-1$ instead of $n$ (Bessel's correction).
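
A sketch tying the moment formulas together (numpy and the example distribution are my choices; `ddof=1` switches numpy to the $n-1$ denominator):

```python
import numpy as np

# Discrete distribution: values and their probabilities
# (here, the marginal P(X) from the table above).
values = np.array([0.0, 1.0])
probs = np.array([0.6, 0.4])

mu = np.sum(values * probs)              # E[X] = 0.4
var = np.sum(values**2 * probs) - mu**2  # E[X^2] - mu^2 = 0.24
std = np.sqrt(var)                       # ~0.49

# For a *sample* from a population, divide by n-1 instead of n.
sample = np.array([0, 1, 0, 0, 1])
sample_var = np.var(sample, ddof=1)  # ddof=1 -> n-1 denominator

print(mu, var, std, sample_var)
```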

Bayes' Rule

Combining the definition of conditional probability with the product rule and the sum rule yields Bayes' rule (aka Bayes' theorem).

Formula: $P(H=h | Y=y) = \frac{P(Y=y | H=h) \, P(H=h)}{\sum_{h'} P(Y=y | H=h') \, P(H=h')}$
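
A minimal sketch of the formula, with a made-up prior and likelihood for a binary hypothesis (none of these numbers come from the notes):

```python
import numpy as np

def bayes_posterior(prior, likelihood):
    """P(H=h | Y=y) for one observed y: the posterior is proportional
    to likelihood * prior, normalized by the evidence
    sum over h' of P(Y=y | H=h') * P(H=h')."""
    unnormalized = likelihood * prior
    return unnormalized / unnormalized.sum()

# Hypothetical numbers: P(H) over two hypotheses, and P(Y=y | H=h)
# for the single observed y.
prior = np.array([0.8, 0.2])
likelihood = np.array([0.1, 0.9])

print(bayes_posterior(prior, likelihood))  # -> [0.308, 0.692]
```

The denominator is the evidence $P(Y=y)$, obtained via the sum rule.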