A refresher on the very basics of probability theory.
Random variable \(X\): A mapping between events (or outcomes) and real values that eases probability calculus.
Probability distribution \(P_X\): Function that gives the occurrence probability of each possible outcome. 1
Probability distributions are often expressed as:
- PMF (Probability Mass Function): Probability of each point in a discrete rv.
- PDF (Probability Density Function): Likelihood of each point in a continuous rv; probability is spread over an interval, so the density must be integrated to obtain probabilities. 2
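The PMF/PDF distinction above can be sketched with two toy functions. This is a minimal illustrative sketch (the `die_pmf` and `normal_pdf` names are my own, not from the text): the PMF returns actual probabilities that sum to 1, while the PDF returns a density whose values are not probabilities until integrated.

```python
import math

# PMF of a fair six-sided die: each outcome carries probability 1/6.
def die_pmf(x):
    return 1 / 6 if x in {1, 2, 3, 4, 5, 6} else 0.0

# PDF of a normal distribution: a density, not a probability --
# probabilities come from integrating it over an interval.
def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

print(sum(die_pmf(x) for x in range(1, 7)))  # PMF sums to 1
print(normal_pdf(0.0))  # density at the mode, can exceed 1 for small sigma
```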
Expectation: The expected value of a random variable \(X\) is the weighted average of its distribution (aka its center of mass).
- Discrete rv: \(E[X] = \sum p(x)\, x\)
- Continuous rv: \(E[X] = \int p(x)\, x \, dx\) 3
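The discrete formula is just a probability-weighted sum; a minimal sketch (the `expectation` helper and the die PMF are my own illustrative choices):

```python
# E[X] = sum over outcomes of p(x) * x, with the PMF as {outcome: probability}.
def expectation(pmf):
    return sum(p * x for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}
print(expectation(die))  # ≈ 3.5, the center of mass of a fair die
```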
Variance: Expectation of the squared deviation of \(X\): \(V(X) = E[(X - E[X])^2]\)
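The variance definition \(V(X) = E[(X - E[X])^2]\) translates directly into code; again a sketch with an illustrative `variance` helper:

```python
# V(X) = E[(X - E[X])^2]: expected squared deviation from the mean.
def variance(pmf):
    mu = sum(p * x for x, p in pmf.items())  # E[X]
    return sum(p * (x - mu) ** 2 for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}
print(variance(die))  # ≈ 35/12 ≈ 2.9167 for a fair die
```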
Independence: Events \(X\) and \(Y\) are independent when the occurrence of one does not affect the probability of the other: \(p(X \cap Y) = p(X) \cdot p(Y)\)
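The factorization criterion can be checked numerically on a toy joint distribution; here, two fair coin flips (an example of my own, not from the text):

```python
from itertools import product

# Joint distribution of two independent fair coin flips: uniform over 4 outcomes.
joint = {(a, b): 0.25 for a, b in product("HT", repeat=2)}

# Marginal distributions of each flip.
pA = {a: sum(p for (x, _), p in joint.items() if x == a) for a in "HT"}
pB = {b: sum(p for (_, y), p in joint.items() if y == b) for b in "HT"}

# Independence holds iff the joint factorizes into the product of marginals.
independent = all(abs(joint[(a, b)] - pA[a] * pB[b]) < 1e-12 for a, b in joint)
print(independent)  # True
```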
Bayes Theorem:
\[\begin{equation} p(Y | X) = \frac{p(X | Y) p(Y)}{p(X)} \end{equation}\]
Naming goes: Posterior: \(p(y \mid x)\). Likelihood: \(p(x \mid y)\). Prior: \(p(y)\). Evidence: \(p(x)\).
1 Check here for a more formal definition.
2 Notation goes: \(P(A)\) -> probability of event \(A\); \(p(x) = P(X = x)\) -> defines the PMF.
3 I purposely use loose notation to ease the reading; more strictly: \(E[X] = \sum_{x \in X} P(X = x) \cdot x\)
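Bayes' theorem can be exercised on a classic hypothetical diagnostic-test example (all the numbers below are illustrative assumptions, not from the text):

```python
# Hypothetical diagnostic test (illustrative numbers).
prior = 0.01            # p(disease): the prior
likelihood = 0.95       # p(positive | disease): the likelihood
false_positive = 0.05   # p(positive | no disease)

# Evidence p(positive) via the law of total probability.
evidence = likelihood * prior + false_positive * (1 - prior)

# Posterior p(disease | positive) = likelihood * prior / evidence.
posterior = likelihood * prior / evidence
print(posterior)  # well below the test's 0.95 accuracy, due to the low prior
```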