Technical Glossary
A home for definitions.
Statistics
Definition 1 Expectation
Definition 2 Moment Generating Function
Notes come from here
The \(k\)-th moment of a random variable \(X\) is the expected value of a power of that variable:
\[ E[X^k] = \int_{-\infty}^{\infty} x^k f_X(x)\, dx. \] When we use the moment generating function (MGF), we are trying to sidestep this integral by instead computing:
\[ M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx. \] Conceptually: we now have access to a function that is very easy to expand into a sum of terms. And when you expand it, then take the appropriate derivative of that expansion, it gives you the wanted moment. Hence, the term “generating.”
This works because \(e^{tX}\) has a simple Maclaurin series expansion. The Maclaurin series is just the Taylor series centered at 0.
\[ \begin{aligned} e^{tX} &= 1 + \frac{X}{1!}t + \frac{X^2}{2!}t^2 + \cdots \\ E[e^{tX}] &= 1 + \frac{E[X]}{1!}t + \frac{E[X^2]}{2!}t^2 + \cdots \end{aligned} \]
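To make the expansion concrete, here is a small SymPy sketch (the symbol names are my own) checking that the coefficient of \(t^k\) in the Maclaurin expansion of \(e^{tX}\) is \(X^k/k!\):

```python
import sympy as sp

t, X = sp.symbols('t X')

# Maclaurin expansion of e^{tX} in t, truncated before the t^4 term
series = sp.series(sp.exp(t * X), t, 0, 4).removeO()

# The coefficient of t^k should be X^k / k!; taking expectations
# termwise then turns each X^k into the moment E[X^k]
for k in range(4):
    coeff = series.coeff(t, k)
    assert sp.simplify(coeff - X**k / sp.factorial(k)) == 0

print(sp.expand(series))
```

Multiplying the \(k\)-th coefficient by \(k!\) recovers \(X^k\), which is exactly why differentiating \(k\) times and setting \(t = 0\) isolates the \(k\)-th moment.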
Looking for the first moment, one can take the first derivative of this expected value. Then, just set \(t\) to 0, which discards all the remaining higher-order terms. In general, for the \(k\)-th moment:
\[ E[X^k] = M^{(k)}_X(0) = \frac{d^k M_X(t)}{dt^k} \bigg\rvert_{t=0}. \]
MGFs are neat because it’s often easier to derive the MGF once (a single integration of an exponential, via LOTUS) and then compute moments by differentiation, rather than repeatedly using the integral definition of expectation for each moment.
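As a sketch of that workflow in SymPy, using the Exponential(\(\lambda\)) distribution as the example (symbol names and the negativity assumption on \(t\) are my own choices, made so the integral converges): derive the MGF once with the LOTUS integral, then read moments off by differentiation.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
lam = sp.symbols('lambda', positive=True)
# declare t < 0 so SymPy knows the integral converges (any t < lambda works)
t = sp.symbols('t', negative=True)

# LOTUS: M(t) = E[e^{tX}] = integral of e^{tx} * pdf(x) over the support
pdf = lam * sp.exp(-lam * x)  # Exponential(lambda) pdf
M = sp.simplify(sp.integrate(sp.exp(t * x) * pdf, (x, 0, sp.oo)))
# M simplifies to lambda / (lambda - t)

# Moments fall out by differentiating and evaluating at t = 0
m1 = sp.simplify(sp.diff(M, t).subs(t, 0))     # E[X]   = 1/lambda
m2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # E[X^2] = 2/lambda**2
print(M, m1, m2)
```

One integration yields \(M_X(t) = \lambda/(\lambda - t)\); every moment after that is a derivative, which is the whole appeal.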
Definition 3 Wick’s Theorem https://en.wikipedia.org/wiki/Isserlis%27_theorem Wick’s probability theorem (also known as Isserlis’s theorem) is a formula for computing higher-order moments of the multivariate normal distribution in terms of its covariance matrix.
Definition 4 Cauchy Product http://mathonline.wikidot.com/cauchy-product-of-power-series
Definition 5 Linear Discriminant Analysis
Citation
@online{aswani2000,
author = {Aswani, Nishant},
title = {Technical {Glossary}},
date = {2000-01-01},
url = {https://nishantaswani.com/articles/gloss.html},
langid = {en}
}