Notes come from here
The \(k\)-th moment of a random variable \(X\) is the expected value of a power of \(X\):
\[
\mathbb{E}[X^k] = \int x^k f_X(x)\, dx,
\]
where \(f_X\) is the pdf of \(X\). Computing this integral separately for every \(k\) can be tedious. The moment generating function (MGF) sidesteps this by packaging all the moments into a single expectation:
\[
M_X(t) = \mathbb{E}[e^{tX}] = \int e^{tx} f_X(x)\, dx.
\]
This works because we can expand the exponential function exactly using its Maclaurin series (the Taylor series centered at 0), and then take the expectation term by term by linearity:
\[
\begin{aligned}
e^{tX} &= 1 + \frac{X}{1!}t + \frac{X^2}{2!}t^2 + \cdots \\
\mathbb{E}[e^{tX}] &= 1 + \frac{\mathbb{E}[X]}{1!}t + \frac{\mathbb{E}[X^2]}{2!}t^2 + \cdots
\end{aligned}
\]
To get the first moment, differentiate this series once with respect to \(t\) and set \(t=0\): every term that still carries a factor of \(t\) vanishes, leaving only \(\mathbb{E}[X]\).
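Differentiating the series above term by term makes this concrete:
\[
\frac{d}{dt}\,\mathbb{E}[e^{tX}] = \mathbb{E}[X] + \mathbb{E}[X^2]\,t + \frac{\mathbb{E}[X^3]}{2!}t^2 + \cdots,
\]
so evaluating at \(t=0\) kills every term except the first, and we recover \(\mathbb{E}[X]\).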
Conceptually, then, we have a function that is easy to expand into a power series, and taking the appropriate derivative of that expansion at zero produces the desired moment. Hence the term “generating.” Symbolically:
\[
\mathbb{E}[X^k] = M^{(k)}_X(0) = \left.\frac{d^k M_X(t)}{dt^k}\right\rvert_{t=0}.
\]
MGFs are neat because it’s often easier to derive the MGF once (one integral of an exponential, via LOTUS, the law of the unconscious statistician) and then read off moments by differentiation, than to evaluate the integral definition of expectation separately for every moment.
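As a minimal sketch of this workflow, the following uses sympy to derive the MGF of an exponential distribution via LOTUS and then pull out the first two moments by differentiation. The choice of rate parameter (\(\lambda = 2\)) and the variable names are arbitrary illustration choices, not anything from the notes above.

```python
# Sketch: derive an MGF once via LOTUS, then get moments by differentiation.
# Example distribution: Exponential with rate lam = 2 (arbitrary choice),
# whose pdf is lam * e^{-lam * x} on x >= 0.
import sympy as sp

t, x = sp.symbols("t x")
lam = 2  # rate parameter, chosen for illustration

# Derive the MGF once via LOTUS: M(t) = E[e^{tX}] = integral of e^{tx} * pdf(x)
pdf = lam * sp.exp(-lam * x)
mgf = sp.integrate(sp.exp(t * x) * pdf, (x, 0, sp.oo), conds="none")
mgf = sp.simplify(mgf)  # closed form: lam / (lam - t), valid for t < lam

# Moments by differentiation: k-th derivative of M(t), evaluated at t = 0
first_moment = sp.diff(mgf, t, 1).subs(t, 0)   # E[X]   = 1/lam
second_moment = sp.diff(mgf, t, 2).subs(t, 0)  # E[X^2] = 2/lam^2

print(first_moment, second_moment)  # prints: 1/2 1/2
```

The single integration happens once; after that, each additional moment costs only a derivative, which is exactly the trade the notes describe.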