%%visits: 2
MGF := Given a random_variable $X$, its $k$-th moment is $\mathbb{E}[X^k]$, provided the expectation is well defined. The MGF is defined as $\phi_X(t) := \mathbb{E}[e^{tX}]$
## intuition
Used to derive moments, and an alternative way to characterize a distribution. The moment_generating_function is the Taylor expansion of $e^{tX}$ taken in expectation, $\phi_X(t) = \sum_{k=0}^{\infty} \frac{t^k\,\mathbb{E}[X^k]}{k!}$, so taking the $k$-th derivative at $t = 0$ recovers the $k$-th moment: $\phi_X^{(k)}(0) = \mathbb{E}[X^k]$
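A minimal sketch of the "derivatives give moments" idea, using a fair six-sided die as an assumed example distribution (the die, the probabilities, and the finite-difference step are all illustrative choices, not from the notes):

```python
import math

# Assumed example: a fair six-sided die. We check that the first
# derivative of the MGF at t = 0 matches the first moment E[X].
outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # uniform probability of each face

def mgf(t: float) -> float:
    """phi_X(t) = E[e^{tX}], computed directly from the pmf."""
    return sum(p * math.exp(t * x) for x in outcomes)

def moment(k: int) -> float:
    """E[X^k], computed directly from the pmf."""
    return sum(p * x**k for x in outcomes)

# First derivative of the MGF at 0 via a central difference.
h = 1e-5
mgf_deriv_at_0 = (mgf(h) - mgf(-h)) / (2 * h)

print(mgf_deriv_at_0)  # approximately 3.5
print(moment(1))       # exactly E[X] = 3.5
```

The same pattern with higher-order finite differences recovers higher moments, though numerically it becomes noisy; symbolically, each derivative pulls down one more factor of $X$ inside the expectation.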
Uniqueness: if two random_variables have the same moment_generating_function, then they have the same distribution
Properties:
- For $a, b \in \mathbb{R}$ and $Y = a + bX$: $\phi_Y(t) = e^{at}\phi_X(bt)$
- For $X, Y$ independent: $\phi_{X+Y}(t) := \mathbb{E}\left[ e^{t(X+Y)} \right] = \mathbb{E}\left[ e^{tX}e^{tY} \right] = \phi_X(t)\phi_Y(t)$, which extends to any number of independent random_variables
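The independence property can be checked numerically. A small sketch with two assumed example distributions (a Bernoulli and a uniform on $\{1, 2\}$; both are illustrative choices):

```python
import math
from itertools import product

# Assumed example distributions, given as {value: probability} pmfs.
X = {0: 0.3, 1: 0.7}   # Bernoulli(0.7)
Y = {1: 0.5, 2: 0.5}   # uniform on {1, 2}

def mgf(dist, t):
    """E[e^{tX}] from a pmf."""
    return sum(p * math.exp(t * x) for x, p in dist.items())

def mgf_of_sum(dx, dy, t):
    """E[e^{t(X+Y)}]; independence means the joint pmf factorizes."""
    return sum(px * py * math.exp(t * (x + y))
               for (x, px), (y, py) in product(dx.items(), dy.items()))

t = 0.7
lhs = mgf_of_sum(X, Y, t)
rhs = mgf(X, t) * mgf(Y, t)
print(abs(lhs - rhs))  # zero up to floating-point error
```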
The Binomial distribution is a sum of $n$ independent Bernoulli random_variables, which is expected due to the i.i.d. property of the MGF above: the Binomial MGF is the Bernoulli MGF raised to the $n$-th power, $\phi_{\mathrm{Bin}(n,p)}(t) = (1 - p + pe^t)^n$
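This Binomial/Bernoulli relationship can also be verified directly; a sketch with assumed parameters ($n = 10$, $p = 0.3$, $t = 0.5$ are arbitrary illustrative values):

```python
import math

n, p, t = 10, 0.3, 0.5  # assumed illustrative parameters

def binomial_mgf(n, p, t):
    """E[e^{tX}] for X ~ Binomial(n, p), summed directly over the pmf."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
               for k in range(n + 1))

# MGF of a single Bernoulli(p): 1 - p + p * e^t.
bernoulli_mgf = 1 - p + p * math.exp(t)

# A Binomial is a sum of n i.i.d. Bernoullis, so the two agree.
print(binomial_mgf(n, p, t))
print(bernoulli_mgf ** n)
```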
Exponential family of distributions:
- Bernoulli: a 1-parameter distribution
- Normal: the reason we can go from the standard normal to any normal via $\frac{X - \mu}{\sigma}$ is the affine property of the MGF above, and how the MGF is :todo:
## rigour
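A sketch of why standardization works, using the affine property. It assumes the standard fact that $X \sim \mathcal{N}(\mu, \sigma^2)$ has MGF $\phi_X(t) = e^{\mu t + \sigma^2 t^2 / 2}$. Writing $Z = \frac{X - \mu}{\sigma} = -\frac{\mu}{\sigma} + \frac{1}{\sigma} X$, i.e. $a = -\mu/\sigma$ and $b = 1/\sigma$:

```latex
\phi_Z(t)
  = e^{-\mu t/\sigma}\,\phi_X\!\left(\tfrac{t}{\sigma}\right)
  = e^{-\mu t/\sigma}\, e^{\mu t/\sigma + \sigma^2 (t/\sigma)^2 / 2}
  = e^{t^2/2}
```

This is the MGF of the standard normal, so by the uniqueness property $Z \sim \mathcal{N}(0, 1)$.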