THE REINSURANCE ACTUARY
  • Blog
  • Project Euler
  • Category Theory
  • Disclaimer

Uniqueness of Moment Generating Functions

18/10/2016

 

Most undergraduate probability textbooks make extensive use of the result that each random variable has a unique Moment Generating Function.  In particular, we can use this result to demonstrate the effect of adding or multiplying random variables.  For example, the proof that the sum of two Poisson Random Variables is also a Poisson Random Variable (with mean equal to the sum of the means of the two Poissons) is much easier if we can invoke this result.
​
Proof for the sum of two Poisson distributions:
Suppose $X$ and $Y$ are independent Poisson Distributions with parameters $\lambda_1$ and $\lambda_2$.
We know that the $MGF$ of a Poisson Distribution with parameter $\lambda$ is $e^{ \lambda ( e^t - 1) }$.
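As a quick sanity check (my own illustrative sketch, not part of the original argument), we can verify this closed form numerically by truncating the defining sum $E[e^{tX}] = \sum_k e^{tk} \, e^{-\lambda} \lambda^k / k!$; the parameter values below are arbitrary:

```python
import math

def poisson_mgf_series(lam, t, terms=100):
    # Truncation of E[e^{tX}] = sum_k e^{tk} * e^{-lam} * lam^k / k!
    # Terms are built iteratively via the ratio lam * e^t / k to avoid
    # computing huge powers and factorials directly.
    term = math.exp(-lam)   # the k = 0 term
    total = term
    for k in range(1, terms):
        term *= lam * math.exp(t) / k
        total += term
    return total

def poisson_mgf_closed(lam, t):
    # The closed form e^{lam * (e^t - 1)}
    return math.exp(lam * (math.exp(t) - 1))

print(abs(poisson_mgf_series(3.0, 0.5) - poisson_mgf_closed(3.0, 0.5)))  # ~0
```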

We can now use the result that the $MGF$ of a sum of independent random variables is the product of the individual $MGF$s:
$MGF_{X+Y} = MGF_X \cdot MGF_Y = e^{ \lambda_1 ( e^t - 1) } \cdot e^{ \lambda_2 ( e^t - 1) } = e^{ (\lambda_1 +\lambda_2) ( e^t - 1) }$

Which is the $MGF$ of a Poisson Distribution with parameter $\lambda_1 + \lambda_2$, proving the result.
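The algebra above can also be checked numerically (again a quick illustrative sketch with arbitrary parameter values, not part of the proof itself):

```python
import math

def poisson_mgf(lam, t):
    # Closed-form MGF of a Poisson(lam) random variable: e^{lam * (e^t - 1)}
    return math.exp(lam * (math.exp(t) - 1))

lam1, lam2, t = 2.0, 5.0, 0.3
lhs = poisson_mgf(lam1, t) * poisson_mgf(lam2, t)  # MGF_X * MGF_Y
rhs = poisson_mgf(lam1 + lam2, t)                  # MGF of Poisson(lam1 + lam2)
print(abs(lhs - rhs))  # ~0: the product matches the MGF of the sum
```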

Do we need a proof?

Call me pedantic, but I never liked the fact that this uniqueness result is usually stated without proof. For me, there was always an elegance to a textbook that began with definitions and axioms, perhaps took a handful of weak results as given, and then proved everything else along the way. In my experience, probability and statistics books seem particularly prone to not being self-contained and rigorous in this way. I suspect it has something to do with the fact that probability and statistics, if done 100% rigorously, are both extremely technical and difficult! It would be a shame if we had to wait until we had mastered complex analysis and measure theory before we could learn about predicting the number of black and white balls in an urn. (I'm joking; there are actually interesting and useful parts to probability once you get past those boring exercises about urns.)

My discomfort is compounded by the fact that there is something slightly disingenuous about using a very powerful result to prove a fairly trivial special case without really understanding the general result you are relying on. If you do this too often, you never really understand why anything is true, just that it is true.

Not only is this proof rarely given, I had to look pretty hard to find it written down anywhere.

The general proof for all random variables requires either measure theory or complex analysis and is quite involved, so I thought I'd just write up the result for discrete random variables.

So here is a proof of the uniqueness of $MGF$s for discrete random variables with support $\mathbb{N}_0$.
​
Uniqueness of MGFs

Suppose $X$ and $Y$ are discrete random variables over $\{ 0 , 1, 2, ... \} $.

Further suppose that:
$$MGF_X = MGF_Y$$

That is:
$$\forall t \in \mathbb{R} , \sum_{i=0}^{\infty} e^{ti} f_X (i) = \sum_{j=0}^{ \infty } e^{tj} f_Y (j)$$

Then,

$$\sum_{i=0}^{\infty} e^{ti} f_X (i) - \sum_{j=0}^{ \infty } e^{tj} f_Y (j) = 0$$

Changing the range for the second sum: $$\sum_{i=0}^{\infty} e^{ti} f_X (i) - \sum_{i=0}^{ \infty } e^{ti} f_Y (i) = 0 $$

Bringing the two sums together: $$\sum_{i=0}^{\infty} ( e^{ti} f_X (i) - e^{ti} f_Y (i) ) = 0 $$

Rearranging:

$$\sum_{i=0}^{\infty} e^{ti}(f_X (i) - f_Y (i) ) = 0 $$


We can now think of this as a power series in $s = e^t$ with coefficients:
$$g(i) = f_X (i) -  f_Y (i)$$
i.e.
$$\sum_{i=0}^{\infty} s^i g(i) = 0 $$
Since $t$ ranges over all of $\mathbb{R}$, $s = e^t$ ranges over the interval $(0, \infty)$. This allows us to use the result that a power series which is equal to $0$ on an interval must have all of its coefficients equal to $0$.

To see this, write the power series as $h(s) = \sum_{k=0}^{\infty} c_k s^k$; taking the $n$th derivative at $0$ recovers the $n$th coefficient:

$$h^{(n)}(0) = \sum_{k=n}^\infty \frac{k!}{(k-n)!} c_k \, 0^{k-n} = n! \, c_n$$

Since the series is identically $0$, all of its derivatives are $0$ as well, so $c_n = 0$ for every $n$.

Which gives us our result: $g(i) = 0$ for all $i$ means that the two probability mass functions $f_X$ and $f_Y$ are equal.
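To see the result in action, here is a small sketch of the contrapositive (my own example, with made-up pmfs): if two pmfs on $\{0, 1, 2\}$ differ, then $g(i) = f_X(i) - f_Y(i)$ has a nonzero coefficient, so the power series cannot vanish everywhere and the MGFs must differ for some $t$:

```python
import math

def mgf(pmf, t):
    # MGF of a discrete distribution supported on {0, 1, ..., len(pmf) - 1},
    # given as a list with pmf[i] = P(X = i)
    return sum(p * math.exp(t * i) for i, p in enumerate(pmf))

f_X = [0.2, 0.5, 0.3]   # hypothetical pmf
f_Y = [0.3, 0.3, 0.4]   # a different hypothetical pmf on the same support

# g(i) = f_X(i) - f_Y(i) is not identically zero, so the power series
# sum_i g(i) * (e^t)^i cannot vanish for every t: the MGFs differ.
print(mgf(f_X, 1.0) - mgf(f_Y, 1.0))  # nonzero
```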

Anders Berthelsen
28/11/2016 03:53:12 pm

Not sure I get the last step of your proof. How can you know that fX-fY is always positive? If it changes sign, then e^(tx) positive, does not necessarily imply that fX-fY = 0.

Arash Jamshidi
7/4/2019 03:35:35 pm

I have the same question as Anders Berthelsen. how can you know fX-fY is always positive? from my point of view another possibility is that some of fX-fY s are negative and some of them are positive and they add to 0. please explain further.
Thanks!

Arash Jamshidi
7/4/2019 03:36:58 pm

i meant "how can you know fX-fY is always zero" *

Arash Jamshidi
7/4/2019 04:36:31 pm

nevermind, i got it. all coefficients of polynomial function that equal zero, equals zero. for anyone wondering why, you can easily prove it by taking derivative n times (the last coefficients will equal zero now work backward and all coefficients are zero.)
again, thanks for your proof.





    Author

    ​​I work as an actuary and underwriter at a global reinsurer in London.

    I mainly write about Maths, Finance, and Technology.
    ​
    If you would like to get in touch, then feel free to send me an email at:

    ​LewisWalshActuary@gmail.com


