Posted by: Steven Sam | February 26, 2009

Combinatorial approach to e

I plan to continue my discussion of Boij–Söderberg theory from last time, but I’d like to make a quick detour first. This came up today in the office when I was talking with yanzhang.

Question: If I randomly pick numbers (independently, with a uniform distribution) from the unit interval [0,1], what is the expected number of numbers that I need to pick until their sum is at least 1?

The answer, for some strange reason, is the number e, and I'll illustrate this with two approaches. The two approaches produce different expressions for e, so this problem shows an equivalence between two definitions of e.
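If you'd like to convince yourself numerically before reading the proofs, here is a minimal Monte Carlo sketch in Python (the function name is my own, just for illustration):

import random

def picks_until_sum_reaches_one():
    """Count uniform [0,1] draws until the running sum is at least 1."""
    total, count = 0.0, 0
    while total < 1.0:
        total += random.random()
        count += 1
    return count

trials = 10**6
estimate = sum(picks_until_sum_reaches_one() for _ in range(trials)) / trials
print(estimate)  # about 2.718, i.e., e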

The first approach is to use calculus. For 0 \le r \le 1, let f(r) be the expected number of uniform random picks from [0,1] needed until their sum is at least r, and define f(r) = 0 if r is negative. Then we have the following integral equation

\displaystyle f(r) = 1 + \int_0^r f(r-t)\, dt = 1 + \int_0^r f(s)\, ds.

To see this, condition on the value t of the first pick: the pick itself contributes 1, and the expected number of further picks is f(r-t). Since we need to account for all such t, we integrate over [0,r] (I could integrate over all of [0,1], but remember that f(r-t) = 0 if t > r) and divide by the length of [0,1], which is 1. The second equality follows from the change of variables s = r-t. Taking derivatives of both sides, we get f'(r) = f(r) by the fundamental theorem of calculus, so f(r) = ce^r for some constant c. Setting r = 0 in the integral equation gives f(0) = 1 (the first pick is always counted), so c = 1. Our original question was for r = 1, in which case f(1) = e, as we promised.
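We can also watch the integral equation "in action": the following minimal sketch (plain Python; the grid size and iteration count are arbitrary choices of mine) runs the Picard iteration f_{m+1}(r) = 1 + \int_0^r f_m(s)\, ds on a grid over [0,1]. The iterates are exactly the Taylor polynomials of e^r, so f(1) converges to e up to the Riemann-sum error.

import math

N = 10000            # grid points on [0, 1]
h = 1.0 / N
f = [1.0] * (N + 1)  # start the iteration from f_0 = 1

for _ in range(30):  # f_m is the degree-m Taylor polynomial of e^r
    integral, new_f = 0.0, [1.0]
    for j in range(N):
        integral += f[j] * h          # left Riemann sum for \int_0^r f
        new_f.append(1.0 + integral)  # f_{m+1}(r) = 1 + \int_0^r f_m
    f = new_f

print(f[-1], math.e)  # f(1) is approximately e

Amusingly, the exact fixed point of this discretized iteration is f(j/N) = (1 + 1/N)^j, which already anticipates the second approach below.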

The second approach is the same, except we discretize everything. Let's ask an analogous question. First fix k. If I pick integers uniformly at random from \{0, 1, \dots, k\}, what is the expected number of picks I need until their sum is at least n, where 1 \le n \le k? Let g(n) be this expected value (it depends on k as well, but we suppress that from the notation). Conditioning on the first pick t as before, the expected number of further picks is g(n-t), which is 0 for t \ge n, so we have

\displaystyle g(n) = 1 + \frac{1}{k+1} \sum_{i=1}^n g(i)

by the same reasoning as before. Let \Delta be the first difference operator, i.e., \Delta f(x) = f(x) - f(x-1). Now apply \Delta to the above equation to get

\displaystyle g(n) - g(n-1) = \frac{1}{k+1} (g(n) - g(-1)).

Of course, g(-1) = 0, and rewriting this last equation gives

\displaystyle g(n) = \frac{k+1}{k} g(n-1)

for n > 0. Since g(0) = 1, this means that g(n) = \left( \frac{k+1}{k} \right)^n. Now set k = n: rescaling everything by 1/n, we see that g(n) is also the expected number of picks, uniform from the rationals in [0,1] with denominator n, until their sum is at least 1. Letting n go to infinity, we should recover the answer from the previous method, since these sets fill out a dense subset of [0,1].
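Both facts, the exact closed form for finite k and the limit, can be checked with a minimal sketch in Python using exact rational arithmetic (the function name is mine):

from fractions import Fraction

def g(n, k):
    """Solve g(m) = 1 + (1/(k+1)) * sum_{i=1}^m g(i) exactly, with g(0) = 1."""
    vals = [Fraction(1)]                      # vals[0] = g(0) = 1
    for m in range(1, n + 1):
        partial = sum(vals[1:], Fraction(0))  # sum_{i=1}^{m-1} g(i)
        # isolating g(m):  g(m) * k/(k+1) = 1 + partial/(k+1)
        vals.append((1 + partial / (k + 1)) * Fraction(k + 1, k))
    return vals[n]

for n in (1, 5, 10):
    assert g(n, n) == Fraction(n + 1, n) ** n  # matches the closed form

n = 10**6
print((1 + 1/n) ** n)  # about 2.71828..., approaching e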

So these two approaches show the equivalence of two definitions of e: one as the unique number such that \displaystyle \frac{d}{dx} e^x = e^x, and the other as the limit \displaystyle \lim_{n\to \infty} \left( \frac{n+1}{n} \right)^n.

Do you know of any other solutions to the original question?

-Steven


Responses

  1. Let X_j be i.i.d. uniform [0,1] random variables and S_n = X_1 + \cdots + X_n.

    For s < 1, assume inductively that P(S_n < s) = s^n / n! (true for n = 1). Then

    \displaystyle P(S_{n+1} < s) = P(S_n + X_{n+1} < s) = \int_0^1 P(S_n < s-t)\, dt = \int_0^s P(S_n < s-t)\, dt = \int_0^s \frac{(s-t)^n}{n!}\, dt = \frac{s^{n+1}}{(n+1)!}.

    Let N be the smallest n such that S_n \geq 1. Then P(N > n) = P(S_n < 1) = 1/n!, so \displaystyle E[N] = \sum_{n=0}^{\infty} P(N > n) = \sum_{n=0}^{\infty} \frac{1}{n!} = e.

  2. Nice article! BTW, isn’t the second inequality actually an equality?

  3. I propose the following proof: let (X_i) be a sequence of i.i.d. uniformly distributed random variables, and S_k = X_1 + \cdots + X_k. We need the expected value of the minimal k such that S_k \geq 1, and this is easily proved to equal the sum over all n \geq 0 of the probability P(n) that k is greater than n.

    For each n, P(n) is the probability that the sum of n numbers in [0,1] is less than 1. Then P(n) is the volume of the simplex 0 \leq x_i, \sum x_i \leq 1, which is easily proved to be 1/n! (by induction on n, for example).

    You thus obtain another equivalent definition of e, the one with the power series: e = \sum_{n \geq 0} 1/n!.

  4. Solution 3: Let p(k) be the probability that, after choosing k numbers, the sum is still less than 1. Then the probability that we pick exactly k numbers is p(k-1) - p(k), and the expected number of picks is

    \displaystyle \sum_{k \geq 1} k\,(p(k-1) - p(k)) = \sum_{k \geq 0} p(k).

    Now, p(k) is just the volume of the pyramid

    \{(x_1, \ldots, x_k) : 0 \leq x_1, \ldots, x_k,\ x_1 + x_2 + \cdots + x_k \leq 1\},

    which is 1/k!. So the expected number of picks is \sum_{k \geq 0} 1/k! = e. (See the numerical check after these comments.)

  5. This is a rather cool way of extracting e from the uniform [0,1] distribution. Of course, we can obtain pi by considering a related problem: what is the probability that the sum of the squares of two U[0,1] random variables is less than one? (Answer: pi/4)

    My question is thus: is it possible to do something similar to extract the Euler-Mascheroni constant from the uniform [0,1] distribution?
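The common ingredient in the solutions above, namely P(X_1 + \cdots + X_n < 1) = 1/n!, is also easy to check numerically. Here is a minimal Monte Carlo sketch (plain Python; the function name is mine):

import math
import random

def prob_sum_below_one(n, trials=10**5):
    """Estimate P(X_1 + ... + X_n < 1) for X_i i.i.d. uniform on [0, 1]."""
    hits = sum(sum(random.random() for _ in range(n)) < 1 for _ in range(trials))
    return hits / trials

for n in (1, 2, 3, 4):
    print(n, prob_sum_below_one(n), 1 / math.factorial(n))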

