Give A Dollar

A member of a programming group I manage recently sent me a problem that
I thought was interesting. He became interested in it after coming across
an academic article (see footnote 2) titled The Statistical Mechanics of
Money.

He reduced the problem described in the article to its simplest terms as
follows:

You have 45 people in a room. You give each of them $45. In each "round"
of the game, every person gives $1 (if they still have one) to a randomly
chosen person. After 5000 rounds, what does the distribution of the money
look like?
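
To make the rules concrete, here is a minimal sketch in C. The original
program (linked in footnote 1) was written in a C-like language, so this is
a stand-in, not the author's code. It assumes a person may pick themselves
as the recipient and that everyone with at least $1 gives in every round,
since the statement doesn't say otherwise.

/*
 * Minimal sketch of the "give a dollar" game described above.
 * Assumptions (not specified in the post): a giver may be chosen as
 * their own recipient, and everyone with at least $1 gives each round.
 * Constants match the post: 45 people, $45 each, 5000 rounds.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define PEOPLE        45
#define START_DOLLARS 45
#define ROUNDS        5000

/* qsort comparator: sort dollar amounts from highest to lowest. */
static int desc(const void *a, const void *b)
{
    return *(const int *)b - *(const int *)a;
}

int main(void)
{
    int money[PEOPLE];
    srand((unsigned)time(NULL));

    for (int i = 0; i < PEOPLE; i++)
        money[i] = START_DOLLARS;

    for (int round = 0; round < ROUNDS; round++) {
        for (int giver = 0; giver < PEOPLE; giver++) {
            if (money[giver] > 0) {               /* give only if a dollar is left */
                int receiver = rand() % PEOPLE;   /* uniformly random recipient */
                money[giver]--;
                money[receiver]++;
            }
        }
    }

    /* Sort from highest to lowest and print one amount per line. */
    qsort(money, PEOPLE, sizeof money[0], desc);
    for (int i = 0; i < PEOPLE; i++)
        printf("%d\n", money[i]);

    return 0;
}

Plotting the sorted output of a run like this should reproduce the general
shape of the curve described below.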

The results are quite surprising. I wrote a computer program (see
footnote 1) in a C-like language that simulates the situation. It outputs
the number of dollars each person ends up with. If you sort the output
from highest to lowest and plot the points, you get something like this:

[Plot: each person's final dollar total, sorted from highest to lowest]

The article says the expected distribution is exponential. In our example,
one person ended up with $202 and the next luckiest person got $161. At
the other end of the graph, about 15 people ended up with roughly $15 or
less.
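
For context, the distribution the article has in mind is (roughly) the
Boltzmann-Gibbs exponential form, with the average money per person
playing the role of temperature:

P(m) = C * exp(-m / T), where T = (total money) / (number of people) = $45 here

With T = $45, a total like $202 sits far out in the tail, but an occasional
outlier of that size is not implausible in a group of 45.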

This problem might make an interesting programming assignment. It is
deceptively simple to state, and a solution involves arrays, simulation,
and sorting.

1. Here is a link to the program I wrote.

give.a.dollar.try.2.tc

2. Here is a link to the academic article.

EPJB-17-723-2000.pdf