Where I work, the company provides a cafeteria where one may purchase lunch. Unfortunately, the lunch is either too expensive or not good enough, depending on your point of view, so we generally eat out. We're lucky that we can do that. Anyway, in general, we try to take turns driving so that we're all likely to drive about equally often. It doesn't always work out that way, but that's the aim.

If we all ate out every meal, it'd be simple; we'd all drive with equal probability. But what happens if, as has been the case occasionally throughout the years I've worked here, one of us can only eat out once per week? How often should that person drive, when they do eat out?

To make things simpler, let's assume that there are two of us who are daily diners (eating out five times per week), and one single-day diner. Four days out of the week, there are only two diners. If each drives half of the time, then both end up driving two days out of the four.

On the one remaining day of the week, when there are three diners, should each drive one-third of the time? Well, if we do things that way, then each of the two daily diners drives 2 1/3 days per week, on average, whereas the single-day diner drives just 1/3 day per week, on average. That's not fair: the two daily diners drive seven times as much as the single-day diner, even though they only eat out five times as often. The single-day diner should shoulder more of the driving burden on that one day.
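The arithmetic above is easy to check with exact fractions; this little sketch (nothing here beyond the numbers already stated) confirms the sevenfold ratio:

```python
from fractions import Fraction

# Naive scheme: everyone present splits the driving equally every day.
daily = 4 * Fraction(1, 2) + Fraction(1, 3)   # each daily diner: 2 1/3 days/week
single = Fraction(1, 3)                        # single-day diner: 1/3 day/week

print(daily / single)   # sevenfold, not the fivefold their meal counts warrant
```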

Let's denote by *p* the probability that the single-day diner drives on that day. Then each of the two daily diners drives on that day with probability (1 - *p*)/2, and over the course of the week, each drives (5 - *p*)/2 days, on average. According to our fairness metric, we must find *p* such that (5 - *p*)/2 = 5*p*, which yields

5 - *p* = 10*p*

11*p* = 5

*p* = 5/11

So the single-day diner should drive nearly half of the time, on those days when he or she joins the two daily diners. By a similar line of reasoning, if there are three daily diners, that probability drops to 5/16, and in general, with *n* daily diners, the probability is 5/(1 + 5*n*), with each of the daily diners driving five times as often, or 25/(1 + 5*n*) days per week, on average.
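The *p* = 5/11 solution can be verified the same way; the ratio of expected weekly driving days comes out to exactly five, matching the ratio of meals eaten:

```python
from fractions import Fraction

# Fair scheme: single-day diner drives with probability p = 5/11 on the shared
# day; each daily diner drives with probability (1 - p)/2 on that day.
p = Fraction(5, 11)
daily = 4 * Fraction(1, 2) + (1 - p) / 2   # expected days/week per daily diner
single = p                                  # expected days/week, single-day diner

print(daily, single)      # 25/11 and 5/11
print(daily / single)     # exactly 5, matching five times as many meals
```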

What happens if there are *m* one-day diners (each of them eating on the same day)? Then the probability *p* that any one of the one-day diners should drive on that one day drops even further, to 5/(*m* + 5*n*).
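The general formula can be sanity-checked across a range of values. This sketch (the function names are mine, not from the post) derives each daily diner's shared-day probability from the constraint that exactly one person drives, and asserts the fivefold ratio:

```python
from fractions import Fraction

def one_day_probability(n, m=1):
    """p = 5/(m + 5n): chance that a given one-day diner drives on the
    shared day, with n daily diners and m one-day diners (all on one day)."""
    return Fraction(5, m + 5 * n)

def expected_weekly_drives(n, m=1):
    p = one_day_probability(n, m)
    q = (1 - m * p) / n              # each daily diner's chance that day
    daily = 4 * Fraction(1, n) + q   # expected days/week per daily diner
    return daily, p

for n in (2, 3, 4):
    for m in (1, 2, 3):
        daily, single = expected_weekly_drives(n, m)
        assert daily == 5 * single   # daily diners drive five times as often
```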

One might well consider (providing one is still reading) extending these results to *k*-day diners, and asking whether the results depend on the *k*-day diners eating on the same *k* days, or are insensitive to the distribution of those *k* days.
