
Thursday, July 15, 2010

A Tale to Tell

People love to tell stories. It's something that I think is fundamentally built into the human psyche. Having others' attention and entertaining them with a good story is as strong a rush as there is. I've heard that the vast majority of criminals, when arrested, will simply confess because the urge to tell their story to a captive audience is just too strong.

This tendency manifests itself even when there is, quite literally, no story to tell. The clustering illusion denotes the human impulse to see significance in random patterns. Suppose a series of ten coin flips goes as follows: T, H, H, H, T, T, T, T, T, T. A lot of people (but hopefully not too many of my own readers) would see the coin as streaky, though how they would react to that perception might vary: Some might conclude that the coin was "due" for heads and bet that way, while others might conclude that it was on a "tails" streak and bet that way. (For what it's worth, I flipped a quarter ten times and that's exactly the way they came out.)
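How ordinary is a streak like that, really? A quick simulation settles it. Here is a minimal sketch in Python (the function names are my own, and "five in a row" is just an arbitrary cutoff) that estimates how often ten flips of a fair coin contain a run of at least five identical results:

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical results in a sequence of flips."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def streak_frequency(trials=100_000, n_flips=10, streak=5):
    """Estimate how often n_flips fair-coin flips contain a run of at least `streak`."""
    hits = sum(
        longest_run([random.random() < 0.5 for _ in range(n_flips)]) >= streak
        for _ in range(trials)
    )
    return hits / trials

print(streak_frequency())  # typically prints something near 0.22
```

It comes out to a bit better than one time in five, which is rather more often than most people's intuition suggests.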


This has major implications for how we watch and remember sporting events. Maybe the most obvious example of this is the so-called "hot hand" in basketball: the idea that a shooter is "in the zone," and more likely than normal to hit any given shot. Various studies have looked for and failed to find evidence for the hot hand. It's entirely possible that the hot hand is wholly illusory, that it's just the clustering illusion in play. However, as Carl Sagan was wont to say, absence of evidence is not evidence of absence. Except for free throws, in which shot selection and defense play no part, shooting accuracy is highly contextual. Some shots are wide open, while others are tightly contested. They are taken from all over the floor. Some are shot on the run, others on the step back, while still others are spot-up shots. What's more, players are intensely aware that they're hot, and as a result may shoot any hot hand they have in the foot (as it were). All these factors conspire to make the hot hand difficult indeed to discern. (For free throws, there is apparently a moderate hot hand; see this paper (or at least its abstract) by Jeremy Arkes.)

But a more basic example is in how we all remember and talk about the game afterward. We talk about the shooting struggles of such and such a player, and how (if our team won) he overcame that adversity and pushed through to get the win. We look back in our memory and find events that, although they seemed minor at the time, turned out to have momentous impact on the outcome of the game. Consider this account of Game 7 of the 2010 NBA Finals:
With 8:24 left in the third quarter, Celtics point guard Rajon Rondo picked up a loose rebound off Paul Pierce's miss from 19 feet, and pushed it back in to put the Celtics up 49-36. And through 28 minutes of play, Kobe Bryant had had an abysmally poor night on the offensive end. He had shot three of 17 from the field and one of three from the free throw line for seven points and a true shooting percentage of only 19 percent. Largely as a result of his terrible performance, the Lakers found themselves down by 13. To be sure, Bryant had eight rebounds (four of them on the offensive end), but that hardly made up for the rest of his play.

On the play, however, Pierce injured his shoulder and had to sit out for a spell. Bryant thought he saw something that he could exploit as a result, and went to work. On the very next play, he drove into the lane and drew a shooting foul on forward Rasheed Wallace. He only made one of his two free throws, but from then on his performance surged abruptly upward. Starting with that play and for the rest of the game, Bryant gathered seven more rebounds and shot three of seven from the field and 10 of 12 from the free throw line for 16 points and a true shooting percentage of 65 percent, leading his team to an 83-79 win for the title.
Sounds pretty interesting, doesn't it? Makes you wonder what it was that Kobe saw that he could take advantage of. I would wonder, too, except that I just now made it up. Everything else is true, but the sentence claiming that Bryant saw something he could exploit is conjured out of whole cloth. Actually, Kobe simply tossed his hands in frustration for a second before taking the inbounds pass and dribbling it upcourt. In trying this narrative out on a couple of folks, though, I found that it was compelling because once people see the remarkable contrast between Kobe's play before that moment and his play after it, they assume that something equally remarkable must have happened to precipitate it. We will latch onto any little thing as an explanation, even if in fact it had no more to do with the game than any other little thing. Right place, right time.
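Those true shooting percentages in the account, by the way, come from the standard formula: points divided by twice the sum of field goal attempts and 0.44 times free throw attempts. Here is a minimal check in Python (the helper name is my own):

```python
def true_shooting(points, fga, fta):
    """Standard true shooting percentage: points per two scoring attempts,
    counting a trip to the free throw line as 0.44 of an attempt."""
    return points / (2 * (fga + 0.44 * fta))

print(true_shooting(7, 17, 3))    # before the Pierce injury: about 0.19
print(true_shooting(16, 7, 12))   # after it: about 0.65
```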

As far as I can tell, though, nothing actually happened to Kobe in that game to spark the turnaround. Aside from a trio of truly horrible shots that he took with the shot clock running down, his shot selection was not noticeably worse while the Lakers were falling behind than it was during their comeback. Sometimes, you know, a cigar really is just a cigar.

Monday, January 11, 2010

Cutting Your Losses

I was standing at the vending machine at work today, buying some chips with lots of small coins (nickels and dimes). And as I often do, I carefully inserted the nickels first, then the dimes; if I had used any quarters, they'd have come last.

You may—assuming you've read this far—have wondered why this is. To be fair, having done this for a long time, I wondered myself for a moment. And then I remembered.

See, when I first started doing this, I was in college. I was living in the dorms. The dorms had vending machines, which were balky, much like anything in the dorms. They would, occasionally, find something objectionable about your change. They were even particular about the way you inserted your change; sometimes, it would take six or seven tries for you to get it to accept a specific dime. I would bring extra change just in case, if I had any, but sometimes even that would run out. So there I would be standing, with 45 cents that the machine was refusing to take, and more money back in the dorm room that I could try out on the Keeper of the Fizzies. But in order to get that money, I'd actually have to go back to the dorm room. Away from the vending machine.

I'd run downstairs, get the change, run back upstairs, and hope that in the meantime, no dormitory Grinch had decided to get a 30-cent discount on his Coke.

Because, as it happens, sometimes they would. I'd get back and there would be no credit at all in the vending machine. You might suppose that Whoever It Was would at least leave the credit they had benefited from sitting there in change, but noooooo.

That's when this business with inserting change in ascending order of value started. It was a way of cutting my losses. You might think that it would be simpler for me to just push the coin return and withdraw my change before heading downstairs, but in the first place, the coin return lever was balky, like everything else, and in the second place, it had often taken me lots of effort to get those coins in and I was reluctant to relinquish those hard-won gains.

Eventually, I managed to obtain a small dorm fridge and thereafter bought my drinks at the market. But this was before all that. Just the same, I have continued my coin-sorting practice to the present day, when (I daresay) my co-workers are far less likely to stiff me out of a handful of change than my dormmates were.

You know me, always looking for something mathy about the situation, so here's the question: Suppose that I only used n nickels and d dimes (no quarters), that I foolishly brought exact change, and that the vending machine refuses to take exactly one coin, randomly and uniformly selected from all the coins. On average, how much less money did I place at risk going nickels first than I did going dimes first?

The answer: The average reduction in risk was equal to the value of the nickels multiplied by the fraction of coins that were dimes.
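In symbols, that's 5nd/(n + d) cents. If you'd rather not push the algebra through, here is a minimal brute-force check in Python (the function names are my own), taking the money "at risk" to be whatever credit was already in the machine when the refused coin turned up:

```python
from fractions import Fraction

def expected_risk(coins):
    """Average credit already inserted when one uniformly chosen coin is refused."""
    already_in = sum(sum(coins[:i]) for i in range(len(coins)))
    return Fraction(already_in, len(coins))

def risk_reduction(n, d):
    """Expected savings, in cents, from going nickels-first instead of dimes-first."""
    nickels_first = [5] * n + [10] * d
    dimes_first = [10] * d + [5] * n
    return expected_risk(dimes_first) - expected_risk(nickels_first)

# Three nickels and six dimes: the formula says 5 * 3 * 6 / (3 + 6) = 10 cents.
print(risk_reduction(3, 6))  # prints 10
```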

I had thought to try to tie this story to something deeper, but I just can't bring myself to do it.

Wednesday, August 26, 2009

How Random is Random?

We all think that we know when something is random. But how random is random?

Part of the aim of mathematics is to unify concepts. It's what makes mathematics more than just a collection of ways to figure things out. As a side effect, though, mathematical definitions tend to be a bit counterintuitive. For example, I think we all know what the difference between a rectangle and a square is: A square has all four sides of equal length, and a rectangle doesn't.

Except that a mathematician says that squares are rectangles, because to a mathematician, it's inefficient and non-unifying to say that a rectangle is a four-sided figure with four right angles, except when all four sides have the same length. It makes more sense, from a mathematical perspective, to make squares a special case of rectangles.

So hopefully it won't come as too much of a surprise if I say that a completely deterministic process, such as flipping a coin that always comes up heads, is still considered a random process to mathematicians who study that sort of thing. So is a coin that comes up heads 90 percent of the time. Or 70 percent. Or—and maybe this is the surprise, now—50 percent. The cheat coin is simply a special case of a random process. To a mathematician, none of these processes is "more random" than the others. They just have different parameters.

What we think of as randomness, mathematicians call entropy. This is related to, but not the same thing as, the thermodynamic entropy that governs the direction of chemical reactions and is supposed to characterize the eventual fate of the universe. (Another post, another time, perhaps.) It turns out that this "information-theoretic" notion of entropy corresponds pretty well to what the rest of us call randomness. For those of you who are even the slightest bit curious, the definition of entropy for a flipped coin is

S = - ( p_H lg p_H + p_T lg p_T )

where p_H and p_T are the probabilities for heads and tails, respectively, and lg is logarithm to the base 2. For a 50-50 coin, the entropy is S = 1; for a completely deterministic coin (a two-headed one, for instance), the entropy is S = 0. For something in between—say, one that comes up heads 70 percent of the time—the entropy is something intermediate: in this case, S = 0.88 approximately.
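If you want to play with that formula yourself, here is a minimal sketch of it in Python (the function name is my own):

```python
from math import log2

def coin_entropy(p_heads):
    """Entropy, in bits, of a coin that lands heads with probability p_heads."""
    probs = (p_heads, 1 - p_heads)
    return sum(-p * log2(p) for p in probs if p > 0)  # convention: 0 * lg 0 = 0

print(coin_entropy(0.5))  # 1.0
print(coin_entropy(0.7))  # roughly 0.881
print(coin_entropy(1.0))  # 0.0 (the two-headed coin)
```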

So, all right, how entropic is a real coin? The answer is that it's probably less entropic—less random, that is—than you think it is, especially if you spin it. A paper by researchers from Stanford University and UC Santa Cruz (via Bruce Schneier, in turn via Coding the Wheel) has seven basic conclusions about coin flips:
  1. If the coin is tossed and caught, it has about a 51 percent chance of landing on the same face it was launched from. (If it starts out as heads, for instance, there's a 51 percent chance it will end as heads.)
  2. If the coin is spun, rather than tossed, it can have a much larger than 50 percent chance of ending with the heavier side down. Spun coins can exhibit huge bias (some spun coins will fall tails up 80 percent of the time).
  3. If the coin is tossed and allowed to clatter to the floor, this probably adds randomness.
  4. If the coin is tossed and allowed to clatter to the floor where it spins, as will sometimes happen, the above spinning bias probably comes into play.
  5. A coin will land on its edge around 1 in 6000 throws.
  6. The same initial coin-flipping conditions produce the same coin flip result. That is, there's a certain amount of determinism to the coin flip.
  7. A more robust coin toss (more revolutions) decreases the bias.
Somewhat along the same lines, Ian Stewart, who for a while wrote a column on recreational mathematics for Scientific American, mentioned in one of his columns a study by an amateur mathematician (and professional journalist) named Robert Matthews. Matthews had watched a program in which the producers had asked people to toss buttered toast into the air, in a test of Murphy's Law as it applies to buttered toast. Somewhat to their surprise, the toast landed buttered side up about as often as it landed buttered side down.

Matthews decided that was not quite kosher. People, he thought, don't usually toss buttered toast into the air; they accidentally slide it off the plate or table. That ought to be taken into account when analyzing Murphy's Law of Buttered Toast. And when he did take it into account, he found something rather unusual. A process that you might have thought was fairly entropic turned out to be almost wholly deterministic, given some not-so-unusual assumptions about how fast the toast slides off the table. Unless you flick the toast off the table with significant speed, it lands buttered side down almost all of the time. And it has nothing to do with the butter making that side heavier; it's that the rotation put on the toast as it creeps off the table is just enough to give it a half spin. Since the toast starts out buttered side up (one presumes), it ends up buttered side down. Stewart recommends that if you do see the toast beginning to slide off the table, and you can't catch it, you give it that fast flick, so that it can't make the half spin, and it lands buttered side up. You won't save the toast, unless you keep your floor fastidiously clean, but you might save yourself the mess of cleaning up the butter.

On the other hand, maybe there's another solution.