I was browsing the NFL website today, checking out the games for tomorrow. The Steelers are playing the Bengals, which, as a predominantly Steelers household, should be a good game for us. However, looking over the stats for the Steelers, I ran across an example of a pet peeve of mine.
There is a little table showing their standing in the AFC North, listing their wins, losses, and win %. The Steelers have won 4 and lost 1, but the table shows a win % of 0.800. I don’t think that’s really what they mean. A value of 0.8% would be a vanishingly small share of games to have won. What they really mean is 80%, but instead they’ve mixed up two different mathematical conventions.
The value 0.8 is the multiplier used to calculate how many games they’ve won out of the total number of games they’ve played, i.e. 5 games played * 0.8 = 4 games won. That is not the same thing as 0.8%, which is what the table actually cites. If they’d won 0.8% of 5 games, they would have won only 0.04 of a game; at 60 minutes per game, that works out to about 2 minutes and 24 seconds of game time. I know they’ve won more than that!
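To make the difference concrete, here’s a quick Python sketch using the numbers from that standings table (the variable names are mine, purely for illustration):

```python
games_played = 5
games_won = 4

# The ratio 0.800 is a multiplier between 0 and 1.
win_ratio = games_won / games_played      # 0.8

# Multiply by 100 to express it as a percentage.
win_percent = win_ratio * 100             # 80.0

# What "0.8%" would literally mean: 0.8 per hundred.
misread_games = (0.8 / 100) * games_played    # ~0.04 of a game

# 0.04 of a 60-minute game is about 2 minutes 24 seconds.
misread_minutes = misread_games * 60          # ~2.4 minutes
```

The two readings differ by a factor of 100, which is exactly the mix-up in the table.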
Pedantic? Yes. However, this pet peeve of mine crops up in more than just NFL statistics. Consistency and precision are key when it comes to defining concepts, especially mathematical concepts. Saying that the Steelers have a 0.8% win record is a highly inaccurate statement.
Why is it important?
Consider for a moment that a fast food vendor put out an advert for:

“Cheeseburgers: 0.99¢ each”
Is that really what they mean? Each cheeseburger costs less than a penny? If so, then surely people would walk up with a dollar bill, buy 100 cheeseburgers, and then set up a food cart right next door selling them for a dollar apiece. Legally the advertisement would have to stand; people would be purchasing the cheeseburgers for the advertised price of 0.99¢. What they really mean is either 99¢ OR $0.99, but you can’t mix the two together.
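A quick sanity check on the cheeseburger arithmetic, sketched in Python with the prices from the hypothetical advert:

```python
# Price as literally advertised: 0.99 cents per cheeseburger.
literal_price_cents = 0.99

# Price the vendor meant: 99 cents (i.e. $0.99) per cheeseburger.
intended_price_cents = 99

# A dollar is 100 cents, so at the literal price it buys
# 100 burgers with change to spare:
cost_of_100_burgers = 100 * literal_price_cents   # 99.0 cents, under a dollar

# The literal reading is roughly a hundred times cheaper
# than the intended one:
factor = intended_price_cents / literal_price_cents   # ~100
```

Same digits, two prices a factor of 100 apart, all because ¢ and $ got mixed in one figure.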
The NFL folks probably figured that 80 wasn’t an informative number for most people in that table. However, the bare ratio 0.800 probably felt confusing too, so instead they mixed their definitions to get the best of both worlds. While most people who read this will probably think “Oh, let it go, you pedantic git”, I think it is fundamental that we retain precision when describing quantities.
Percent means “out of 100”
Per – cent. Cent comes from a Latin root meaning 100; it’s where we get words like centurion, century, and centimeter. Per means “for each” or “for every”. Therefore percent means per hundred. Eighty percent means that per hundred games the Steelers have won eighty. If five games have been played, then an 80% win rate means 4 have been won so far: 80/100 == 4/5.
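That definition boils down to a one-line conversion. Here’s a sketch in Python (the helper name to_percent is my own):

```python
def to_percent(won: int, played: int) -> float:
    """Wins per hundred games: the win count scaled so 'played' becomes 100."""
    return won / played * 100

# 4 wins in 5 games is 80 wins per hundred games.
print(to_percent(4, 5))   # 80.0
```

The whole point of the unit is that the denominator is always 100; drop that and 0.800 stops being a percentage at all.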
Does it really matter?
When the Hubble telescope was first launched, a problem was discovered with its mirror: it had been ground to the wrong shape. It was off by far less than a millimeter, but it was a lack of correctness in the terms and measurements used that led to the $1.5 billion problem.
If I were to place a bet on the Steelers to win tomorrow, and the odds for the bet were calculated based upon the Steelers having a 0.8% chance to win (i.e. a very small chance to win indeed), then I could make a lot of money! The bookie would give me great odds on such a long shot, when in reality I know that the Steelers have a very good chance of crushing the Bengals. What if your bank calculated their odds with fuzzy math too? That wouldn’t work very well at all. 😉