Mark Twain is famously quoted as saying, “There are three types of lies: lies, damned lies, and statistics.”
Can we really trust statistics?
How many times have you seen a TV or Web advertisement promoting a new drug or nutrition supplement that promises to reduce your risk of, say, Bubonic Plague, by 50%?
“Hey, Wilma, where’s my credit card? Write down that ‘1-800’ number quick! That new memory pill is on the TV! Where did I put my pen? Wilma, when are you gonna bring me that beer I asked for? And why did you turn off the game for this stupid infomercial? Wilma? Wilma?”
There are all sorts of ways to analyze claims made by drugs, pills, vitamins, and gadgets whose makers trot out impressive-sounding statistics to make you reach for your wallet. I’ll mention a couple of them: How many people were tested, comparing use of the product in some people versus non-use of the product in others (the number of subjects studied is called “n”)? Were the groups of test subjects switched around, so the user group became non-users for a period of time and vice versa (this is called a “crossover study”)? There are many more ways to test statistical reliability that require math that, personally, bores me. So here’s the best way:
First, if a product promises to reduce your risk of “such and such-itis” by, say, 50%, try to find out what your own actual risk of “such and such-itis” is, and what the consequences to you would be if you fell victim to “such and such-itis”. Your doctor (that could be me) can help you with that. If your risk is one-in-ten-thousand, you should feel soooo much better to know that you can reduce your risk to one-in-twenty-thousand for five easy payments of $59.95!
On the other hand, if your risk is one-in-ten, I would pay attention to something that reduced risk by 50%… or even 30%. Unless the number studied to reach these statistics was twenty-seven. Or the disease was hang-nails.
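For readers who like to see the arithmetic spelled out, here is a minimal sketch of the point above: the same advertised “50% risk reduction” means wildly different things depending on your baseline risk. The numbers are the hypothetical examples from this post, not real clinical data, and the function names are my own labels for the standard epidemiology terms.

```python
# Sketch: relative risk reduction vs. what it actually does for YOU.
# Baseline risks below are the made-up examples from the text.

def absolute_risk_reduction(baseline_risk, relative_reduction):
    """The actual drop in your personal risk."""
    return baseline_risk * relative_reduction

def number_needed_to_treat(baseline_risk, relative_reduction):
    """How many people must buy the pill for ONE person to benefit."""
    return 1 / absolute_risk_reduction(baseline_risk, relative_reduction)

# Rare disease: baseline risk one-in-ten-thousand, "50% reduction"
print(number_needed_to_treat(1 / 10_000, 0.50))  # ~20,000 customers per benefit

# Common disease: baseline risk one-in-ten, same "50% reduction"
print(number_needed_to_treat(1 / 10, 0.50))  # ~20 customers per benefit
```

Same slogan, a thousand-fold difference in how much it matters: that is why the baseline risk, not the percentage in the ad, is the number to ask your doctor about.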
I’ll take my chances. Goin’ “all-in” on nothin’, thank-you.