Tuesday, December 7, 2010

Adventures in Daily Tracking

Want to read a news story that never happened?

Gallup, yesterday, under a headline claiming "U.S. Economic Confidence Down Last Week":
Americans became more dispirited about the state of the U.S. economy last week, with Gallup's Economic Confidence Index dropping to -30 for the seven days ending Dec. 5, the lowest weekly reading since early October.
I confess: I check Gallup's presidential approval tracking poll basically every day, and a lot closer to its release time (1 PM Eastern) than I'm proud of.  But really: it's foolish to read much into individual poll results.  Outliers are, as the statistics folks will tell you, expected: about one poll in twenty should land outside its margin of error, and with hundreds of polls out there on all sorts of things, we should expect quite a few very goofy ones every year.
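To put a rough number on that intuition, here's a quick back-of-the-envelope simulation -- a toy model with a simple random sample and a made-up 50/50 question, not Gallup's actual methodology.  It just counts how often a poll's result lands outside the standard 95% margin of error purely by chance.

```python
import math
import random

def share_outside_moe(true_support=0.50, sample_size=1000, num_polls=10_000):
    """Simulate many polls and count how often the result falls outside
    the textbook 95% margin of error, purely from sampling noise."""
    # Standard 95% margin of error for a proportion: 1.96 * sqrt(p*(1-p)/n)
    moe = 1.96 * math.sqrt(true_support * (1 - true_support) / sample_size)
    outside = 0
    for _ in range(num_polls):
        # One poll: sample_size independent respondents, each a coin flip
        # weighted by the true level of support.
        yes = sum(random.random() < true_support for _ in range(sample_size))
        if abs(yes / sample_size - true_support) > moe:
            outside += 1
    return outside / num_polls

print(f"Share of polls outside the margin of error: {share_outside_moe():.3f}")
# Prints a number close to 0.05 -- about one poll in twenty, as advertised.
```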

So what happened here?  Gallup runs a three-day tracking poll on several subjects, including two which they combine into an index of "economic confidence," so each reported number reflects three days' worth of interviews.  It's been running relatively well lately, in the upper portion of its one-year range.  But then...well, here's the sequence, beginning with the poll covering November 24-26: -22, -23, -20, -22, -25, -34, -36, -34, -27, -25.  In other words, economic confidence appears to have collapsed (a 16 point swing is pretty large on their scale), and then rapidly rebounded. 

What happened is pretty obvious.  Take enough polls and you'll get some odd ones, and the numbers that came back for three days last week were, almost certainly by chance, really pessimistic ones.  Once those calls washed out of the system, things returned to normal.  It was just a fluke: Gallup happened to reach people who are far more down on the economy than their demographically similar peers. 
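Here's a small sketch of what "washing out of the system" looks like mechanically.  The daily numbers below are invented for illustration (Gallup only publishes the rolled-up figures), but they show how a couple of unusually pessimistic days of interviewing drag down several consecutive three-day readings and then disappear just as quickly.

```python
# Hypothetical daily index values: steady around -22, with two unusually
# pessimistic days of interviewing in the middle (invented for illustration).
daily = [-22, -23, -21, -22, -24, -55, -50, -23, -22, -21]

def three_day_readings(values):
    """Average each consecutive three-day window, the way a three-day
    tracking poll rolls its interviews together."""
    return [round(sum(values[i:i + 3]) / 3) for i in range(len(values) - 2)]

print(three_day_readings(daily))
# [-22, -22, -22, -34, -43, -43, -32, -22]
# Two odd days of calls depress three or four consecutive reported readings,
# then roll out of the window and the series snaps back to where it was.
```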

(No, I can't prove it.  It's the most likely answer, but it's possible, of course, that something happened last week that suddenly plunged America into economic despair, vanished by the weekend, and somehow never showed up in the news.  Sure.)

One of the things I've always found hard to get people to believe is that polling really does "work" -- in the sense that pollsters can ask an alarmingly small number of Americans a question and successfully extrapolate to what all Americans would have said had they been asked the same question.  But the same logic that says such an operation usually works also says that sometimes, just from random chance, you'll get a strange number.   So the second caution I'd give everyone is to ignore (as much as you can, and if you're a political junkie like me that's not easy) individual results and instead pay attention to a good index of all the polling. 
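For the curious, the reason a small sample can stand in for the whole country comes down to the textbook margin-of-error formula, sketched below (a simplification that ignores weighting and design effects).  Notice that the size of the population never enters into it -- only the size of the sample.

```python
import math

def margin_of_error(sample_size, p=0.5, z=1.96):
    """Textbook 95% margin of error for a simple random sample.
    The population size doesn't appear anywhere in the formula."""
    return z * math.sqrt(p * (1 - p) / sample_size)

for n in (100, 500, 1000, 1500, 3000):
    print(f"n = {n:4d}: +/- {margin_of_error(n) * 100:.1f} points")
# n =  100: +/- 9.8 points
# n =  500: +/- 4.4 points
# n = 1000: +/- 3.1 points
# n = 1500: +/- 2.5 points
# n = 3000: +/- 1.8 points
```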

Oh, and the first caution?  Remember: all the pollsters can tell us is that their number represents how everyone would answer that particular question.  The relationship between "how people would answer that exact question" and "what people think" is in all cases open to interpretation, and there are plenty of cases in which one has very little to do with the other.
