Saturday, May 12, 2012

Things about political polling that occur to me



By Richard K. Barry

I'm no expert on political polling. I'm very interested in it, but there is a science to it that I don't know a lot about. Polling results are reported widely in the media. They are examined and discussed constantly. I pay attention.

In a common sense sort of way, there are a few things to know about polls. In no particular order, and certainly not meant to be comprehensive, here are a few random thoughts: 

  • Reputable polling companies have a vested interest in getting it right. There is nothing more important to a polling company than accurately predicting a race's outcome, nothing more embarrassing than getting it really wrong. 
  • Some polling companies are in some sense "aligned" with political parties. Sometimes this just means that the principals in the company have had prior affiliations. Sometimes it means that the company does a lot of paid work for one party or the other. Sometimes it's more unseemly. This doesn't necessarily mean that their results can't be trusted, only that if you start to notice that certain results are not consistent with what other polls are finding, you might ask yourself why that is. 
  • On the previous point, while it is important for polling companies to get it right, if they do happen to favour one party over another, there are all sorts of ways to intentionally get it wrong in the short term if that helps the candidate they favour. A good reason to "find" that a candidate is doing better than "expected" might be that it raises the candidate's profile and helps generate momentum for the campaign. 
  • As to how results in polls can be manipulated, exactly how a question is asked in a survey, and what kinds of questions are asked prior to the main question gauging support for the candidate, can be very important. For example, if you ask a person how important experience in running a business is to running the economy, then ask whether Romney or Obama would be better at managing the economy, you might get a skewed reply. If you ask how important it is that a president be able to empathize with the struggling middle class as a lead-in question, you might get a different answer. That is simplistic, but you get the point. 
  • If a polling company did want to skew data to help one side or the other, they would likely do that earlier in the race, well before election day. If they are reputable at all, they are going to want to get the final results right just like everyone else. In great part, future business might depend on it. 
  • Sample size is important in a poll. The more people who are surveyed, the more precisely the results reflect the broader population. A larger sample lowers what is called the margin of error (MoE), the statistical uncertainty attached to any reported number.
  • Precisely who was polled is key. If you are polling people who say they are very likely to vote, that is important. If they are registered voters, this is also important information. People can say whatever they like; if they don't vote, it doesn't matter. 
  • Something called "cross tabs" is important. I loved the Wiki definition of this: "Cross tabulation is the process of creating a contingency table from the multivariate frequency distribution of statistical variables." Okay. In essence, this refers to a process of mixing and matching discrete variables in a survey: for example, how are younger voters in the South likely to vote, or how are lower-income women who live in swing states likely to vote? One challenge is making sure you have a statistically significant number of respondents for this kind of analysis, but it is key information for campaign managers. 
  • It's always good to know when a survey was taken, as opposed to when the results were published. Sometimes results are driven by key events, like a bad jobs report or an important presidential announcement. Some of these events may be important over time. Some will fade in significance fairly quickly. 
  • The phenomenon of an "outlier" poll is interesting. As scientific as pollsters like to claim their work is, sometimes a poll comes out that is completely at odds with what everyone else is finding. It happens. These are called outliers. It's also called getting it wrong. It's always better if available polling results confirm each other. 
  • There is something called a tracking poll, which generally uses a smaller sample size. It's not so much about capturing a snapshot of how a given candidate is doing at any given moment with a high degree of statistical accuracy, but about identifying trends. If I speak to 500 different people every night for months, I can learn something about how things are trending, but no particular night's results may be very accurate. 
  • The whole question of polling methodology is really the technical side of things. How do pollsters decide who to call, and how do they generate a random sample? Do they use pre-organized panels of respondents? Some pollsters have staff who speak to live bodies. Some use automated systems that ask you to press "1" for a given reply. Some analysts say that certain groups can be underrepresented in a polling sample for very simple reasons. For example, more and more people are using cell phones as their primary phone. Younger people tend to do this. If the pollster is not calling cell numbers, are the results going to be skewed? Very likely. Some people are just harder to survey, for many different reasons. Then there is the point that fewer and fewer people want to answer surveys, and the ones who do may have a certain profile, i.e., perhaps more engaged. How does all this affect the results of a poll? Experts spend a lot of time discussing and worrying about these kinds of things. 
  • There is something called "push-polling," which isn't really polling at all, but campaigning by other means. If you get a call claiming to be a political survey, you might be asked a question that sounds something like, "If you knew that Candidate X thought it was okay to provide free heroin to pre-schoolers, would that make you more or less likely to vote for him?" The caller never said that this is what Candidate X is proposing, but the impact on the prospective voter is clear.
  • I should also say that polls can only reflect what people are actually thinking. Early in a contest when voters are still making up their minds, it's common to see a lot of volatility in polling results. As we get closer to election day, polls begin to settle because voters are starting to make up their minds. When we say it's early days in a race, that's because you can't capture what isn't there.
  • The bottom line is that numbers are always interesting, as long as you give some thought to what you are looking at. 
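The sample-size point above can be made concrete. A common rule of thumb for a simple random sample is that the 95% margin of error for a proportion is about 1.96 times the square root of p(1-p)/n, usually quoted at the worst case of p = 0.5. This is a simplified sketch; real pollsters also weight their samples and adjust for design effects, which this ignores:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from a
    simple random sample of size n (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Doubling the sample size does not halve the MoE; it shrinks
# only with the square root of n, which is why samples of about
# 1,000 (roughly +/- 3 points) are so common.
for n in (500, 1000, 2000):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")
```

Note the diminishing returns: going from 500 to 2,000 respondents quadruples the cost but only halves the margin of error.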
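The cross-tab idea is easy to sketch in code. Using entirely made-up respondents (the names and numbers below are hypothetical, not real polling data), a cross tab is just a count over combinations of variables:

```python
from collections import Counter

# Hypothetical respondents: (age_group, region, candidate).
# This is illustrative toy data, not real survey results.
respondents = [
    ("18-29", "South", "Obama"), ("18-29", "South", "Romney"),
    ("18-29", "South", "Obama"), ("65+", "South", "Romney"),
    ("65+", "South", "Romney"), ("18-29", "Midwest", "Obama"),
]

# The contingency table: a count for every combination of
# subgroup and candidate that appears in the sample.
table = Counter(respondents)

# Younger Southern voters in this toy sample:
print(table[("18-29", "South", "Obama")])   # 2
print(table[("18-29", "South", "Romney")])  # 1
```

This also shows the statistical-significance worry mentioned above: slice the sample finely enough and each cell holds only a handful of respondents, so the cell-level margins of error balloon.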
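The tracking-poll bullet describes what is essentially a rolling average: any single night's small sample is noisy, but averaging a few nights together reveals the trend. Here is a minimal sketch of that smoothing, with invented nightly numbers (pollsters' actual weighting schemes are more involved):

```python
from collections import deque

def rolling_average(nightly_results, window=3):
    """Smooth noisy nightly support shares by averaging each
    night with the previous nights in a sliding window."""
    recent = deque(maxlen=window)  # keeps only the last `window` nights
    averaged = []
    for share in nightly_results:
        recent.append(share)
        averaged.append(sum(recent) / len(recent))
    return averaged

# Hypothetical nightly support shares for one candidate: the raw
# numbers jump around, but the 3-night average moves more steadily.
nightly = [47, 51, 45, 49, 52, 50]
print(rolling_average(nightly))
```

The trade-off is built in: a wider window gives a smoother line but reacts more slowly to a genuine shift in opinion, which is exactly why tracking polls are better at trends than at snapshots.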
As I say, this is not meant to be comprehensive, only stuff you might think about when you see polling numbers reported in the press. It's just meant as a bit of a primer on things to be aware of. If I got anything wrong or incomplete, let me know. 

By the way, I was driven to do this because I was looking at the aggregated polling data at RealClearPolitics for Obama vs. Romney. You can find that here. Nothing that I said above is intended to reflect on any of the polls listed there. I don't know enough about it.

(Cross-posted at Lippmann's Ghost.)
