Wednesday 30 March 2011

Watching the "Horse Race": Polling During an Election

It's often called the "horse race": the leaders of the political parties are the horses, and we are the spectators. The increase in the number of public opinion polls being undertaken and reported on during an election campaign is supposed to show us how the leaders and the parties are doing with potential voters. However, do the polls really accomplish this? Are all polls equal? Is this really a useful thing for us as citizens to pay attention to during an election?

Leading pollsters recently engaged in a very public argument about whether or not public opinion polling really matters anymore for election campaigns - particularly in horse-race situations. Joan Bryden of the Canadian Press did an excellent series of articles (start here) examining the state of polling in federal elections. Dr. Andre Turcotte from Carleton University (and former pollster for the Reform Party) argues that methodological problems with public opinion polls have become so common that political parties now rely on more nuanced tools to gather "market intelligence": how people may vote, what messaging might influence them in the voting process, and how best to convey the information the parties (or their supporters) want citizens to have. In his view, this makes public opinion polling largely irrelevant. That opinion is shared by other well-respected pollsters, such as Frank Graves of EKOS. However, Ipsos Reid's John Wright and Darrell Bricker suggest that polling still provides valuable information for political parties and for citizens, as long as that information is reported accurately in the media.

Let's focus on who participates in polls as a starting point. Methods of gathering public opinion - including polling - have been criticized for as long as they have been used. For a poll to be scientifically valid, the people selected to take part must be representative of the country or area being surveyed, and they must be randomly selected. This means that each person in the target population (for example, Manitoba residents aged 18 and older) should have an equal chance - and only one chance - of being selected to participate. Ensuring that we have an accurate sample of the target population has become more and more difficult.
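
To make the idea of random selection concrete, here is a minimal sketch (in Python, with made-up numbers - it is not how any real polling firm builds its sample) of drawing a simple random sample from a hypothetical list of eligible adults:

    import random

    # Hypothetical sampling frame: every adult in the target population listed exactly once.
    # Real pollsters work from phone-number banks or panel lists, not a tidy list like this.
    frame = [f"resident_{i}" for i in range(900_000)]  # made-up population size

    # random.sample() draws without replacement, so each person has an equal chance
    # of selection and can be chosen at most once.
    respondents = random.sample(frame, k=1000)

    print(len(respondents), len(set(respondents)))  # 1000 unique respondents

The point of the sketch is simply that every name in the frame has the same probability of ending up in the sample; the practical problem described below is that no such complete, reachable frame exists anymore.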

For example, in 21st-century Canada, telephone-based polling has been challenged by a number of developments. First is the growth of cell phone use: more and more people have both a landline and a cell phone, or only a cell phone, which makes it harder to reach a representative cross-section by telephone. Second, response rates - the proportion of people contacted who agree to participate in a survey or poll - have declined, because fewer people will answer the phone to take part in a poll. Third, the exponential growth of online polling - which usually relies on pools of people who have volunteered to participate and are not necessarily representative of the entire population - means that we cannot use traditional methods of calculating whether those interviewed really represent the population.

This means that the people who participate in polls - who provide their opinions to pollsters - often differ from the population as a whole. We are more likely to see older people, particularly older women, participating in telephone polls. Just think of who you know who won't answer the phone for a survey researcher! And then think about who you know - usually younger people - who have no landline, only a cell phone. They are very unlikely to be part of a telephone poll. Online polls have the opposite tendency: pollsters note that younger, professional, and well-educated people are over-represented in most online panels.

Having said that, there are strategies that pollsters can use to ensure that the information they provide is solid. Less frequent polls - rather than daily tracking polls - can shed some very interesting light on what members of the public are thinking. Focusing on one specific area - such as a province - can also help. For those of us interested in Manitoba's involvement in the election, a recent poll done by Probe Research shows how 1,000 Manitobans think about federal (and provincial) politics. In national polls, Manitobans are often grouped with Saskatchewan residents because of the two provinces' relatively small populations, so a provincial poll is a helpful way to focus on Manitobans and their opinions.
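
To get a rough sense of what a sample of 1,000 can tell us: under the textbook assumption of a simple random sample, the maximum margin of error works out to about plus or minus 3.1 percentage points, 19 times out of 20. A quick back-of-the-envelope check (only the sample size comes from the Probe Research poll; the rest is the standard formula):

    import math

    n = 1000   # sample size
    p = 0.5    # worst-case proportion, which maximizes the margin of error
    z = 1.96   # critical value for 95% confidence ("19 times out of 20")

    margin = z * math.sqrt(p * (1 - p) / n)
    print(f"Maximum margin of error: +/- {margin * 100:.1f} percentage points")
    # Roughly +/- 3.1 points; subgroups (e.g., decided voters only) have wider margins.

Keep in mind that this calculation assumes perfect random sampling; the participation problems described above add further error that is much harder to quantify.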

In addition to the responsibilities of polling companies, members of the media have a tremendous responsibility to report the information from public opinion polls accurately and not to create a story where there isn't one. As consumers (and creators) of this information, we also have a responsibility to ensure that we understand what is being reported. I would also argue that if we want better information, we should make an effort to take part in telephone or online polls - particularly those related to social, political, and economic issues - when we're asked to do so.

For more information on methodological issues related to polling, read Alice Funke's piece at the Pundit's Guide and Matthew Mendelsohn and Jason Brent's piece, Understanding Polling Methodology.

Any comments? Do you answer the phone when you know that a survey research company is calling? Do you participate in online polls? Why or why not?

1 comment:

  1. Brian Topp offers an interesting (sarcastic) take on the ups-and-downs of national polls:
    http://www.theglobeandmail.com/news/politics/second-reading/brian-topp/jack-laytons-orange-wave-begins/article1967539/?utm_medium=Feeds%3A%20RSS%2FAtom&utm_source=Politics&utm_content=1967539
