Canada's Federal Election 2015: The Official Thread

Locutus

Adorable Deplorable
Jun 18, 2007
32,230
47
48
66
Manny Montenegrino @manny_ottawa

EKOS poll:

Quebec Liberals 29% (big drop)
NDP at 24% (big drop)
Conservatives at 23%

[Chart not reproduced: "Election From 2011 to 2015"]

mentalfloss

Prickly Curmudgeon Smiter
Jun 28, 2010
39,814
467
83
Educate yourself instead of just posting your twitter feed.

The poll that many of us were waiting for finally arrived yesterday, as La Presse published the latest numbers from Quebec-based pollster CROP. The poll would settle once and for all the question we've been asking for weeks: are the Conservatives really making inroads in Quebec?

The answer the poll provided was: maybe, but certainly not to the same extent as we've seen in other surveys. But that is a boring answer. The answer a lot of people saw in this instead was: "welp, so much for that idea."

I think that is a very simplistic way to look at the results of this poll. I'll get into that in more detail below, but first let's just take a look at the overall numbers.


CROP was last in the field between Dec. 10-15. They have not recorded any statistically significant shifts in support since then (at least, if these samples were probabilistic).

The Liberals dropped four points to 33% in Quebec, while the New Democrats were unchanged at 30%.

The Bloc Québécois was also steady, at 17%. The Conservatives picked up three points to hit 16%, and the Greens were unchanged at 4%.

The Liberals have been wobbling back and forth in CROP's polling for some time, as shown in the last six polls from the company: 38%, 34%, 37%, 32%, 37%, 33%. This recent drop would seem to fit into that oscillation.
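
As a quick sanity check, here is nothing more than descriptive arithmetic on those six CROP readings, a rough sketch for readers who want to follow along:

# Mean and typical swing of the last six CROP readings for the Liberals.
crop_liberal = [38, 34, 37, 32, 37, 33]

mean = sum(crop_liberal) / len(crop_liberal)
spread = (sum((x - mean) ** 2 for x in crop_liberal) / len(crop_liberal)) ** 0.5

print(f"mean {mean:.1f}%, typical swing +/- {spread:.1f} points")
# Roughly 35% +/- 2.3 points, so a 33% reading sits inside the usual wobble.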

But what about the Conservatives?

The increase of three points is within the margin of error (or would be, for similarly sized probabilistic samples), so it could just be a statistical fluke. I think, however, that given the gains we have seen in other polls, it stands to reason that it isn't. In CROP's previous 10 polls, for instance, the Conservatives averaged just 13%.
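
To put a number on "within the margin of error," here is a minimal sketch of the usual calculation, assuming two independent probabilistic samples of roughly 884 decided voters each (the December sample size is an assumption made for illustration):

from math import sqrt

# 95% margin of error on the 16% reading and on the 13% -> 16% change,
# assuming two independent probabilistic samples of ~884 respondents each.
n_dec, n_now = 884, 884
p_dec, p_now = 0.13, 0.16

moe_level = 1.96 * sqrt(p_now * (1 - p_now) / n_now)
moe_change = 1.96 * sqrt(p_dec * (1 - p_dec) / n_dec + p_now * (1 - p_now) / n_now)

print(f"MoE on the 16% reading itself: +/- {moe_level * 100:.1f} points")
print(f"MoE on the 3-point change:     +/- {moe_change * 100:.1f} points")
# About +/- 2.4 and +/- 3.3 points respectively, so a 3-point move between
# two such polls is not, by itself, statistically significant.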

And the Tories made gains throughout the province. They were up among francophones and non-francophones, made a significant increase on the island of Montreal, and inched upwards in Quebec City and the regions of Quebec. Only in the 'couronne', where they fell by a single point, did the party take a step backwards. And 15% of respondents picked Stephen Harper as the preferred person to be prime minister, his best result in a CROP poll since at least June 2013.

Nevertheless, that 16% is well below the current aggregate of 21%, and the 23% to 26% we've seen in the last two polls by EKOS. That would seem to settle it, then, yes? Much ado about nothing in Quebec. A large sample poll from a Quebec-based company showing a marginal increase is a very strong argument against Conservative gains in the province.

But I don't think we should stop there. A lot of people have accepted this CROP poll as the be-all and end-all of polling in Quebec. I think that is a bit much. CROP is a good pollster, but it is hardly the final word.

Consider that CROP hasn't been tested all that much in recent elections. In the 2011 federal election, the final CROP poll exited the field on April 20, almost two weeks before Election Day. In the 2014 provincial election, CROP was out of the field on March 16, more than three weeks before Election Day. In recent contests, only in 2012 did CROP put out a late campaign poll. But it was one conducted over the telephone, which tells us little about the accuracy of their online panel.

Much is also made of the large sample size. In this poll, CROP gathered the opinions of 884 decided voters. A standard national poll would have fewer than 250 responses from the province.

But the last three Forum and EKOS polls that have shown the Conservative gains in Quebec sampled a total of 1,037 decided voters. So sample size is not the issue. The reliability of the sample is another question entirely, however. Whether it be some 900 panelists or some 1,000 Quebecers willing to take an automated telephone survey, if the sample itself is of poor quality it doesn't matter how many people are interviewed.
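
For a sense of scale, here is the back-of-envelope 95% margin of error at each of the sample sizes mentioned above, using the conventional worst-case assumption of a 50% proportion:

from math import sqrt

# Margin of error by sample size, worst-case p = 0.5 convention.
for n in (250, 884, 1037):
    moe = 1.96 * sqrt(0.5 * 0.5 / n)
    print(f"n = {n:4d}: +/- {moe * 100:.1f} points")
# n = 250 -> +/- 6.2, n = 884 -> +/- 3.3, n = 1037 -> +/- 3.0.
# Extra respondents shrink the random error, but they do nothing about bias:
# a skewed sample stays skewed no matter how large it gets.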

And this brings up another interesting question. Is there a methodological difference in the results? Since the beginning of 2015, the Conservatives have averaged 23% support in Quebec in six IVR polls. In four online polls, they have averaged just 17%. Could this be a sort of 'shy Tory' effect in Quebec, an issue Joël-Denis Bellavance mentioned last week on CTV's PowerPlay? Is there a reason that online panels would be influenced by that while IVR polls would not be? In 2012, the closest thing to that would have been a 'shy Charest' effect, and the online polls did post lower Liberal numbers than Forum's IVR surveys.

All of this is not to say that CROP's results should be discounted - far from it. But it should make you ponder whether one poll really has such a monopoly on the truth. Instead, each poll adds to what we know, and CROP gives us a very good piece of information. It tells us Forum and EKOS have probably been too high for the Tories. But CROP might still be too low. Methodological influences and house effects can be very important.

Take, for example, the last time CROP was in the field in December. At around the same time, six other polls had been conducted. Look at the differences between the consensus of those six polls (the average) and CROP's findings:

[Table not reproduced: the December consensus of the six polls compared with CROP's findings]

There was little real difference that couldn't be explained by normal sampling error for the New Democrats, Bloc Québécois, and Greens. CROP was high on the Liberals, however, and low on the Conservatives.

Does this mean the exact same thing could be happening with this latest poll? Not necessarily. And does it mean that CROP is wrong while the others are right? Again, no. What it does mean is that each pollster has what is called a 'house effect', a methodological bias caused by any number of sources: mode of contact, the people sampled, the questions asked, the weightings applied, etc.

This is the benefit of using an aggregate of polls. It can iron out these differences, and get us closer to what might be the truth. CROP's polls are very valuable for their large samples, regional breakdowns, and local knowledge. But that doesn't mean other polls are clueless - in fact, the non-Quebec-based pollsters did very well in Quebec in 2011.
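
To make the aggregation idea concrete, here is a toy sketch, emphatically not ThreeHundredEight's actual model and with placeholder numbers: average several polls, then read each firm's 'house effect' as its deviation from that consensus.

# Toy poll aggregation and house-effect estimate. The figures are placeholders.
polls = [
    {"firm": "A", "CPC": 0.23, "LPC": 0.30, "NDP": 0.27},
    {"firm": "B", "CPC": 0.21, "LPC": 0.32, "NDP": 0.28},
    {"firm": "C", "CPC": 0.16, "LPC": 0.35, "NDP": 0.30},
]
parties = ["CPC", "LPC", "NDP"]

# The aggregate: a simple average across polls for each party.
consensus = {p: sum(poll[p] for poll in polls) / len(polls) for p in parties}

# Each firm's house effect: how far its numbers sit from that consensus.
for poll in polls:
    diffs = ", ".join(f"{p} {100 * (poll[p] - consensus[p]):+.1f}" for p in parties)
    print(f"Firm {poll['firm']} vs consensus: {diffs}")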

The regional breakdown

Let's get back to the poll itself.

The New Democrats held on to the lead among francophones with 32%, narrowly ahead of the Liberals at 29%. The Bloc was at 20%, while the Conservatives were at 15%. Note, though, that the NDP is polling quite a bit lower than the 35% to 39% recorded by CROP in polls done between June and November of last year.

Among non-francophones, the Liberals had a 13-point drop to 49%, with the NDP steady at 19% and the Conservatives at 18%. The Bloc made a big jump of 10 points to 10%. That suggests we may be looking at a statistical anomaly here, since it seems unlikely that about 1 in 6 non-francophones surveyed in December decided to decamp from the Liberals and head to the Bloc.

The Liberals led in and around Montreal, with 34% on the island and 44% around it. The New Democrats were not far behind in Montreal with 31%, though they dropped eight points to 24% in the 'couronne'. The Conservatives placed third on the island of Montreal with a jump of 11 points to 18%, but were in fourth behind the Bloc (20%) with 9% in the suburbs.

The Conservatives held the lead in Quebec City, though, with 38% support. That was virtually unchanged from December, and the Tories have not polled so highly here in two consecutive CROP surveys since the beginning of 2012. The NDP was at 33%, while the Liberals were down to 16% support.

In the rest of the province, the NDP was narrowly ahead with 31% as the Liberals dropped to 30%. The Bloc was steady at 20%, while the Conservatives were at 14%.

On who would make the best prime minister, Thomas Mulcair was in front with 25%. Justin Trudeau experienced a big tumble, dropping five points to 23%. As mentioned, Harper was up to 15%, a three-point increase.

If we ignored what other surveys have been saying, how would we look at this poll? For the most part, we'd consider it par for the course. These are the sorts of numbers CROP has been putting out for months. A close race between the NDP and Liberals, but with the NDP ahead among francophones. The Conservatives up slightly, which we'd consider just a wobble, but good numbers in Quebec City. The Bloc continuing to flounder.

Overall, we'd probably consider it a decent poll for all three federalist parties. The Liberals still lead, and look well-positioned for gains in and around Montreal (the drop they experienced on the island was due to those odd results among non-francophones, which are sure to be reset with the next poll). The NDP still leads among francophones, and so should retain the bulk of their seats. The Conservatives could make big gains in Quebec City.

But in light of other polls, we can look at this in two ways. The first is to consider that the Conservatives may not be making the gains other polls have suggested they are, and that their hopes need to be tempered. The second is to see in this the same trends that other polls have recorded, and that Quebec will indeed be a battleground for all three national parties. As usual, time will tell.


ThreeHundredEight.com
 

Locutus

Adorable Deplorable
Jun 18, 2007
32,230
47
48
66
Educate yourself instead of just posting your twitter feed.

you get awfully testy when someone posts anything contrary to your official declarations kid...anyway, the poll was/is valid as posted...and hey, don't hate on twitter...it's just another dissemination of information tool...kinda like you. :lol:

aww shucks, I'm just joshin' ya again boy...or am I?


naww, I am...don't fret. you're basically a harmless good seed. hey now. *manly bear hug*
 

taxslave

Hall of Fame Member
Nov 25, 2008
36,362
4,340
113
Vancouver Island
So based on the responses of 250 people they think they can predict the outcome of an entire province? I can do better than that with chicken bones.
 

gore0bsessed

Time Out
Oct 23, 2011
2,414
0
36
What's this horrible pile of **** excuse for an anti-terror bill that these conservative half-wits are trying to pass?
 

mentalfloss

Prickly Curmudgeon Smiter
Jun 28, 2010
39,814
467
83
What do those numbers even mean? A voter’s guide to political polling in this 2015 federal election year

OTTAWA — Public opinion surveys are everywhere.

Mainlined by news and politics junkies, polls are dismissed and disparaged even as they’re obsessed over and dissected. They are the wallpaper of our election cycles and, arguably, the thoroughfares that help guide public discourse.

As Canadians prepare to cast a ballot in a 2015 federal election, competing voter-preference polls will be peppering the airwaves, each claiming to be a representative snapshot of Canadian public opinion.

Here’s a look at how those competing surveys come to life.

Whose opinion is it, anyway?

There are two main avenues pollsters use to reach Canadians: the telephone and the Internet. Which route you take, and how you drive it, will influence who responds to your survey.

Live telephone calls, in which an interviewer walks respondents through a series of questions, remain the “gold standard” of polling, says Paul Adams, a former political reporter and pollster who now teaches journalism at Carleton University in Ottawa.

Live calls, however, are also time-consuming and expensive, making them increasingly rare — at least when it comes to the “horse race” numbers in the news.

The rise of caller ID, call screening and cellphones has also tarnished the old gold standard.

It used to be pollsters could expect about a 20 per cent response rate to live calls, says Adams, meaning 20 poll respondents for every 100 calls made. Surveys done for the federal government, which are posted publicly with complete methodology, show response rates for some large national telephone polls as low as eight per cent.

Another factor: who’s picking up the phone? A single mother with three kids is less likely to have time for a phone survey than a retiree. What about the unemployed versus an executive working 60-hour weeks?

Nonetheless, randomized live phone polls “produce strikingly accurate results — even when response rates for those surveys are as low as 10 or 20 per cent,” says Jon Krosnick, a U.S. expert on polling and director of Stanford University’s political psychology research group.

The other method of telephone survey is known as IVR — Interactive Voice Response.

A recorded, automated call is more likely to get a hang-up than a live caller, and once it does reach a respondent it can ask only so many questions before wearing out the person’s patience. But for a bare fraction of the price per call, pollsters using IVR can make tens of thousands of calls and build response samples significantly larger than live surveys.

Land lines or mobile?

Regardless of whether using live calls or IVR, if a polling company isn’t dialling cellphone numbers as well as land lines it is missing out on a fully representative sample. Increasing numbers of Canadians, especially younger people, only have mobile phones.

Pollsters use an algorithm to randomly dial both land lines and cellphones — “There are actually no telephone lists that are required,” says Nik Nanos of Nanos Research — although cellphone users, who pay for air time, are usually less responsive.

The proportion of cell respondents included in any telephone survey is up to the polling company; Nanos Research usually includes 25 per cent.

Casting a wider Net?

The other main route to survey respondents is the Internet.

Access to high-speed broadband is limited in many rural or remote regions of the country, and there remains an income, education and generational divide among the web savvy, says Adams, which suggests web-based polls have sample limitations.

There are also very different ways of getting respondents to complete an online survey: self-selected panels and random recruitment.

Most online pollsters create large pools or panels of potential respondents, using an “opt-in” method such as offering entry in a prize lottery to those who sign up.

Online news readers last week might have seen a banner ad asking, “Will you vote Trudeau next time?” The ad went on to say polling firm Angus Reid wants to know, and asked respondents to sign up for a survey.

Pollsters may pad their numbers by buying lists of email addresses — from retailers, for instance. From a panel that might include more than 100,000 potential respondents, the polling company then selects a much smaller subset of respondents for any given survey.

By opting in, panellists may provide pollsters with a detailed profile (such as age, gender, income, employment, locale, and even banking and shopping preferences) that can then be used to build samples for particular surveys. This is a great tool for doing targeted market research but is problematic when trying to survey public opinion at large.

“Most of the companies doing Internet surveys claim that they can do sleight of hand with statistics to fix the fact that they don’t actually have random samples. And they can’t,” says Krosnick. “There’s actually no way to do it.”

Because of the self-selected nature of the panel itself, surveys based on opt-in pools are not supposed to provide margins of error, according to the Marketing Research and Intelligence Association, which represents the industry in Canada.

Randomly recruited panels, by contrast, may use a variety of methods to get a proper probability sample of respondents to online surveys, including randomized IVR phone calls as the initial point of contact.

Truly randomized online surveys are considered comparable in quality to live telephone polls, says Krosnick. In fact, studies have shown that people answering questions online are more likely to pause, reflect and respond more accurately than they do on the telephone.

Raw data, now what?

All pollsters “weight” their sample data in an effort to make it reflective of the public at large. The simplest example is male-female balance. Canada’s population is 50.4 per cent female, according to census data. A national poll with more men than women would have to be weighted accordingly.
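
As a concrete version of that male-female example, here is a minimal weighting sketch; the raw respondent counts are invented, and only the 50.4 per cent figure comes from the paragraph above:

# Minimal gender-weighting sketch with made-up respondent counts.
target = {"female": 0.504, "male": 0.496}   # population shares
raw = {"female": 430, "male": 570}          # hypothetical 1,000-person sample

total = sum(raw.values())
weights = {g: target[g] / (raw[g] / total) for g in raw}

print(weights)
# Women are under-represented in this sample, so each female respondent
# counts for about 1.17 people and each male respondent for about 0.87.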

Pollsters routinely weight for age (younger people tend to be undersampled), gender and region, but the more variables in the weighting mix, the murkier the poll. Online surveys tend to over-represent younger, more urban, wealthier and more educated people, which also requires weighting.

“The target for any researcher is not to weight, or to do the minimal amount of weighting,” says Nanos.

An increasingly troublesome issue for political polls is relating broad public opinion to the much narrower electorate.

With anywhere from 40 to 60 per cent of eligible voters failing to cast a ballot in any given election, pollsters must attempt to divine whose party preferences actually matter on election day.
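
One generic way to picture that problem, offered as an illustration only and not as any firm's actual likely-voter model, is to weight each respondent's stated preference by an estimated probability of turning out:

# Generic likely-voter illustration; respondents and probabilities are made up.
respondents = [
    {"choice": "LPC", "p_vote": 0.9},
    {"choice": "CPC", "p_vote": 0.8},
    {"choice": "NDP", "p_vote": 0.4},
    {"choice": "LPC", "p_vote": 0.3},
]

totals = {}
for r in respondents:
    totals[r["choice"]] = totals.get(r["choice"], 0.0) + r["p_vote"]

likely_total = sum(totals.values())
shares = {party: round(t / likely_total, 3) for party, t in totals.items()}
print(shares)  # preference shares among likely voters, not all eligible adults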

Who pays the piper?

Most media polls these days, especially before the election campaign starts, are done for free by polling companies as a promotional or marketing exercise. The days of wealthy media companies paying for political polling are almost gone — which may, in part, explain the shift toward lower cost methodologies and less reliable results.


"A lot of it is like asking the dog to fetch the stick you're going to beat it with," pollster Frank Graves of Ekos Research says of offering free political surveys. THE CANADIAN PRESS/Cole Burston

What do those numbers even mean? A voter’s guide to political polling in this 2015 federal election year
 

mentalfloss

Prickly Curmudgeon Smiter
Jun 28, 2010
39,814
467
83
You're embarrassed about Trudeau's remarkable, if not legendary, stupidity, aren't you?

Clearly that is the sole reason you make a vain attempt to deflect the topic.


For someone who believes that the government should not intervene in economic affairs, you actually agree with him.
 

mentalfloss

Prickly Curmudgeon Smiter
Jun 28, 2010
39,814
467
83
It's logic.

If you believe the economy can take care of itself, then it doesn't matter what type of government is in power.