Support for the voice to parliament stands at about 41% and falling, according to Guardian Australia’s poll tracker. Yes supporters might hope the polls are wrong, but they would need to be off by about 10 percentage points or more for the yes vote to achieve the double majority necessary for the referendum to pass.
That would be an error almost three times larger than the one that occurred at the 2019 federal election. Such a polling error is possible, but it is very unlikely, and there is nothing to suggest it is happening this time, according to pollsters and election analysts who spoke to Guardian Australia.
So, how does polling work? We asked the experts to explain.
What went wrong in 2019?
Ahead of the 2019 election, some election watchers had warned that the different polls were unusually clustered, showing remarkably similar results.
“Every single poll that came out was saying exactly the same thing,” says election analyst William Bowe. So far, there aren’t similar concerns about the upcoming referendum.
“I can’t think of any farsighted poll watcher who’s come up with any theory this time [for] why the polls might be skewing towards no,” says Bowe.
Polls around the world actually tend to overestimate support for referendums. This was also the case in the same-sex marriage plebiscite – polls overestimated the yes vote, and the average error was about 7 points.
This may be because polling for a referendum is more complex than for normal elections. Among other things, voters might feel less informed about what they are voting on. And there is still a large undecided population that may swing behind yes.
How does modern political polling work?
Most political polls work by trying to get 1,000-1,500 responses from a representative sample of the population. Pollsters try to match their sample to Australia’s voting population on multiple demographic factors, such as age, sex, income and location. Since the 2019 polling failure, many pollsters have also tried to match on education level.
“You can get, on average, quite close to the properties of a whole population by taking a sample of 1,000 or so,” says election analyst Dr Kevin Bonham. “People can do this experiment for themselves if they like by tossing coins and keeping track of the cumulative total and noticing how the [percentages of heads] converges on 50%.
“All else being equal, and assuming that there is a perfect sample, 1,000 respondents is most of the time going to give you the right answer within a few percent.”
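Bonham’s experiment is easy to try in a few lines of code. The sketch below is only illustrative: it uses Python’s built-in random module with a fixed seed so the output is reproducible, and the last line applies the standard 1/√n rule of thumb for the margin of error of a perfect random sample on a roughly 50/50 question, a textbook approximation rather than anything the pollsters quoted here cite.

```python
import random

# Simulate the coin-toss experiment Bonham describes: keep a running
# share of heads and watch it drift towards 50% as the tosses pile up.
random.seed(1)

heads = 0
for n in range(1, 10_001):
    heads += random.random() < 0.5  # one fair coin toss
    if n in (10, 100, 1_000, 10_000):
        print(f"{n:>6} tosses: {heads / n:.1%} heads")

# The same maths underpins the "within a few percent" claim: for a perfect
# random sample on a roughly 50/50 question, the 95% margin of error is
# about 1 / sqrt(n), or roughly plus or minus 3 points for n = 1,000.
n = 1_000
print(f"Approximate margin of error for n = {n:,}: ±{1 / n ** 0.5:.1%}")
```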
Are landline phones a problem?
Pollsters used to contact people by telephone, but all the public polls for the voice have been conducted entirely online. Polling companies maintain large “panels” with tens or hundreds of thousands of members who either signed themselves up, often through loyalty programs, or were invited to join.
“It’s much, much easier with online panel polling to know enough about your respondent to know who you want to invite for a given survey,” says Bonham. “In the old days of random phone polling, you’d be ringing and saying things like: ‘Can I speak to the youngest person in the house or the person who had the last birthday?’”
Armed with demographic information, pollsters will send out emails containing survey questions over a few days. They can exclude people who have recently answered surveys and keep sending out batches until they have filled their quota of younger voters, Western Australians and so on.
If there aren’t enough respondents from a certain group, the pollster can “upweight” the responses of those they do have. It’s in these details that many of the differences between the pollsters arise.
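As a rough illustration of upweighting, the sketch below uses a single made-up age breakdown. The shares and support figures are hypothetical, and real pollsters weight on several variables at once, often through an iterative process known as raking; this only shows the basic idea that an under-represented group’s answers count for more.

```python
# Simplified, hypothetical illustration of "upweighting". The age breakdown,
# sample shares and support figures below are invented for the example;
# real pollsters weight on several variables at once.

population_share = {"18-34": 0.28, "35-54": 0.34, "55+": 0.38}  # target mix
sample_share = {"18-34": 0.18, "35-54": 0.34, "55+": 0.48}      # achieved mix

# An under-represented group gets a weight above 1, so each of its
# respondents counts for more than one person.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

yes_by_group = {"18-34": 0.55, "35-54": 0.42, "55+": 0.33}  # hypothetical

unweighted = sum(sample_share[g] * yes_by_group[g] for g in yes_by_group)
weighted = sum(sample_share[g] * weights[g] * yes_by_group[g] for g in yes_by_group)

print(f"Unweighted yes estimate: {unweighted:.1%}")  # about 40%
print(f"Weighted yes estimate:   {weighted:.1%}")    # about 42%
```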
How much variation is there?
“When we run a survey we make a series of assumptions because we’re trying to get an estimate of public opinion,” says Dr Shaun Ratcliff, who has worked with pollster Redbridge on voice polls.
“So you’re working out, you know, what is the best method I can use to get the most representative sample of the electorate I can get. And then what’s the best question, or set of questions, I can ask them to elicit a response that will most closely match the decision the voter makes when they’re in the booth.
“And there’s no perfect way to do that.”
Before the official wording of the referendum question was released in March, Bowe says there was a lot of variation in the questions that pollsters were asking. But most pollsters have switched to using the exact referendum wording.
Some differences remain, but there has been a lot of convergence: since mid-July, every poll in Guardian Australia’s tracker has shown the no vote in the lead. There is about an eight percentage point difference between the highest and lowest polls released over the past week, but Ratcliff says these kinds of differences are to be expected.
“If you had five surveys run in the same period, with the same question, same quotas, the same weights, you’d get a couple of percent variation,” he says.
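That kind of noise is easy to reproduce. The sketch below simulates five polls of 1,000 respondents each, drawn from the same population; the 56% “true” no vote is an arbitrary figure chosen for illustration, not a claim about the actual race.

```python
import random

# Five identical polls of the same population still disagree by a couple of
# points purely through sampling chance. The 56% "true" no vote is an
# arbitrary figure for illustration only.
random.seed(7)
true_no = 0.56

for poll in range(1, 6):
    sample = [random.random() < true_no for _ in range(1_000)]
    print(f"Poll {poll}: no = {sum(sample) / len(sample):.1%}")
```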