Elections 2020

What the USC Daybreak poll got right and wrong about 2020 election predictions

David Brancaccio and Daniel Shin | Nov 19, 2020
Social circle polling can be "like having a little window into the people that we may or may not have access to," says Jill Darling, survey director of the USC Daybreak poll. Sean Rayford/Getty Images

Pollsters with the University of Southern California’s Dornsife Daybreak presidential poll released results Thursday from their first post-election voter poll. And while the votes cast in the 2020 elections are still being counted and audited, these initial responses are helping pinpoint what pre-election polling predictions got right and wrong.

“Marketplace Morning Report” host David Brancaccio spoke with Jill Darling, survey director of the USC Daybreak poll, about what their post-election polling data suggests about polling methodologies, why the economy, and not the COVID-19 pandemic, was a top voter issue among some respondents and why one experimental subsection of the Daybreak poll delivered a much more accurate prediction of the 2020 presidential election than other national polls.

Below is an edited transcript of their conversation.

David Brancaccio: So, many polls, ahead of the actual election, may have led some people astray. But that has not stopped your team. You’ve done a poll after the election, so I guess you’re staying in this line of work?

Jill Darling: Yep. Survey research is what we do. We keep going and we find out what happened.

Brancaccio: Well, when people were asked, “Who do you plan to vote for?” there was a tendency for the resulting data to understate support for President [Donald] Trump, both in 2016 and again here in 2020. But there was an interesting flag about the final election outcome. It has to do with a question you were asking about social circles. Who’s that? Your pals, friends and relatives?

The “social circle” question: Asking people who they think their friends and family members are voting for

Darling: This is something that’s being studied by our USC colleague, Wändi Bruine de Bruin, and her colleagues from the Santa Fe Institute and MIT. And it looks like this is going to be very close to the actual outcome of the election — closer than our probability question that we have also been asking over time, and also closer than the other national polls. I mean, we don’t have the outcomes yet. Votes are still being counted. So, you know, when the smoke clears, I think it’s got to be a little better than everybody thought. But nonetheless, it’s really intriguing that the social circle voting question performed very well this year, as it has in previous years when they have tested it, both here in the U.S. and in elections in other countries.

So the social circles question asks voters to tell us what proportion of their social circle of family and friends is going to vote for each candidate. And there are a couple of reasons why this method may be more accurate, and why it may have proved to be robust even in this very different election, which is taking place, of course, during a pandemic, in an extremely polarized electorate. I mean, it might help us indirectly reach people who are otherwise difficult to get to talk to us in a poll. They may not want to answer polling questions, they may not trust universities or the media or they may just be hard to reach as a group. Some people may not want to admit they’re voting for a very polarizing candidate, particularly if they’re in a situation where their family and friends aren’t voting for that candidate, or they have to admit it to somebody on the telephone who they feel might not approve of it. And it may be helping us to include votes from the 10% of voters who told us they decided how they’d vote on Election Day, and people that we may have missed in our polling — not missed, but who hadn’t yet made up their minds — in the week before the election as well. So these late-deciding voters might be influenced by the decisions made by their family and friends.
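To make the mechanics concrete, here is a minimal sketch of how social circle responses could be rolled up into a vote-share estimate. The field names, the numbers and the weighting by reported circle size are illustrative assumptions for this article, not the USC team’s published estimator.

```python
# Illustrative sketch: aggregating "social circle" poll responses into a
# vote-share estimate. Weighting by reported circle size is an assumption
# made for illustration, not necessarily the USC team's actual method.
from dataclasses import dataclass

@dataclass
class Response:
    circle_size: int          # number of family and friends the respondent reports on
    share_candidate_a: float  # fraction of that circle voting for candidate A
    share_candidate_b: float  # fraction of that circle voting for candidate B

def social_circle_estimate(responses: list[Response]) -> tuple[float, float]:
    """Weight each respondent's reported shares by the size of their circle,
    so one interview effectively carries information about many voters."""
    total = sum(r.circle_size for r in responses)
    a = sum(r.circle_size * r.share_candidate_a for r in responses) / total
    b = sum(r.circle_size * r.share_candidate_b for r in responses) / total
    return a, b

# Three hypothetical respondents with circles of different sizes
sample = [
    Response(circle_size=10, share_candidate_a=0.6, share_candidate_b=0.4),
    Response(circle_size=5,  share_candidate_a=0.2, share_candidate_b=0.8),
    Response(circle_size=20, share_candidate_a=0.5, share_candidate_b=0.5),
]
print(social_circle_estimate(sample))  # (0.4857..., 0.5142...)
```

The point of the sketch is the one Darling makes: each respondent opens a window onto many voters, including people a pollster might never reach directly.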

Brancaccio: So it’s not easy to get hold of a person willing to respond to a pollster. So when you find one of these wonderful resources, someone willing to answer your questions, you ask about others around them. And one thing that might be happening is, they’re acting almost as a reporter, reporting back to the pollster what they’re seeing?

Darling: Yes, this is part of a larger, “wisdom of crowds” way of thinking about how to ask questions. And the team studying this received a National Science Foundation grant this year to look into it, because it has been so effective in previous elections. And, you know, coming up to the election, every poll in the country that was polling the national vote — I’m not talking about state polls — was pretty much showing the same thing, and we had the social circle question much closer. So we started thinking about, well, what could be going wrong in this very different election year? And so it’s really fascinating that this question may have proven to be more robust, even in this very different election year, than traditional ways of asking the question.

Brancaccio: I’m trying to imagine how I would have answered in my own social circles, because I would have said, “the people around me are very split.” That’s what I would have said, and that might have been a useful indicator.

Darling: Exactly. Yeah. And so you know, it’s kind of, like you said earlier, having a little window into the people that we may or may not have access to. So it expands our sample, if you can think of it that way. It’s not just one person, but it’s the size of their group. So it actually gives us a little bit larger view into how people may be voting. And, as I mentioned, I mean, there really may be some insight into how people react to what’s going on in their bubble.

The “shy voter” phenomenon

Brancaccio: But is there a chance that it’s a version of the shy voter phenomenon, where someone thinks that their choice is somehow not socially acceptable? They don’t quite dare reveal that to the pollster, so they say, “Well, I don’t mean me, but, like, the people around me are voting for that guy.” There’s not a little bit of that?

Darling: Yeah. It may be able to get at that as well: not only expanding this to people that we may not have information from, but also, you know, sort of letting people actually say who it is that they’re really thinking about voting for. So that’s also a possibility. As I say, the team that’s studying this asked a lot of questions trying to get at what the mechanisms are. They asked respondents whether the people they know would lie to pollsters, and whether they thought there were a lot of people who weren’t admitting to voting for their candidate. So they’re going to have quite a few ways of getting at this and really being able to give us some definitive answers. I can’t wait to see the outcome of their analysis.

A polarized electorate; no evidence that people are lying on polls

Brancaccio: It does seem really fascinating. Now, I understand you’re still piecing all of this together. But what other possible sources of error in the polls would you want to know more about? I mean, understanding that we can’t speak definitively quite yet.

Darling: Right. So you know, the things that we have been investigating, and our investigation is still ongoing, but the things that we were looking at are, did our probability questions just not work as well in this very different election year? Did we feel like we might just be missing Trump voters? Is there this group of hidden Trump voters that we’re talking about, or, you know, hidden voters of any kind? Were people lying? We have a lot of data and a lot of information that we’ve collected. And so we have the ability to kind of look and see if there was consistency in our post-election poll where we find out how people actually did vote.

What we found in looking at this is that our probability questions actually did a very good job of predicting the overall high likelihood of voting and also actual votes for the candidates. This has been the case in previous years as well. But we’re seeing something really different this year: our pre-post analysis is showing us how deeply polarized the voters have become. The way that we ask these questions is we ask people to give us a prediction from 0 to 100 of their likelihood of voting for each candidate. It’s a little bit different from the traditional way that people ask the voting question. And what we’re seeing is that in this election, which is very different from previous years, almost everyone was at exactly 0 or exactly 100. What’s missing are the middle-ground voters, the ones who are possibly going to vote for one candidate or another, still making up their minds. This year, that just did not happen. If you look at our tracking graph this year, you know, it moved by a point or two. Events that would have been huge in other years just didn’t move anything.
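As an illustration of the polarization Darling describes, the short sketch below, with made-up numbers, turns 0-to-100 answers into an expected vote share and counts how few “middle ground” responses remain when nearly everyone answers exactly 0 or exactly 100.

```python
# Illustrative sketch: reading 0-100 "likelihood of voting for the candidate"
# answers. The answers below are invented to mimic a polarized electorate.

def expected_share(answers: list[float]) -> float:
    """Average the 0-100 answers as probabilities to get an expected vote share."""
    return sum(a / 100 for a in answers) / len(answers)

def middle_ground_fraction(answers: list[float]) -> float:
    """Fraction of respondents who are neither certain no (0) nor certain yes (100)."""
    return sum(1 for a in answers if 0 < a < 100) / len(answers)

answers = [0, 100, 100, 0, 100, 0, 0, 100, 100, 50]
print(expected_share(answers))          # 0.55
print(middle_ground_fraction(answers))  # 0.1: only one middle-ground respondent
```

With almost no middle-ground answers, the tracking estimate barely moves even after big events, which matches the flat graph Darling mentions.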

So we also found no evidence that people are lying to us. We looked at how they answered questions about voting, we looked at their party affiliation, we looked at political ideology, we looked at how they voted in other elections. There’s just no significant inconsistency between what they’re telling us about voting in this election and all the other indicators that we have. So we’re discounting that. And I really think that, in general, people like to say that people lie to polls, and there are, I’m sure, a few who do. But we have never found that to be a significant issue.

Why some people who said they would vote did not

Brancaccio: But you actually have data from after the election about, “you said you were going to vote, did you actually vote,” and there are, you know, a number of people, a percentage, who didn’t vote in the end, and you asked them why. What did you find?

Darling: Right. We do have a small percentage of likely voters who did not vote due to voter suppression tactics, such as being provided with bad information, not having a usable ID, being harassed or intimidated, or not being able to get to a polling location. But this was a fairly small group overall. It’s quite a bit higher among some subgroups. But generally speaking, it did not add up to a large enough percentage of voters to change, or even really shift, the outcome of the election. So that’s another thing that we are able to discount. And then there are the people who made up their minds about whom to vote for on Election Day. They voted for Trump by four points. And the candidates tied among people who made up their minds within the weeks before the election. That is a large enough group that, if we had captured it in our pre-election estimates, it would have shifted things a little bit, though probably not enough to get us as close as we would have liked to be to the actual outcome.

Brancaccio: And some people who were going to vote and didn’t vote just didn’t have an explanation. They just said, “I don’t know.”

Darling: Yeah, yeah. I mean, you know, generally speaking, there’s always a group of people. We ask this every year; I added the voter suppression questions this year, but we ask every year why people didn’t vote. And people have a lot of reasons: They missed the deadline to get registered, they forgot what day it was, they got busy, they were prevented. We have a group of people who just chose not to vote. Mostly those are very low-propensity voters to start with, but for the people who intended to vote and didn’t, generally speaking, it’s because something happened. And that was the higher proportion this year as well.

Voters’ top issues: COVID-19, the economy and more

Brancaccio: Now, in the post-election survey that you’ve done, you did ask people, what was the big issue for you when you went to vote? I’m thinking given everything, maybe pandemic. What did you actually find?

Darling: Right. What we found was that overall, the highest percentage, about 25%, named the economy and jobs as their top issue in making up their minds whom to vote for, and that was followed by the integrity and honesty of their candidate and protecting democracy. There was, of course, a very wide partisan divide in this. And if you look at the top three, for Biden voters, handling of COVID-19 came up as the highest: 53% named that as one of their top three issues, followed by candidate integrity and uniting the country. And for Trump voters, it was overwhelmingly jobs and the economy: 75% named that as one of their top three, followed by protecting democracy and law enforcement.

Brancaccio: I didn’t hear the pandemic in the top three list for Trump supporters.

Darling: It wasn’t. No, for Trump supporters, the biggest issues are right out of the campaign’s playbook. The campaign very much emphasized that Trump would handle the economy better and what would happen to it under a Biden presidency. And I think you see that here, whereas, of course, Biden’s campaign focused on Trump’s handling of the pandemic.

Brancaccio: Your team got a reputation in 2016 for getting it less wrong than others in terms of Trump winning. How did you do this time, do you think, in the end?

Darling: Well, I would say that we did better this year than we did in 2016, because even though we did have the outcome for Trump at the time, we were measuring the popular vote. And, as you know, Hillary Clinton won the popular vote. Actually, immediately after that election, we realized we had a high proportion of rural voters and were able to correct that, and we were very happy with the way the poll performed in 2018. This year, once again, I can see that our methods, our somewhat experimental methods, are working very well. The question now is, how is it that we are not getting enough information from people who are on the right? I think that’s something that we, along with all the other pollsters who were looking at the election this year, are going to be looking into.
