Key economic indicators that help experts gauge the health of the U.S. economy often rely on survey responses from businesses. But when response rates falter, does the validity of that data begin to suffer? Since the pandemic, responses to the Job Openings and Labor Turnover Survey from the Bureau of Labor Statistics have fallen — and that’s making economists worried.
Reade Pickert, an economic reporter for Bloomberg, reported on these declining response rates. She spoke with Marketplace host Kai Ryssdal about the consequences of this downward trend and how the government is addressing the problem to maintain the validity of this economic data.
The following is an edited transcript of the conversation.
Kai Ryssdal: So let’s dig in. Before we get to the why-it-matters part — which is the really important part — I want you to talk to me about the JOLTS survey, the Job Openings and Labor Turnover Survey, which we talk about here on this program all the time. It’s been in the news a lot because of all the job openings that are out there. Response rates for that survey have dropped off the table.
Reade Pickert: They have. If you think back to around September of 2017, the response rate for this survey was around 64%. Fast forward five years, and the response rate has dropped to less than half of that — just under 31% — which obviously is an enormous decline. And when we think about what this data has done in recent months, it’s been particularly volatile. So some economists have started to wonder if at least some part of that volatility is influenced by that sharp drop in response rates.
Ryssdal: So just to put some numbers to this, the government sends out something like 20,000 surveys, and we used to get 60% of them back and now we get 30% of them back. Tell me why that matters.
Pickert: So the actual calculation is a little more complex, but overall, the picture of why this matters revolves around this idea of non-response bias. And that’s essentially a fancy way of saying that if the people not responding to the survey are systematically different from those that do, then you have a problem. So, if a local flower shop doesn’t send in its retail sales in any given month, it may be OK, but that would be really different if an enormous company like Amazon didn’t send in its data.
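The intuition behind non-response bias can be shown with a toy simulation — this is purely illustrative and not the BLS methodology; the firm counts and thresholds below are invented for the sketch. If big firms (which post many openings) systematically stop responding, the surveyed average drifts far from the truth, while a random 30% sample would not:

```python
import random

random.seed(0)

# Hypothetical population: 900 small firms with 0-5 job openings each,
# plus 100 large firms with 50-200 openings each. (Invented numbers.)
firms = [random.randint(0, 5) for _ in range(900)] + \
        [random.randint(50, 200) for _ in range(100)]

true_mean = sum(firms) / len(firms)

# Non-response that is random: a 30% sample, large firms included.
random_sample = random.sample(firms, 300)

# Non-response that is systematic: only small firms (< 50 openings)
# bother to respond. The estimate no longer resembles the truth.
biased_sample = [x for x in firms if x < 50][:300]

print("true mean:        ", round(true_mean, 1))
print("random-sample mean:", round(sum(random_sample) / len(random_sample), 1))
print("biased-sample mean:", round(sum(biased_sample) / len(biased_sample), 1))
```

The biased estimate can only ever reflect the small firms, so it badly understates total openings — which is why it matters *who* stops responding, not just how many.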
Ryssdal: To be clear — at least for now, based on your reporting and the experts you’ve talked to — the data is still valid.
Pickert: That’s correct.
Ryssdal: OK. This is a question I probably should have led with, but why is this happening?
Pickert: So, the one answer that I got to this question from everyone I asked is that there is no single reason for the decline. It ranges from eroding trust in government and institutions to caller ID and fewer landlines. But you also just have this general survey fatigue. When you look around, there’s a survey at the end of every customer service call and at the bottom of your receipt.
Ryssdal: And I always say no, so maybe I’m the problem.
Pickert: So exactly. And when you say no, in that case, it really doesn’t matter. But if you say no to the surveys that inform some of the most important statistics that we have to look at, it really does become a problem.
Ryssdal: OK, so let’s take the JOLTS survey that we started with, where response rates are plummeting. At some point, if current trends continue, there will be no companies responding to the JOLTS survey. Then what? What’s the government doing to make sure we don’t get there?
Pickert: So hopefully, we never get there. But the government is very clear-eyed about this being a big problem, and the main tactic it’s taking is this idea of blended statistics. Essentially, you ask people and businesses only the questions you can get from them through surveys, and then you use a range of other data collected from third-party sources — nonsurvey data — to blend into statistics that are accurate and, in some cases, more detailed than the survey version. And there’s already some degree of this taking place. I can circle this back around to the closely watched CPI, the Consumer Price Index. For used car prices, for instance, they use the J.D. Power Information Network. So there are a lot of different options here. But it’s almost overwhelming to think about how big a lift it is to do this across so many different indicators when they come out pretty much every day.