The road ahead: What about regulation for self-driving cars?
Oct 1, 2021

While the availability of autonomous vehicles has increased, regulation has struggled to keep up.

On Wednesday’s show we talked about Tesla’s full self-driving mode, which it is about to make available to more drivers. And yes, the name implies that the cars will drive themselves. But the technology isn’t there yet. A human will still have to be in control.

And that’s where we are right now with self-driving cars. They might help you drive, but they might also make a mistake that causes an accident if you’re not paying attention.

The technology is advancing, and yet there is no federal regulation of self-driving cars. And that’s created a patchwork of different rules across the country.

I talked about it with Jason Levine, executive director of the Center for Auto Safety. The following is an edited transcript of our conversation.

Jason Levine: Traditionally, we think of the states as where we regulate drivers and the federal government is where we regulate cars. So states have tried to sort of toe the line a little bit in terms of finding ways to help protect individuals and drivers and anyone on the road in their states, but at the same time allow for some innovation. Some states have required companies to obtain permits if they want to test vehicles or deploy them commercially on their roads. California is an example of that. But it’s an attempt to sort of thread that needle.

Segarra: What kind of regulation do you think would be effective? What would it look like?

Levine: It’s going to look, hopefully, similar to what we’ve done historically, which is to take a look at different features, different pieces of technology, and find ways to set performance standards: to spell out how well the vehicle must see using sensors, and in what environments it needs to be able to see differently, whether it be fog, or rain, or snow, or dark, or light. That’s one piece of a regulatory structure that is going to be needed when we talk about how to make sure that these vehicles are safe before we sell them and allow them to be commercially everywhere.

Jason Levine (Courtesy Center for Auto Safety)

Segarra: Why do you think there isn’t regulation at the federal level?

Levine: I mean, that’s a great question. There have been a few attempts in Congress to provide what we would consider at the Center for Auto Safety sort of an on-ramp to get vehicles on the road without necessarily undertaking the difficult work of writing performance standards. I mean, keep in mind, it’s not a simple task to go ahead and regulate this technology. But certainly there’s also been significant pushback from industry, particularly technology companies which have entered this space, who have not traditionally been subject to any sort of regulation. And then add into it, certainly, that the last administration was, to put it mildly, governed by a deregulatory philosophy. So it’s a combination of factors.

Segarra: There have been a number of high-profile accidents involving Tesla’s cars. What kind of movement are we seeing from federal agencies in response to those crashes?

Levine: There has been some recent movement. The National Highway Traffic Safety Administration recently opened a formal investigation into 11 separate crashes in which Teslas believed to be on Autopilot hit emergency vehicles that were stopped on the side of the road — ambulances, firetrucks, police cars. And in fact, between the time they announced the investigation and the time they formally sent a letter to Tesla requesting information, there was yet another crash involving one of these vehicles and a stopped emergency vehicle. So that’s certainly a step in the right direction. But that said, there hasn’t been much more than that, so we’ll see whether this is going to lead to some specific steps and measures to try and blunt this problem coming from Tesla, and how quickly that happens.

Segarra: Is the U.S. an outlier here? Like, have you seen other countries try to regulate autonomous cars?

Levine: We’re a bit of an outlier in that we seem to be making very little progress toward a regulatory structure and starting to write these rules. Europe has taken some steps. Asia has taken some steps, particularly Japan. I think, unfortunately, we have taken a bit of a step back in terms of moving forward the idea that we’re going to need regulations. In fact, at this point, even the auto industry has suggested that some level of regulation is needed. Now, there are going to be some questions around what the industry might want versus what stakeholders who are more concerned about the safety of drivers, passengers and pedestrians might want, but there’s a recognition that something is needed.

Segarra: How are tech and car companies testing the safety of their self-driving modes?

Levine: They’re using you and me and everyone in your neighborhood as part of their experiment, quite frankly; they’re just putting them out on the road. Now, they’re doing a lot of computer-simulation testing in certain circumstances, and some of them are also doing closed-track testing. But they’re also just putting vehicles out on public roads, public highways, neighborhood streets, across the country, and collecting data and seeing how it goes. That is obviously something that most people aren’t aware of, and no one really signed up for. So that’s obviously a concern of ours. Recently, the National Highway Traffic Safety Administration announced an order for all companies that have what are called Level 2 systems (think about a Tesla or a GM Super Cruise sort of vehicle) or higher to submit data on any crashes their test vehicles are involved in. That’s certainly a good step, to start collecting that data. That’s been going on for [more than] two months now, and the data has not been made public yet. We’re looking forward to seeing it made public. So that’s the beginning of the process of collecting enough data to make sure that these vehicles are not only cool, but actually safe.

Segarra: What do you think would be the ideal way to test these cars then?

Levine: There are frameworks that think about testing new technology in a graduated fashion. So yes, you would start in a lab, and then you’d move to a test track. And then, yes, perhaps eventually you would move to a public space. But one, you’d want to make sure you coordinate with local authorities to make sure that they’re aware that this is going on, and in between each of those steps, there would be some sort of third party certifying that you have met the goals for each individual step.

Related Links: More insight from Marielle Segarra

Levine said, by the way, one of the challenging things about creating this regulation is figuring out how self-driving cars will interact with our existing world — more than 200 million human drivers in the U.S. who have mostly regular cars — and also what kind of new infrastructure we’ll need. It’s hard enough to get money to fix what we have.

Some federal lawmakers have been pushing for regulation on self-driving cars for years, but it just hasn’t happened. In fact, we had Republican Sen. John Thune of South Dakota on the show in 2017 talking about that effort. Part of the problem is disagreement over what those laws should include and whether the proposals do enough to protect consumers.

Levine talked about the patchwork of state laws on self-driving cars. Here’s a link to a story from Lifewire that lays out the laws in each state.

We also talked about testing — what’s ideal, what’s ethical. And there’s an interesting question here about how much risk people will accept. This Wall Street Journal piece references a survey asking people how much safer an autonomous car would need to be than the average driver for them to be willing to ride in it. And people’s standards were much higher for self-driving cars. One reason might be that people think they’re safer than the average driver.

Another is something called “betrayal aversion” — when a product is supposed to make something safer, people feel betrayed if it actually causes harm.

And all of that is to say regulating self-driving cars is not going to be easy. Lawmakers will have to consider what level of risk the public will accept.

The team

Molly Wood Host
Michael Lipkin Senior Producer
Stephanie Hughes Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer