
The most dangerous stage in self-driving innovation

Molly Wood and Shaheen Ainpour May 2, 2018
A driver presents a Cruising Chauffeur, a hands free self-driving system designed for motorways during a media event by Continental to showcase new automotive technologies on June 20, 2017 in Hannover, Germany. 
Alexander Koerner/Getty Images

Self-driving cars will probably save a lot of lives in the future. But right now, the tech is new, and most of it requires human intervention. Experts describe automation in cars in levels, 1 through 5: a Level 5 car would have no steering wheel or gas pedal. Several cars on the market now fit into the middle of that range, offering some autonomous features but still requiring human intervention. Marketplace Tech host Molly Wood spoke with Missy Cummings, director of the Humans and Autonomy Lab at Duke University, about the risks of having humans only partly in control. The following is an edited transcript of their conversation.

Missy Cummings: Well, I think that one of the problems with these levels is that they seem linear, as if we should go in order: No. 1, 2, 3, 4 and 5. But the reality is there are really two different paths: there's the 1, 2, 3 path, and then there's the 1, 2, 4, 5 path. And the reason this is an issue is that at level 3, where the automation is partially capable but not fully capable, the car has to hand control back to the human in the cases where it can't perform under all conditions. And this is the deadliest phase. In fact, I'm pretty much against level 3. I don't think it should exist at all. Because one thing I know as a former fighter pilot is that having a human step inside the control loop at the last possible minute is a guaranteed disaster.

Molly Wood: And yet we see car makers going there. In fact, I think we arguably see Tesla pushing that on consumers. Does it make drivers part of a living R&D lab on city streets?

Cummings: Well I think there are two issues here. No. 1, should we have cars that have to ask for human intervention at time-critical periods? That answer is no. But there’s a separate issue of should we allow car makers to use the American public as guinea pigs to test out these new technologies. And I also think that that answer is no.

Wood: How do automakers need to communicate to car owners when these systems are safe to use?

Cummings: Well, I think this is a real problem, because in the aviation world, when we introduce fancy new automated technologies, which are actually not as complex as the ones that exist in cars right now, which I think is a big surprise to most people, we make commercial pilots go through years of training. They have annual check rides. They have to show that they know the rules, and they have to take a flight exam with a flight examiner to show that they can operate the vehicle under conditions of uncertainty. We're not doing that for human drivers. This is a real issue, because if you have these complex modes of operation, which we know exist in these driverless cars, or even driver-assist cars, and we know people aren't reading the manuals, then arguably are we doing the ethical thing by allowing them on the roads? And I think the answer is no.

Wood: So what needs to happen? Setting aside what we think might or might not happen in the current environment, what does need to happen from NHTSA or any other officials?

Cummings: So, the reality is we still need to invest in this technology. I'm not a Luddite. I run a robotics lab. I want the technology to continue to progress. And so one of the things that I personally have been advocating for is that we need to establish vision tests for autonomous vehicles. One of the things that we know about driverless cars is that their perception systems, what makes them see, are deeply flawed. So if we know that the perception systems are deeply flawed, then we need to have a set of tests that the cars must be able to pass before they're allowed on public roads.

 
