The California Department of Motor Vehicles said this week it’s reviewing whether Tesla is telling people that its cars are self-driving when, legally speaking, they’re not. This follows fatal crashes that may have involved its Autopilot feature. Tesla advertises a “Full Self-Driving” upgrade option. One man has been busted in Teslas more than once for reckless driving. He hangs out in the backseat and steers with his feet.
Meanwhile, no cars are fully self-driving yet. I spoke with Missy Cummings, the director of the Humans and Autonomy Laboratory at Duke University. She says the so-called deep learning that cars need to see the road around them doesn’t actually learn. The following is an edited transcript of our conversation.
Missy Cummings: I can show a convolutional neural net a million images of a stop sign, and it will learn what a stop sign is from those images. But if it sees a stop sign that doesn’t exactly match those images, then it can’t recognize it. And this is a huge problem, because if a strand of kudzu leaves starts to grow across just the top 20% of a stop sign, that is enough to make that algorithm dumb, and it doesn’t recognize the sign, because it’s never seen a stop sign with one strand of kudzu leaves across it.
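Cummings’s point about exact-match brittleness can be sketched in a few lines of Python. This is a toy nearest-template “detector,” purely illustrative and not a model of any real perception system: it only accepts images that sit close to its clean training examples, so a small occlusion, our stand-in for the kudzu strand, pushes the sign outside everything it has “learned.”

```python
import numpy as np

# Toy "stop sign detector": a nearest-template matcher trained only on
# clean examples. All names and numbers here are illustrative.

rng = np.random.default_rng(0)

# 100 clean 8x8 "stop sign" images: a bright patch with small pixel noise.
templates = np.clip(1.0 + 0.05 * rng.standard_normal((100, 8, 8)), 0.0, 1.0)

# Define "matches the training set" as: no farther from the average
# training image than the farthest clean training image was.
mean_sign = templates.mean(axis=0)
threshold = max(np.linalg.norm(t - mean_sign) for t in templates)

def looks_like_stop_sign(img):
    """Recognize only images close to what was seen during training."""
    return np.linalg.norm(img - mean_sign) <= threshold

clean = templates[0]
occluded = clean.copy()
occluded[:2, :] = 0.0  # "kudzu" covering the top ~20% of the sign

print(looks_like_stop_sign(clean))     # True
print(looks_like_stop_sign(occluded))  # False: it never saw occlusion
```

Real networks are far more sophisticated than a template matcher, but the failure mode Cummings describes is the same in kind: inputs outside the training distribution are not handled gracefully.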
Molly Wood: Is there an awareness of this, do you think? Is everybody sort of still trying to push the same solution when it comes to self-driving cars? Are people starting to realize these limitations and internalize them and change their approach at all?
Cummings: I think there are three camps of people not just in self-driving, but in robotics and artificial intelligence in general. There’s the camp of people like me who know the reality. We recognize it for what it is, we’ve recognized it for some time, and we know that unless we change fundamentally the way that we’re approaching this problem, it is not solvable with our current approach. There’s another larger group of people who recognize that there are some problems but feel like with enough money and enough time, we can solve it. And then there’s a third group of people that no matter what you tell them, they believe that we can solve this problem. And you can’t talk them off that platform.
Change the approach to the problem
The people who are the biggest problem are the people in that second group, the ones who believe that with enough time and money we can fix it, instead of recognizing the elephant in the room for what it is, which is not fixable under our current approach. And this is why you see companies like Starsky, a self-driving trucking company, go out of business, and why you’re starting to see all the mergers across the automotive industry, where companies are teaming up either with each other or with software companies, because they realize that they just cannot keep hemorrhaging money the way they are. But that pit still has no bottom. And I don’t see this becoming a viable commercial set of operations in terms of self-driving cars for anyone, anywhere, ever, until we address this problem.
Wood: What about the tests that seem to be going fine, like Waymo testing fully operating taxis in Phoenix with no support drivers and not having crashes?
Cummings: Well, it’s interesting to me that that’s your perception, because that’s the perception that Waymo wants you to have. But the reality is, behind every self-driving car in Phoenix there is a team of humans remotely monitoring and potentially intervening when the systems go down. And there was recently a video that went viral on YouTube, where a Waymo car in its fully self-driving mode got stuck by a single orange cone. Eventually, they had to send a team of people to put a human in the car and drive the poor passenger out of the conundrum he was in. And this is very recent. You have to ask yourself: if we have a car that is supposedly fully self-driving, so there’s no driver, but we need a team of four to six people managing that one car for problems, first of all, that’s not scalable.
And it’s not really self-driving if there has to be so much intervention. And if that’s where the technology is today, how are we going to make a business out of this? John Krafcik, the former CEO of Waymo, recently left, and I feel like John, in doing that, was basically admitting what I have been trying to tell people all these years: we just can’t solve this problem in the way that you think we’re going to. We need to completely clean-sheet this and start over.
Wood: So when we see Uber and Lyft, for example, essentially exiting this self-driving car business, do you think it is a sign of companies who have sort of said “this is just a money pit”?
Cummings: Yes, I think that’s exactly right.
Wood: On the other hand, we have Tesla, which is not doing any of that. What are we to make of that?
Cummings: Well, I think the Tesla situation is a little different. There’s the question of whether they should be allowed to call their driver-assist technology Full Self-Driving; that’s one problem. And then, if you want to ask whether Tesla is ever going to have a robotaxi program, I’ll tell you, that answer is no. But that problem is less acute for Tesla, because they have what is otherwise a great product. I love Tesla cars. I just think Autopilot and especially Full Self-Driving are both overhyped, they underdeliver in terms of performance, and they’re dangerous.
Wood: So there’s a math problem and there is a pretty significant marketing problem?
“The religion of Tesla”
Cummings: That’s right. So I think we’re starting to see increasing numbers of crashes, in this country and abroad, where drivers think their car is far more capable than it actually is. What was interesting to me was how the Tesla driver who got arrested for riding in his back seat while the car was driving vocalized that he’d already been warned once, that he defied the warning and did it again, and that he would keep doing it because he knows Elon [Musk] knows what he’s doing. And he fully believes in Tesla.
And what I find most interesting about that statement is that one man is vocalizing what so many people believe. They believe this technology really can be fully self-driving, despite all the warnings, despite all the statements in the owner’s manual, and despite having to agree that you’re going to pay attention. Despite all of those warnings, there is a belief, likely rooted in calling the technology Full Self-Driving and Autopilot, in the religion of Tesla full self-driving, and that is dangerous.
Related links: More insight from Molly Wood
Tesla’s website does call the upgrade option Full Self-Driving, but it also says the cars can’t drive themselves. Back in March, a Tesla engineer told the DMV that despite Elon Musk’s claims that the company would have a fully self-driving car by the end of this year, that claim did not match the engineering reality. For a refresher, the industry standard defines six levels of driving automation, numbered 0 through 5: at Level 0 you drive the car yourself with your hands and feet, and at Level 5 the car does everything while you sleep in the back seat, in a way that doesn’t endanger you or everyone around you. The Tesla engineer told the DMV, according to a memo produced by the agency, that Tesla is currently at Level 2: some automated driving functionality, but requiring a human with hands on the wheel at all times.
Here is a column about deep learning and why its author also thinks the tech is too limited to adapt to real-world conditions, because the algorithms would have to pre-learn every single possible thing that could happen on the road — from a kudzu stop sign to a spaceship landing on the freeway. There’s also a little more reading and listening on Uber and Lyft getting out of the self-driving game. Marketplace’s Meghan McCarty Carino covered that on the “Marketplace Morning Report.”
Apple might still be working on this — maybe? It got a patent in March for a night-vision system to let its self-driving cars see in the dark. Earlier this week, MacReports noted that Apple has increased the number of self-driving cars it has registered with the DMV from 66 to 68, but dramatically decreased the number of people licensed to operate those cars. And it hasn’t applied for a driverless testing permit. Apple has been flirting off and on with self-driving cars for a few years now. First it laid off almost everyone involved with its so-called Project Titan, and then it bought a self-driving car startup that was almost defunct. So I guess that puts it in the second category of technologists that Missy Cummings described. We’ll call them the self-driving Dorys — you know, the fish that just keeps swimming.
The future of this podcast starts with you.
Every day, Molly Wood and the “Tech” team demystify the digital economy with stories that explore more than just “Big Tech.” We’re committed to covering topics that matter to you and the world around us, diving deep into how technology intersects with climate change, inequity, and disinformation.
As part of a nonprofit newsroom, we’re counting on listeners like you to keep this public service paywall-free and available to all.