Tesla will soon allow more drivers to get access to “full self-driving” mode, according to tweets by CEO Elon Musk. Drivers can pay $10,000 upfront or between $100 and $200 a month to use the software.
Up until now, a beta version has been available to a select group of people.
And the name “full self-driving” kind of implies that the car will drive itself. But, as Tesla notes on its website, that is not the case right now. These cars will not be autonomous.
Alistair Weaver is editor-in-chief at the car-shopping site Edmunds. The following is an edited transcript of our conversation.
Alistair Weaver: I’ve recently driven a Tesla Model S with full self-driving capability, and to be honest, particularly around town, it performed very poorly. It’s absolutely not something where you can just take your hands off the wheel and your eyes off the road and let the car drive itself. It’s many, many years away from that. So I find the whole name and title very misleading.
Marielle Segarra: Can you describe the ride for me?
Weaver: Yeah, I mean, just to give you one example, it has traffic light recognition, and the traffic lights changed from red to green and we just sort of sat there; the car didn’t recognize the change in conditions. I’m also far from convinced, from what I experienced on city streets, of its ability to accurately judge the lanes. Because if you think about modern city streets, particularly some that are poorly surfaced, not all the white lines are in place, not all the traffic is moving uniformly, and sometimes this car starts to make erratic decisions that a human would not. So at all times you need to remain fully in control of the vehicle. It’s actually a system that I didn’t feel comfortable using away from the highway, and I found myself turning it off.
Segarra: Did you feel safe?
Weaver: No, not in, not in any way.
Segarra: Like, did you feel less safe using Tesla’s full self-driving mode than you would have if you’d just been driving a regular car on your own?
Weaver: Yes, I think the problem with full self-driving mode is it’s not a full self-driving mode. And you need to remain in control of the vehicle at all times. I mean, you can try it, you can use it, but effectively what it is is an aid. It is not a replacement for the human behind the wheel. And one of my concerns, and Edmunds’ concerns, about the way that Tesla presents this information, markets it, if you like, is, you know, names like Autopilot are synonymous with the airline industry and planes flying themselves. Full self-driving — I mean, the very name suggests that, you know, you will get in the back seat and the car will just take you to your destination. It does not. And we are a long way from that technology.
Segarra: Is this innovation, like, you know, the idea of a fully autonomous car, is that important? Like, do we need that?
Weaver: There is an argument, and I think it’s a potent one, that the idea of human beings driving around in a car, looking at mirrors to decide where to turn while shouting at the kids in the back seat, as I’m sure we’ve all done, feels almost anachronistic. I mean, in 100 years, we’ll probably look back and say, “My goodness, that was madness,” that we were killing all these people on the roads while driving around at 70 miles or 65 miles an hour, whatever it is. So there is an argument that computers will ultimately be better at driving than humans, in the same way that they’ve gotten better at flying airplanes. But airplanes exist in the sky and never get close to one another. They don’t have all these random variables that we have on the roads. So my own view, and this is Edmunds’ view as well, I think, is that in a few years’ time, in certain circumstances, such as driving on the highway, this technology can be extremely useful to reduce fatigue and reduce the risk of an accident. But it’s about deploying the technology in the right way. Anywhere with a controlled environment, like a highway, makes a lot of sense. Anywhere you’ve got a random environment with a lot of chaos, like a city center, I think we’ve got a long, long way to go.
Segarra: If Tesla promises full self-driving, but that’s not really what this is, and there are more accidents, could that set back the whole autonomous car effort?
Weaver: I think so. I think there will be a natural reticence, a natural human reticence, about sitting in the back of your vehicle, or turning the driver’s seat around to talk to the kids in the back seat, and allowing the vehicle to drive itself. I think we will all have to get used to trusting that kind of technology. And it is a real conundrum that every manufacturer accepts: the computer will at times get it wrong, and people will be killed by vehicles that are driven autonomously. The argument is that fewer people will die. There’s always this age-old argument as well that as a human being, sometimes you react to a situation. God forbid you’re in a situation where you have to choose where you have your accident: do you hit the old lady crossing the road or the mum with the stroller on the sidewalk? Now, a human being potentially can make that kind of decision, where a computer can’t. It is sort of a moral question, if you like. So there’s a lot of complexity to this. But yeah, the whole argument rests on the idea that road-traffic accidents and deaths will go down because of this, though I think it would be wrong to suggest that they will be eliminated completely.
Segarra: Do you have a sense of why Tesla is framing it this way? Like why would they call it “full self-driving” mode?
Weaver: Well, the very cynical argument about why you would do it is, A, to get income from Tesla customers. You know, people have spent up to $10,000 on technology that doesn’t really exist, and I think there have been examples of customers asking for their money back. But also, Tesla as a company has always played somewhat fast and loose with how it describes things and how it introduces technology. We ran some independent testing on Autopilot years ago, in its early days, where we showed some of its fallibilities. If you wanted to be cynical about it, you could look at the share price and what Tesla is as a business and say that a lot of its value is built on being seen as a pioneer of this technology. They have the Supercharger network for recharging their cars, which works very well. And they have this autonomous driving technology, which also gets people excited and helps boost their appeal to Wall Street. But the idea that Tesla is way beyond other manufacturers in this sort of technology is a fallacy.
Related links: More insight from Marielle Segarra
Tesla declined to comment for this episode.
When we talk about what Tesla cars can do in terms of driving themselves, there’s full self-driving mode and then there’s Autopilot, which is less involved. It does things like automatic cruise control and corrective steering.
There have been a number of high-profile accidents caused by people using Tesla’s Autopilot function. Here’s a New York Times story about one of those accidents, which killed a 22-year-old woman in Florida.
I mentioned that there’s a beta version of full self-driving mode that a select group of people have been testing. You can see videos of their experiences on YouTube, including one where the car almost steered itself into a group of pedestrians. This story from The Verge explores that.
Now, part of the concern about this moment we’re in is that cars are still not fully autonomous. Which means you can’t be checked out while you’re driving them, and you might have to correct their mistakes.
Musk was asked about this at a conference earlier this week, and specifically, whether people would learn to adapt to this reality. He said the transition period to new technology is always a little bumpy. And he added:
“The truth is that people are actually not great at driving these two-ton death machines, you know, and people get tired, and they get drunk, and they get distracted, and they text and they do all sorts of things they shouldn’t do. And then the cars crash, basically. Now, when we’re embarking on the autonomy front, someone told me a thing that’s quite true, which is even if you, for argument’s sake, reduce fatalities by 90% with autonomy, the 10% that do die with autonomy are still gonna sue you. The 90% that are living don’t even know that that’s the reason they’re alive.”
He went on to say that doing the right thing is more important than whether people perceive that you’re doing the right thing.
So, that’s his take.
In terms of getting sued, it’s an interesting question. If a fully autonomous car gets into an accident, who’s at fault? Who gets sued? And what might all this mean for the car insurance industry? Here’s a link to a story on that from Yahoo Finance.
And on Friday’s show, I’ll have a conversation about regulation, or the lack of regulation, for autonomous cars in the U.S.
The future of this podcast starts with you.
Every day, Molly Wood and the “Tech” team demystify the digital economy with stories that explore more than just “Big Tech.” We’re committed to covering topics that matter to you and the world around us, diving deep into how technology intersects with climate change, inequity, and disinformation.
As part of a nonprofit newsroom, we’re counting on listeners like you to keep this public service paywall-free and available to all.