
Billionaire Brings Tesla Autopilot Rebuke

by Philip E. Ross, from IEEE Spectrum

Yesterday, in a livestreamed event, Dan O'Dowd, a software billionaire and vehement critic of Tesla Motors' allegedly self-driving technologies, debated Ross Gerber, an investment banker who backs the company. The real challenge came after their talk, when the two men got into a Tesla Model S and tested its Full Self-Driving (FSD) software, a purportedly autonomous or near-autonomous driving technology that sits at the high end of the suite of driver-assistance features the company calls Autopilot and Advanced Autopilot. The scrutiny O'Dowd is bringing to bear on the EV maker is only the latest in a string of recent knocks, including a Tesla shareholder lawsuit over overblown FSD promises, insider allegations of fakery in FSD promotional events, and a recent company data leak that includes thousands of FSD customer complaints.

At yesterday's livestreamed event, O'Dowd said FSD doesn't do what its name implies, and that what it does do, it does badly enough to endanger lives. Gerber disagreed. He likened it instead to a student driver, and the human being behind the wheel to a driving instructor.

Ross Gerber, behind the wheel, and Dan O'Dowd, riding shotgun, watch as a Tesla Model S, running Full Self-Driving software, blows past a stop sign. Credit: The Dawn Project

In the tests, Gerber took the wheel, O'Dowd rode shotgun, and they drove around Santa Barbara, Calif., or were driven, if you will, with Gerber's assistance. In a video the team published online, they covered roads, multilane highways, and a crossing zone with pedestrians. At one point they passed a fire engine, which the car's software mistook for a mere truck: a bug, though not one that endangered anyone. Often the car stopped hard, harder than a human driver would have. And one time, it ran a stop sign.

In other words, you do not want to fall asleep while FSD is driving. And, if you listen to O'Dowd, you do not want FSD in your car at all.

O'Dowd says he likes Tesla cars, just not their software. He notes that he bought a Tesla Roadster in 2010, when it was still the only EV around, and that he has driven no other car to this day. He bought his wife a Tesla Model S in 2012, and she still drives nothing else.

"We've reported dozens of bugs, and either they can't or won't fix them. If it's 'won't,' that's criminal; if it's 'can't,' that's not much better." –Dan O'Dowd, the Dawn Project

He'd heard of the company's self-driving system, originally known as Autopilot, in its early years, but he never used it. His Roadster couldn't run the software. He took notice only when he learned that the software had been implicated in accidents. In 2021 he launched the Dawn Project, a nonprofit, to investigate, and it found a lot of bugs in the software. O'Dowd published the findings, running an ad in The New York Times and a commercial during the Super Bowl. He even toyed with a one-issue campaign for the U.S. Senate.

In part he is offended by what he regards as the use of unreliable software in mission-critical applications. But note well that his own company specializes in software reliability, and that this gives him an interest in publicizing the topic.

We caught up with O'Dowd in mid-June, when he was preparing for the live stream.

IEEE Spectrum: What got you started?

Dan O'Dowd's Dawn Project has uncovered a range of bugs in Tesla's Full Self-Driving software.

Dan O'Dowd: In late 2020, they [Tesla Motors] created a beta site, took 100 Tesla fans and said, "Try it out." And they did, and it did a lot of really bad things; it ran red lights. But rather than fix the problems, Tesla expanded the test to 1,000 people. And now lots of people had it, and they put cameras in cars and put it online. The results were just terrible: It tried to drive into walls, into ditches. Sometime in 2021, around the middle of the year, I figured it should not be on the market.

That's when you founded the Dawn Project. Can you give an example of what its research discovered?

O'Dowd: I was in a [Tesla] car, as a passenger, testing on a country road, and a BMW approached. When it was zooming toward us, our car decided to turn left. There were no side roads, no left-turn lanes. It was a two-lane road; we have video. The Tesla turned the wheel to cross the yellow line, the driver let out a yelp. He grabbed the wheel, to keep us from crossing the yellow line, to save our lives. He had 0.4 seconds to do that.

We've done tests over the past years. For a school bus with kids getting off, we showed that the Tesla would drive right past, completely ignoring the "school zone" sign and keeping on at 40 miles per hour.

Have your tests mirrored events in the real world?

O'Dowd: In March, in North Carolina, a self-driving Tesla blew past a school bus with its red lights flashing and hit a child in the road, just like we showed in our Super Bowl commercial. The child has not fully recovered, and may never. And Tesla still maintains that FSD will not blow past a school bus with its lights flashing and stop sign extended, and that it will not hit a child crossing the road. Tesla's failure to fix or even acknowledge these grotesque safety defects shows a depraved indifference to human life.

You just get in that car and drive it around, and in 20 minutes it'll do something stupid. We've reported dozens of bugs, and either they can't or won't fix them. If it's "won't," that's criminal; if it's "can't," that's not much better.

Do you have a beef with the car itself, that is, with its mechanical side?

O'Dowd: Take out the software, and you still have a perfectly good car, one that you have to drive.

Is the accident rate relative to the number of Teslas on the road really all that bad? There are hundreds of thousands of Teslas on the road. Other self-driving car projects are far smaller.

O'Dowd: You have to make a distinction. There are truly driverless cars, where nobody's sitting in the driver's seat. For a Tesla, you require a driver; you can't go to sleep, or the car will crash real soon. Mercedes just got a license in California to drive a car in which you don't have to have your hands on the wheel. It's allowed, under limits; for instance, on highways only.

"There is no testing now of software in cars. Not like in airplanes; my, oh my, they study the source code." –Dan O'Dowd, the Dawn Project

Tesla talks about blind-spot detection, forward emergency braking, and a whole suite of features called driver assistance. But basically every car coming out now has those things, and the results are worse for Tesla. Yet it calls its package Full Self-Driving: videos show people without their hands on the wheel. You've got to prove you are awake by touching the wheel, but you can buy a weight on Amazon to hang on the wheel to get around that.

How might a self-driving project be developed and rolled out safely? Do you advocate for early use in very limited domains?

O'Dowd: I think Waymo is doing that. Cruise is doing that. Waymo was driving five years ago in Chandler, Ariz., where it hardly ever rains, the roads are new and wide, and the traffic lights are normalized and standardized. They used it there for years and years. Some people derided them for testing on a postage-stamp-size place. I don't think it was a mistake; I think it was caution. Waymo tried an easy case first. Then it expanded into Phoenix, also relatively easy. It's a city that grew up after the automobile came along. But now they are in San Francisco, a very difficult city with all kinds of crazy intersections. They've been doing well. They haven't killed anyone; that's good. There have been some accidents. But it's a very difficult city.

Cruise just announced they were going to open in Dallas and Houston. They're expanding: they were on a postage stamp, then they moved to easy cities, and then to harder ones. Yes, they [Waymo and Cruise] are talking about it, but they're not jumping up and down claiming they are solving the world's problems.

What happened when you submitted your test results to the National Highway Traffic Safety Administration?

O'Dowd: They say they're studying it. It's been more than a year since we submitted data, and years since the first accidents. But there have been no reports, no interim comments. "We can't comment on an ongoing investigation," they say.

There is no testing now of software in cars. Not like in airplanes; my, oh my, they study the source code. Multiple organizations look at it multiple times.

Say you win your argument with Tesla. What's next?

O'Dowd: We have hooked up everything to the Internet and put computers in charge of large systems. People build a safety-critical system, then they put a cheap commercial software product in the middle of it. It's just the same as putting a substandard bolt in an airliner.

Hospitals are a really big problem. Their software needs to be really hardened. They are being threatened with ransomware all the time: Hackers get in, grab your data, not to sell it to others but to sell it back to you. This software must be replaced with software that was designed with people's lives in mind.

The power grid is important, maybe the most important, but it's difficult to prove to people it's vulnerable. If I hack it, they'll arrest me. I know of no examples of someone shutting down a grid with malware.
