Tesla crash report blames human error - this is a missed opportunity
In blaming human error for a self-driving car crash, US regulators have missed an opportunity to learn from such incidents
The Tesla Model S is an extraordinary machine. As part of my research into the regulation of self-driving cars, I've had the privilege of driving one. Or more accurately, I've had the privilege of being driven by one. On a Colorado highway in July, with some trepidation, I flicked the lever to engage Autopilot mode. I told the representative from Tesla that I was worried about handing over control, taking my feet off the pedals and my hands off the wheel. She reassured me that I would quickly get used to it.
My curiosity was at least partly morbid. In May, a Tesla Model S was implicated in the world's first fatal self-driving car crash. Joshua Brown was behind the wheel, but he was not in control of his car. As far as we know, neither he nor his car's sensors detected a truck that had driven across his path. The car did not brake. It drove at 74mph under the truck's trailer, crushing the car's roof before leaving the road and hitting a post, killing its driver.