Safety regulators in the United States are investigating a fatal crash involving a Tesla Model S that was using an “Autopilot” feature, Tesla Motors announced in a blog post Thursday.
“What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S,” Tesla wrote. “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”
The company said the crash took place under “extremely rare circumstances,” and that it was fatal because of where the Model S struck the trailer. It added that Autopilot is disabled by default and that drivers must explicitly acknowledge that the technology is experimental before enabling it.
The Model S will also tell drivers to keep their hands on the wheel; it alerts drivers and eventually slows down if they release their grip. Of course, even an instant of distraction can be deadly on the road.
Tesla released Autopilot on some of its vehicles last year. The limited autonomous feature allows the car to drive itself on major highways in the four states that have legalized self-driving cars on major roadways — California, Florida, Michigan and Nevada — plus the District of Columbia. On private property, Tesla owners can summon their Autopilot-enabled cars with the click of a button. Tesla said that feature is disabled on public roads, so the car can’t drive through a parking lot to meet its driver at the entrance to a store, for instance.
Tesla argues that, when used in conjunction with driver oversight, semi-autonomous vehicles are much safer “compared to purely manual driving.” Indeed, CEO Elon Musk predicted last year that human drivers would be outlawed once self-driving technology became more ubiquitous.
The probe by the National Highway Traffic Safety Administration could prove a setback for the growing number of tech and car companies investing heavily in autonomous driving technology. Regulators scrambled last year to write new rules for self-driving cars after Tesla announced hasty plans to release its limited Autopilot feature. But by then, the company had already sent ripples through the auto industry. In January, at the Consumer Electronics Show — more or less the Detroit Auto Show of tech — nearly every major car company seemed to unveil some kind of autonomous feature. The first death in a self-driving car could stoke fears over the technology and temper the industry’s growth.
The majority of traffic accidents, which result in about 35,000 deaths in the U.S. every year, are caused by human error. Manufacturers of self-driving cars promise to make driving safer. But scientists and ethicists disagree on what a driverless car should be programmed to do if an accident becomes inevitable.
That said, Tesla has weathered safety concerns before. Almost exactly two years ago, a man died after crashing a stolen Model S into a steel pole, splitting the car in two and setting the sedan on fire. The death exacerbated already simmering concerns over the safety of Tesla’s battery-powered vehicles. But the company bounced back, and earlier this year it took nearly half a million preorders for the Model 3, its newly announced lower-priced sedan.
Bahar Gholipour contributed reporting.