Tesla fatal crash is setback to autonomous cars

02 Jul 2016 / 12:49 H.

It could be a wake-up call for the self-driving car movement.
A Tesla Model S cruising on "Autopilot" failed to detect a tractor-trailer crossing its path against a bright sky; neither the system nor the driver hit the brakes, and the driver was killed.
This was the nightmare scenario for an industry promoting autonomous vehicles as a way to improve road safety and reduce traffic fatalities that come mostly from human error.
Researchers say the tragedy does not change the long-term outlook for autonomous vehicles or their potential benefits, but could dampen enthusiasm for this technology.
"Clearly this is a horrible thing, but in the big picture it doesn't affect the technology," said Richard Wallace, head of transportation systems analysis at the Center for Automotive Research in Ann Arbor, Michigan.
"But it may affect public perception of the technology, and obviously people have to buy these vehicles."
More than 30,000 Americans die annually in traffic incidents caused by human error, according to government data.
"But if these deaths are caused by non-human drivers, there will be people who find that unpalatable," Wallace said.
Mary Cummings, who heads the Humans and Autonomy Laboratory at Duke University, said the Tesla crash shows the industry is moving too fast to deploy self-driving vehicles.
"My concern is that this was an avoidable accident," Cummings told AFP. "My concern is that this will set the industry back."
Weighing risks
Cummings, who warned against premature deployment of the technology at a Senate hearing earlier this year, said she believes self-driving cars will be beneficial in the long term but that they should not be on the road before they are ready.
"There will be unknowns, where it will be a complete surprise to the engineering community, and I can live with that," she said. "But I don't think we should take risks that we don't need to take."
Cummings said Tesla was aware of the "blind spot" in Autopilot and should have known that drivers will often ignore warnings about remaining vigilant when using the semi-autonomous system.
"I think what Tesla needs to do is fix the blind spots, and if they can't fix them they should turn Autopilot off," Cummings said.
Tesla announced the fatality on Thursday and noted that US safety officials had opened a probe.
In a statement, Tesla said the fatality was "a tragic loss" and was the first such incident with its Autopilot system activated.
Tesla said the Autopilot system, introduced last year, is not a fully autonomous system and that drivers are cautioned that they need to be at the wheel and in control.
The system allows the vehicle to automatically change lanes, manage speed and brake to avoid a collision. The system may be overridden by the driver.
Tesla said that in the fatal crash in Florida, the "high ride height" of the trailer combined with its positioning were "extremely rare circumstances" and that the driver would have been protected in most other collisions.
Fear of machines
The news came as Germany's BMW announced that it is joining forces with US computer chip giant Intel and the Israeli technology firm Mobileye to develop self-driving cars, aiming for fully automated driving in production cars by 2021.
The companies said in a statement that "the future of automated driving promises to change lives and societies for the better" while acknowledging that "the path to get to a fully autonomous world is complex."
Most other major automakers are also looking at autonomous cars. South Korea's Kia has pledged to produce a self-driving car by 2020 and General Motors plans to test the technology with ridesharing giant Lyft.
Google has driven its autonomous cars some 2.4 million kilometres with only a few minor accidents.
David Strickland, who heads a newly formed Self Driving Coalition for Safer Streets, said the organization, which includes Ford, Google, Lyft, Uber and Volvo, remains "dedicated to developing and testing fully autonomous vehicles" with the goal of improving road safety.
A Rand Corp. study meanwhile said it would require some 275 million miles of testing to ensure reliability of self-driving technology, and even then "it may not be possible to establish with certainty the safety of autonomous vehicles."
A separate study in Science magazine noted that autonomous vehicles may be forced to make difficult moral decisions such as whether to sacrifice a passenger or a pedestrian.
What remains unclear is whether the Tesla incident will cause further mistrust of autonomous driving.
A survey earlier this year by the AAA auto club showed 75 percent of US drivers would be afraid to ride in an autonomous vehicle.
The Tesla fatality "will have a short-term impact with consumers – they may not be as willing to trust a system like this," said Ron Montoya, consumer advice editor at the auto research firm Edmunds.com.
"We expect our machines to be perfect, and when they don't deliver on that it does cause some fear. But I think down the line autonomous technology will be the future for all cars." — AFP
