Fatal car crashes are statistically rare, but there are still roughly 40,000 of them every year in the United States alone. One goal of autonomous vehicles is to bring that number as close to zero as possible. However, we still have an awfully long way to go even to reach the benchmark of 100 million miles driven between fatalities.
“Ultimately for validation requirements, that’s something that will be measured when autonomous systems are deployed: What is their fatality rate?” said David Agnew, vice president of business development at Dataspeed. Agnew and others were gathered at the recent M:bility conference in Dearborn to discuss the future of autonomous testing, among other mobility topics.
While everyone looks to artificial intelligence as the solution, visual recognition algorithms remain no match for human perception, despite the vast sums being invested in the space.
“Artificial intelligence is not real intelligence,” warned Chuck Brokish, director of automotive business development at Green Hills Software. Green Hills develops software for government aerospace applications, including systems on the B-1 bomber and F-15 fighter jet, where failure would mean catastrophic loss of life.
He has seen tests in which simply putting a piece of tape on a stop sign caused an autonomous vehicle’s vision system to misread the bright red octagon as a bus, an error the average driver would (hopefully) never make.
“AI does make mistakes. It’s usually right, but it does make mistakes just like humans do,” Brokish said. We need a safety plan in place, he continued, one that doesn’t rely exclusively on a human in the front seat. He cited last year’s fatal collision between an autonomous Uber test vehicle, which had a human safety driver behind the wheel, and a pedestrian as an example of that approach’s shortcomings. “Safety drivers don’t engage,” he said.
The automotive industry has been making driving easier since the advent of the automatic transmission. Vehicles now offer automatic emergency braking, camera-based lane departure warnings, and radar-assisted adaptive cruise control.
“We’ve been actively disengaging the driver, so we need to have a safety plan that can be certified as safe,” he said. The operating systems running power grids, avionics, and nuclear plants don’t go to market without certification, and he said we can’t afford to let autonomous systems reach the market at scale without the same kinds of independently certifiable safety systems and redundancies.
“Software failure flat-out isn’t an option, because if the software was to fail, there would be mass casualties.”