Humans are to blame for crashes - autonomous cars won't fix that
Matt Prior takes aim at people peddling false narratives about autonomy because it furthers their own causes
Who is to blame for crashes? A study by the US’s National Highway Traffic Safety Administration suggested the “critical reason” was “the driver”. In 94% of cases this is true, according to the results.
This statistic has been misused by various advocates and enthusiasts of driver assistance, autonomous vehicles and LinkedIn over the years to the point that the NHTSA today goes to great lengths to qualify its findings.
The survey was big: it sampled 5470 crashes between 2005 and 2007. In its results, the NHTSA defined the ‘critical reason’ as ‘the last event in the crash causal chain’.
Ergo, driver actions weren’t necessarily the root cause but were the last significant event before impact.
Yet nearly two decades on, people will tell you that if you automate driving or remove ‘human error’, you will prevent more than nine in 10 crashes.
This “would be great, but it’s simply wrong”, said Antonio Avenoso, the European Transport Safety Council’s executive director, back in 2019.
He blamed the continuing recurrence of the claim on a “fundamental misunderstanding” – perhaps rather kindly, because that implies the misunderstanding was accidental, and I suspect most know they’re using the statistic incorrectly but continue to because it furthers their cause.
The truth is that humans are actually the root cause of 100% of all crashes. Even if a wheel falls off a car, recorded now as mechanical failure, it’s not the wheel’s fault but that of either its design or its maintenance, both of which were done by people.
Look hard enough and you will find human error everywhere in life. In my garden as I write, a squirrel and a magpie are having a tense face-off over some spilled bird seeds.
The squirrel is standing firm, but as with all greys, he’s only here because Victorians imported his ancestors – a human error.
He’s only going after bird seeds on the ground because the feeder was poorly designed or I overfilled it – either way, human error.
And the magpie is only in my garden because we’ve reduced his habitat elsewhere. Without humans cocking up, this seemingly naturalistic scrap wouldn’t be happening at all.
If one can attribute human errors to scenes like this, where do advocates of the 94% statistic think blame for crashes no longer caused by drivers should go?
A BBC headline from last month read “NHS computer issues linked to patient harm”, but it isn’t the computers that are the root problem: it’s the people who designed and used them.
You can’t blame computer systems for the Post Office Horizon IT scandal: it was the people behind it, many of whom we’ve still never heard apologise.
And what worries me is that as automation increases, accountability decreases. If I drive into you, the law can easily find me responsible, you can look me in the eye, I can look back at you and I can say “I’m sorry”.
If my car crashes into you, even if automation makes that event much, much less likely, it will still ultimately be somebody’s fault, but who knows whose? Who apologises? Who holds the system accountable?
We’re talking about potential injury, life-changing physical and mental harm, death and grief. At the moment, too much of it.
Automation should (and, thanks to stability control, ABS, airbags and more, already does) reduce the amount, but given personal accountability inevitably reduces alongside it, finding an ‘acceptable’ level of automated collisions – and casualties – is very difficult.
At the moment, people are at the wheel and people are ultimately responsible. If we’re not, even the wrongly advocated 94% reduction in collisions wouldn’t be enough.
We tolerate people making mistakes because we’re human. We have the capacity for remorse and forgiveness. Without those, the bar is zero harm.