
Driverless Car Causes Fatal Accident In Arizona

Photo courtesy of Reuters

On 18th March – that's just over a week ago – driverless car technology received a major blow. The horrible truth is that the blow to the technology, struck by a vehicle being road-tested by the Uber taxi service, wasn't as nasty as the blow it delivered to a 49-year-old Arizona woman named Elaine Herzberg, who was crossing the road one evening, like you do. The car hit her and killed her. The dashcam on the autonomous car captured the moment before the car ran her down. I've decided not to embed the footage in this post in case you've got autoplay or something turned on, because it's decidedly disturbing. Find it online yourself if you must, but personally, I'd rather not watch the tragic and completely avoidable death of a woman about my age who probably had a partner and children and friends who thought she was great fun – someone just like me and you.

The reaction has been exactly what you would expect: Arizona has called a halt to on-road real-life testing of autonomous cars, Uber and a few other companies like Toyota have stopped all testing in North America, and shares in companies that have been investing heavily in driverless car technology, such as Tesla, have dropped. In addition, Ms Herzberg's family have been coping with the shock and grief of losing a mother, daughter, sister, wife, cousin… There's also an Uber driver who trusted the technology to take care of things the way she was told it would, who is going to live with a lifetime of questions and guilt, and who is probably in the hands of a good therapist right now – or at least ought to be.

We can ask the same questions that the Uber driver and the Herzberg family are probably asking over and over again: why did this happen? What went wrong? Aren't driverless cars supposed to get rid of the human error factor that is responsible for the majority of fatal accidents?

Without actually looking at the chilling dashcam footage personally, and based on other people's reports, it appears that what happened was this. The Uber vehicle was cruising along a road on a normal Sunday evening in spring in Tempe, Arizona. It was dark, and the driver, who was probably on a tight schedule and having to manage half a billion things at once – like you do – looked away from the road for about five seconds. The car was in autonomous mode and it had the full suite of sensors that are available even in regular cars that aren't driverless, such as automatic braking, pedestrian detection, cross-traffic detection and collision avoidance. The driver thought that all would be well; after all, the car was supposed to take care of itself most of the time, wasn't it?

Then along came Ms Herzberg, wheeling her bicycle. Probably she was a bit careless and didn't pick a big enough gap in the traffic to cross in – but haven't we all done that when trying to cross a busy road with no pedestrian crossing or traffic lights in sight? Most of us take it for granted that the humans behind the wheel don't want to hit us and will slow down a fraction if we're cutting it a bit fine (this is something I don't assume – call me paranoid, but maybe it's an assumption we need to start questioning). To make matters worse, Ms Herzberg was wearing black at night, which would have made her hard to see even if the driver hadn't looked away.

The sensors and the system didn't see or recognize Ms Herzberg, so the collision avoidance systems weren't triggered. The vehicle kept going straight ahead at normal road speed. The driver, trusting the autonomous system, didn't see her either until the last moment, when the car ploughed into her at full speed and there was no time to do anything to stop it. Ms Herzberg died later that night in hospital.

This is the first time that a driverless car has been involved in a fatal accident with a pedestrian – hang on, let's call a spade a spade. The car wasn't just "involved": it knocked her down and killed her.

Naturally, all the tech companies and car manufacturers involved are properly horrified and are wondering what on earth went wrong. The sensors were supposed to work without being "distracted" the way a human driver can be. They were supposed to be able to see in the dark, so to speak, and therefore do better than a human driver would. Autonomous systems are supposed to be so much safer because they don't get drunk, tired or distracted, but stay focussed and on the job all the time. So what went wrong? Why didn't the car see Ms Herzberg and brake in time?

Naturally, as the accident only happened about a week ago and the investigation is still under way, nobody has answers yet. A few fingers are being pointed, especially as different companies make different bits of the tech. Did the Lidar sensor plus artificial intelligence system fail to distinguish a pedestrian with a bicycle from a power pole or a bush? (These systems do have trouble with this – in Australia, they have real trouble recognizing how close kangaroos on the road actually are, because the jumping motion of a roo fools the sensor into thinking that there's more road between the car and the roo than there really is.) Robotic systems and computers follow the rules they've been given no matter what, and something unexpected that falls outside those rules really throws them. Possibly, someone crossing the road with a bike without looking properly or allowing a big enough gap is a novel concept for them.

I guess that at this early stage, there are a few lessons that all of us can learn from this tragedy:

  • Driver assistance packages and sensors are there to help you be a better driver, not do it all for you. As a driver, you need to stay alert and do the job of driving at all times, whether you've got a back-to-basics trade vehicle like a Great Wall, or a luxury sedan or SUV with all the safety gadgets like a Mercedes or Volvo.
  • A lot can happen in a few seconds, so keep your eyes on the road as much as possible. No checking texts, changing the radio station or fiddling with the air con.
  • Be careful when crossing the road. These days, you can't assume that drivers are looking ahead of them, because there are idiots who insist on checking their phones while driving, and in the future, you might not even be able to assume that there's a human with a heart in control of the wheel. The stop, look and listen rule still applies – so take those headphones out of your ears.
  • Wearing black at night when crossing the road always has been and still is a dumb idea.
  • People are unpredictable, so keep your eyes open for them when you’re driving.

And I hope we do learn these lessons. After all, nobody really grieves for a car that gets written off. However, real live humans have friends and families who will always miss them if they die – and that's something that a computer or robot system can't fully understand or experience.

2 comments

  1. Russell says:

    Apparently these autonomous vehicles have an issue with motorcycles too, as there have been reports of accidents involving both.
    Let's hope these vehicles aren't let loose on our roads until they are sorted 100%…

    March 30th, 2018 at 7:51 pm