"We aren't ready!"

Reading the flood of articles about Autonomous Cars recently, you'd be convinced that the human race is under threat from an alien invasion of 'Cars That Kill'.

The popular narrative is that Driverless Cars will have pre-programmed reactions to collision situations on the road. And you may not be the prime directive.

"What me?" Nope. Driverless Cars might end-up being programmed to value life. Life in general, not your life, but life in quantifiable numbers.

So when Grandad, still driving his '94 Buick Roadmaster, veers into your lane on the freeway, your car will make a decision: is it best to swerve and avoid Grandad, thereby hitting that bus full of tweens on the way to a One Direction reunion concert, or is it best to let Grandad hit you and kill you, because, hey, it's only one human death, right?

Well, that all sounds pretty sinister until you consider the fact that we, as drivers, make the exact same decisions every time we end up in a situation where a crash is unavoidable. Up until now, humans have been given a great big benefit of the doubt.

It has always been assumed that we, mere humans, will do our utmost to avoid loss of life in a crash. We will react instinctively. We will react in a way that cannot be judged, because in an extreme circumstance we all do our best. Right?

Maybe not. 

Maybe it's time that an algorithm took the controls. Because at least that algorithm has a pre-programmed mission to save as many human lives as possible. Sure, it may seem callous that a programmer for Chevrolet might be making decisions about life or death, but isn't that better than a random jerk of the wheel and a jab at the brake, which is what we currently have in place?

I've crashed into another car at 70 mph, and I can tell you that it was not good. But what I can also tell you is that a driverless car would have succeeded where I failed.

Because I tried to over-think the situation, and I made assumptions about the other driver's behavior that turned out to be wrong. I slammed into a 1999 VW Beetle, tearing off its right rear wheel and ripping the front left wheel off my Mitsubishi Eclipse.

A Driverless Car would have handled the situation differently: it would have assumed nothing, and in the first place, it wouldn't have been speeding!

So let's all take a breath and realize that, yes, autonomous cars will make decisions about life and death. But they will probably be better decisions than you or I might make.

And this leads us to the next question: "Will they choose to let their passengers die in favor of saving the lives of a bus full of school kids?"

Now that's a tricky question, and one that, I think, makes something emotive out of a concept that doesn't even exist yet. Will AVs have their own code of ethics? Yes! Will that code of ethics kill you? Probably not!

Why? Because all of the carelessness, anger, obliviousness, distraction, and general 'bad driving' will be eliminated. Completely.

So for all of you Luddite Warriors out there, I think you can cool your heels, sit back, and relax. Because however badly a Driverless Car performs, it will always be superior to a human driver. The End.

by Paul Wynne

D R I V E R L E S S  E A R T H


Issue 019 May 28th, 2016

The Weekly Autonomous Vehicle Magazine