Human factor absent in driverless cars


I’ve always been fascinated with fictional high-tech cars or vehicles with special features, like the flying car in “Chitty Chitty Bang Bang” or the super helicopter Airwolf, with its fancy weaponry and speed. It seems like in the 1980s, there were countless movies or TV shows that included some kind of souped-up vehicle that talked, flew, traveled through time, transformed or did something equally cool.

I remember when I was in high school, I spent a good many Friday evenings glued to the television to watch the NBC science fiction show “Knight Rider.” The program centered on the lead character, Michael Knight, a former police officer chosen to fight crime with the help of a tricked-out, high-tech T-top Trans Am called KITT.

Equipped with, among other things, super speed and artificial intelligence, KITT was more than a self-driving, snooty-accented sidekick. The car was also, on occasion, Knight’s conscience. Given human-like emotion and life-preserving program directives, the car made emotional decisions and conveyed an understanding of the repercussions of its actions and those of its driver.

Some 30 years after the fictional automation of the Knight Rider super car, driverless vehicles are becoming more of an everyday reality each year. Self-driving cars may not have KITT’s superpowers, but they are proving that the technology is sound, and improvements are coming at a staggering speed.

But, as so often happens in science fiction stories, the technology may be advancing faster than our wisdom. No matter how good it gets, it is unlikely that any artificial intelligence will be able to duplicate human intuition, or the desire to stay alive balanced with compassion for others.

Around the same time that I was watching Knight Rider’s first run on TV, I was also learning about the perils of professional truck driving in my family’s business. One of the first lessons was about responsibility and the deep understanding that I was driving a massive weapon and, should the situation arise, I was to sacrifice my truck and myself to keep others from harm.

That was a lifelong lesson, and it applies not just to driving larger vehicles but anytime I’m behind the wheel. No matter the size of the machine, we are charged with a level of responsibility that no high-tech automation can ever replace. We may be able to create cars that “think,” but they’ll never feel or care.

While it may be possible to teach a computer how to navigate the roadways in a reasonably safe manner, there is no way it will ever be able to account for the human factor. People want to live and they have empathy for others.

Given an unthinkable situation, would an automated car be able to decide, in an instant and with no other options, between rear-ending a carload of children and swerving to miss them, only to run over a toddler standing on the street corner with his mother? Could a human driver make any better a choice here? Would you be able to pick one path over the other in a split second, choosing life for many?

Automated vehicles may be practical in the very near future, but I don’t know that I want to take a driverless car to the airport or send one to pick up my kids from school. There is just no way to synthesize the human element and the emotional understanding of the value of life.

Having been in more than a few potentially deadly driving situations in my life, I can confidently say that no one could know the answer to these questions until they’re in the situation. Even then, instinct, fear and a dozen other emotions come into play, making for an impossible, no-win situation.

Computers may be more accurate, but people have a heart and a conscience. Regardless of how flawed we might be, I’d still rather have a human being in control than a machine.
Deer in Headlines

By Gery L. Deer

Gery L. Deer is an independent columnist and business writer. Get the podcast of Deer In Headlines at
