Self-driving cars are closer than ever to becoming a new transportation option, accompanied by another wave of federal guidelines that manufacturers will need to follow. The problem with these autonomous cars is that they take some of the responsibility away from the person behind the steering wheel. When there is nobody to blame for an incident like the fatal Tesla crash in May, lawmakers are left with a big question.
When people die in a self-driving car accident, who pays the price? Suing the manufacturer is often the only recourse, and that could take months to resolve. Another question raised recently is a moral one: if an accident is unavoidable, should the car sacrifice the driver and passengers, or the pedestrians? Some might invoke the classic Spock adage from Star Trek II, "The needs of the many outweigh the needs of the few or the one." Others would argue that any death at all is unacceptable.
Sadly, that sidesteps the question of why we feel the need to put control of a motor vehicle in the hands of artificial intelligence at all, a danger Professor Stephen Hawking has warned us about. For years, responsibility has rested with the person behind the wheel. Allowing drivers to let go of that responsibility practically negates the very reason we have driver's licenses.
A 3D-printed driverless vehicle, a cross between a bus and a taxi cab, is already in production in Maryland, with rollouts planned for other big cities. No problems have resulted from it yet, but they can happen. All it will take is one reckless driver and a glitchy sensor that fails to respond to that driver.
These questions and more are at the heart of what the federal government is attempting to work out. When you take responsibility out of the hands of the driver, who is responsible? And what can be done about it?
Many automakers are already planning to add autonomous features to their vehicles, and lawmakers are looking into how to maintain general safety when those features are in use. The features will reportedly include self-parking, highway lane changing, and remote driving on private property.
However, even the regulations on motor vehicles with driver-only controls aren't enough, as many ordinary drivers still aren't careful.
Texting, talking on the phone, dealing with misbehaving children in the back seat, and even driving while inebriated all still happen, despite laws threatening heavy fines, jail time, suspended licenses, and more. The latest epidemic of people playing Pokemon GO while driving, or even while walking across the road, only adds to the problem. Self-driving cars don't care if Pikachu is in their path, and they might not be able to stop quickly enough to avoid hitting a gamer who suddenly darts in front of them.
Self-driving cars appear to be aiming to arrive among us by 2020, and the regulations remain a patchwork. No matter how safe these rules aim to make the vehicles, the core problem remains: the driver is allowed to forgo responsibility for what the vehicle decides to do.
Do you think self-driving cars can be made safe for public use, or are they just another problem in the making, catering to irresponsible drivers?
[Image via Vladyslav Starozhylov/Shutterstock.com]