Isaac Asimov, a science-fiction author and biochemistry professor, introduced his Three Laws of Robotics in 1942. He later added an overarching Zeroth Law, which states that a robot may not harm humanity or, through inaction, allow humanity to come to harm.
Self-driving cars are difficult to reason about because some driving situations demand fast, ethical decisions. If you're driving at night and a child jumps in front of the car, do you stay the course and hit the child, or swerve and possibly kill yourself? In the near future the car will make these decisions, and there's no guarantee it will choose the ethical one.
These self-driving cars are about more than reducing accidents and commuting without touching a steering wheel. Building them means engineering artificial intelligence capable of making these complex judgments.
Can it be done? Humans are a complicated species. We have morals, rights, and original thoughts that have evolved over time. We have emotions that shape our decisions. Can we take everything that makes us unique and entrust it to a computer?
Time Will Tell
It's only a matter of time until the engineers at Google master the technology behind self-driving cars. Let's hope they take the Zeroth Law into account.