Driver's education class in high school was mostly a waste of time. The coaches who taught it were only interested in flirting with the attractive girls. And if a guy wasn't on the football team, he was just another blockhead.
One lesson I quickly learned after receiving my driver's license was to check the rearview mirror before stopping at a traffic light. At first I did what I had been instructed to do and stopped when the light turned yellow, but then I was rear-ended by a tailgating driver. Around the same time, my father told me about a food delivery truck that was approaching a traffic light when it turned yellow. The driver glanced in his mirror only to see a tractor trailer right behind him. The delivery truck driver ran the red light, followed by the tractor trailer. It so happened that a police officer was monitoring the light and pulled them both over. The officer let the delivery truck driver go without even a warning and gave the tractor trailer driver a ticket. From then on, I have checked the rearview mirror and stopped only if there is no tailgater behind me.
When I first visited Southern California many years ago, I was shocked at the behavior at traffic lights. Not only would drivers not stop at yellow lights, but three or four cars would travel through the light after it turned red.
Given the large number of people from other states in the Denver area, especially from California, I don't stop at yellow lights nearly as much as I used to.
The many accidents Google's self-driving cars have had are no doubt due to their designers not taking real-world conditions and behavior into account.
Mercedes-Benz announced that its precious self-driving car would run over a child rather than risk harming the passengers riding inside the vehicle.
Let's do the critical analysis Mercedes-Benz neglected to do. The likelihood of a child running in front of a vehicle on an interstate highway approaches zero. The realistic scenario is a residential street with cars parked on one or both sides. These roads always have a low speed limit, no higher than 35 mph and often only 25 mph. Avoiding a child by side-swiping a parked car would subject the passengers to an impact at far less than even the top residential speed of 35 mph, because the side-swipe would absorb much of the energy. And as long as the passengers were wearing seat belts -- I assume regulators have the integrity to mandate that Mercedes-Benz and other manufacturers not allow a self-driving vehicle to travel without all passengers being buckled in -- not only would the passengers survive such a crash, they wouldn't even suffer serious injuries.
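The energy argument can be made concrete with a back-of-the-envelope calculation: kinetic energy grows with the square of speed, so a residential-speed impact carries only a fraction of the energy of a highway-speed one. A minimal sketch -- the vehicle mass and the speeds compared are illustrative assumptions, not figures from any crash study:

```python
def kinetic_energy_joules(mass_kg, speed_mph):
    """Kinetic energy E = 1/2 * m * v^2, with speed converted from mph to m/s."""
    mps = speed_mph * 0.44704  # exact mph-to-m/s conversion factor
    return 0.5 * mass_kg * mps ** 2

CAR_MASS_KG = 1500  # rough figure for a mid-size sedan (assumption)

for mph in (25, 35, 65):
    kj = kinetic_energy_joules(CAR_MASS_KG, mph) / 1000
    print(f"{mph} mph: about {kj:.0f} kJ")
```

Because energy scales with the square of speed, a 25 mph impact carries only about half the energy of a 35 mph impact, and a 65 mph highway impact carries well over three times the energy of the worst residential case.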
And that ignores what experienced drivers do when traveling on a residential road. They look for children playing on the front lawns of homes, because one of them might run onto the road. They look for bouncing balls, which imply that a child might not be far behind. And they certainly don't text while driving, because it has been shown to be worse than driving under the influence of alcohol, with deaths from texting while driving surpassing deaths from drunk driving, at least among teens.
The technology simply isn't mature enough. The fatal accident involving a Tesla happened because the sensor, blinded by the bright sky, could not detect the white tractor trailer in its path. The logic was also at fault because it lacked the simple common sense human drivers show every day on highways: when blinded by the early morning or setting sun, drivers simply slow down.
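That "slow down when you can't see" instinct could, in principle, be expressed as a trivial fallback rule. A toy sketch, assuming a hypothetical sensor-confidence score and made-up thresholds -- not how any real vehicle's software works:

```python
def target_speed(current_speed_mph, sensor_confidence):
    """Toy fallback rule: when perception confidence drops (e.g., a camera
    washed out by a bright sky), reduce speed instead of trusting a degraded
    sensor. Thresholds and caps are illustrative assumptions only."""
    if sensor_confidence < 0.5:   # severely degraded: crawl
        return min(current_speed_mph, 15)
    if sensor_confidence < 0.8:   # partially degraded: back off
        return min(current_speed_mph, 40)
    return current_speed_mph      # confident: maintain speed
```

Under this rule, a car doing 65 mph with badly washed-out cameras would slow to a crawl rather than continue at highway speed into something it cannot see.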
One alternative would be to design vehicles that refuse to travel while the driver is using wireless communications of any kind, though that would be a difficult problem to solve.
Mercedes-Benz and like-minded manufacturers must be barred from putting self-driving vehicles with sociopathic logic onto residential streets.