Decades ago, either Popular Mechanics or Popular Science included an article that ostensibly described traffic behavior on highways. The author opined that drivers traveled in wolfpacks, i.e. they traveled in large groups with a good deal of space between wolfpacks. He advised drivers to travel halfway between the wolfpacks to avoid the jostling that might result.
The author completely misunderstood what was going on. People most definitely do not drive in wolfpacks. Slower drivers cause a temporary bottleneck, delaying faster drivers. It is true that some drivers, for whatever reason, never pass the driver in front of them, with some of them even tailgating. If one could see it from above, one would see that the so-called wolfpacks are constantly dissolving and reforming in different places with different vehicles, with the faster drivers eventually making it through one obstruction only to run into another one.
A related observation can be made by someone looking out the window of a tall building that overlooks a highway. Traffic delays do not remain in one place, but actually move down the road in the direction of approaching vehicles, in a classic example of wave motion.
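This backward-traveling wave can be reproduced with a toy car-following model. The sketch below is an assumed, deliberately crude model (constant target gap, instantaneous speed adjustment), not anything from the original article: one brief slowdown by the lead car ripples rearward through the queue, reaching each following car one time step later.

```python
# Toy car-following simulation (illustrative model, simplified physics).
# A line of cars maintains a target gap; the lead car brakes briefly,
# and the slowdown propagates backward through the queue like a wave.

def simulate(n_cars=5, steps=40, dt=0.5):
    pos = [float(-10 * i) for i in range(n_cars)]  # car 0 leads
    vel = [10.0] * n_cars                          # m/s, free-flow speed
    slow_times = [None] * n_cars                   # step at which each car first slows
    for t in range(steps):
        # The lead car brakes briefly early in the run
        vel[0] = 2.0 if 5 <= t < 10 else 10.0
        for i in range(1, n_cars):
            gap = pos[i - 1] - pos[i]
            # Adjust speed to keep roughly a 5 m gap, capped at free-flow speed
            vel[i] = min(10.0, max(0.0, (gap - 5.0) / dt))
            if vel[i] < 9.0 and slow_times[i] is None:
                slow_times[i] = t
        for i in range(n_cars):
            pos[i] += vel[i] * dt
    return slow_times

# Cars further back slow down at successively later steps:
# the disturbance travels rearward, toward approaching vehicles.
print(simulate())
```

Running it shows each following car slowing one step after the car ahead of it, which is exactly the rearward-moving delay visible from that tall building.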
Ford and Jaguar Land Rover have created a technology called Green Light Optimal Speed Advisory (GLOSA). They propose to allow drivers to time the lights, i.e. travel at the proper speed to reach the next traffic light so that it turns green just as they reach it.
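The core of any GLOSA-style advisory is a simple calculation. The sketch below is my own illustration of the idea, not Ford's or Jaguar Land Rover's implementation; it assumes the vehicle knows the distance to the light and the signal's timing (green start, green duration, cycle length), and it searches for a legal speed that arrives during a green window.

```python
# Illustrative GLOSA-style calculation (hypothetical, not the vendors' code).
# Given distance to the light and its signal timing, find a speed within
# [v_min, v_max] (m/s) that reaches the light while it is green.

def advisory_speed(distance_m, now_s, green_start_s, green_len_s,
                   cycle_s, v_min=8.0, v_max=15.0):
    """Return an arrival speed in m/s that hits a green window,
    or None if no legal speed works within the next few cycles."""
    for k in range(4):  # check the next few green windows
        g0 = green_start_s + k * cycle_s
        g1 = g0 + green_len_s
        t0, t1 = max(g0, now_s), g1  # feasible arrival times in this window
        if t1 <= now_s:
            continue
        # Speed needed to arrive at time t is distance / (t - now)
        fastest = distance_m / (t0 - now_s) if t0 > now_s else float("inf")
        slowest = distance_m / (t1 - now_s)
        lo, hi = max(slowest, v_min), min(fastest, v_max)
        if lo <= hi:
            return hi  # prefer the faster end to minimize delay
    return None

# 300 m from a light whose green runs from t=20s to t=35s in a 60s cycle:
print(advisory_speed(300, 0, 20, 15, 60))  # 15.0 m/s arrives right at t=20s
```

Note that whenever the computed speed is below the prevailing traffic speed, the GLOSA vehicle must slow down relative to everyone else, which is precisely the conflict described below.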
There are at least five different types of drivers around our hero vehicle:
1) People who will continue through the next traffic light.
2) People who will turn left at the next traffic light.
3) People who will turn right at the next traffic light.
4) People who will turn left before the next traffic light.
5) People who will turn right before the next traffic light.
If the GLOSA vehicle selfishly slows down to time the lights, assuming it is continuing through the light, it may delay vehicles from #3, #4, and #5 behind it. If the traffic light is configured to allow left turns before continuing traffic, vehicles from #2 behind it may miss the next cycle.
Given that a GLOSA vehicle will most likely be programmed to not exceed the speed limit, it will almost certainly cause delays given that many people travel faster than the speed limit.
And we may see more road rage incidents from drivers angry that the GLOSA vehicle is driving as if it were the only vehicle on the road.
If all vehicles on the road were self-driving and governed by GLOSA, the system could allow traffic in general to move as fast as possible, but having only some vehicles governed by GLOSA is a wacky idea.
Driver's education class in high school was mostly a waste of time. The coaches who taught it were only interested in flirting with the attractive girls. And if a guy wasn't on the football team, he was just another blockhead.
One lesson I quickly learned after receiving my driver's license was to check the rear view mirror before stopping at a traffic light. At first I did as I had been instructed and stopped when the light turned yellow, but then I was rear-ended by a tailgating driver. Around the same time my father told me of a food delivery truck which was approaching a traffic light when it turned yellow. The driver glanced in his mirror only to see a tractor trailer right behind him. The delivery truck driver ran the red light, followed by the tractor trailer. It so happened that a police officer was monitoring the light and made them both stop. The police officer let the delivery truck driver go without even a warning and gave the tractor trailer driver a ticket. So from then on, I check the rear view mirror and stop only if there is no tailgater behind me.
When I first visited Southern California many years ago, I was shocked at the behavior at traffic lights. Not only would drivers not stop at yellow lights, 3-4 drivers would travel through the light after it turned red.
Given the large numbers of people from other states in the Denver area, especially from California, I don't stop at yellow lights nearly as much as I used to.
The many accidents Google's self-driving cars have had are no doubt due to their designers not taking real-world conditions and behavior into account.
Mercedes-Benz announced that its precious self-driving car would run over a child rather than risk the passengers riding inside the vehicle.
Let's do the critical analysis Mercedes-Benz neglected to do. The likelihood of a child running in front of a vehicle on an interstate highway approaches zero. The only time this would happen would be on a residential street with cars parked on one or both sides. These roads always have a low speed limit, no higher than 35 mph and often only 25 mph. Avoiding a child by side-swiping a parked car would result in the passengers being subjected to far less than even the top residential speed of 35 mph because the side-swiping would absorb much of the energy. And as long as the passengers were wearing seat belts -- I assume regulators have the integrity to mandate that Mercedes-Benz and other manufacturers not allow a self-driving vehicle to travel without all passengers being buckled in -- not only would the passengers survive such a crash, they wouldn't even suffer serious injuries.
And that ignores what experienced drivers do when traveling on a residential road. They look for children playing on the front lawns of homes, because one of them might run onto the road. They look for bouncing balls, which imply that a child might not be far behind. And they certainly don't text while driving, because it has been proven to be worse than driving under the influence of alcohol, with deaths from texting while driving surpassing deaths from driving under the influence, at least for teens.
The technology simply isn't mature enough. The fatal accident involving a Tesla was caused by the sensor being blinded by the bright sky, leaving it unable to detect that a white tractor trailer was in its path. The logic was also at fault because it lacked the simple common sense that human drivers have, as seen every day on highways: when drivers are blinded by the early morning or setting sun, they simply slow down.
One alternative would be to design vehicles to not travel when the driver is using wireless communications of any kind, though that would be a difficult problem to solve.
Mercedes-Benz and like-minded manufacturers must be barred from putting self-driving vehicles with sociopathic logic onto residential streets.
Chris Valasek, director of security intelligence for IOActive, opined that the defense against cars being remotely hacked would be "a device you could plug into the car to stop any of the attacks we've done and that others have done." He went on to say that the solution would be as simple as "an algorithm that detects attacks and prevents them."
You mean like anti-virus which does not work against zero-days?
One reason vehicles connected to the Internet are vulnerable is that they have one big system. Once a hacker is in, he's got access to the entire vehicle. Car makers could have designed them with firewalls in between the various systems, but they often didn't.
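The segmentation idea can be sketched in a few lines. This is a hypothetical illustration of what an in-vehicle gateway could enforce (the segment names and message IDs are invented for the example): the infotainment network is forwarded nothing toward the drivetrain, and the drivetrain exposes only a short whitelist of read-only status messages to the infotainment side.

```python
# Hypothetical in-vehicle gateway policy (illustrative names and IDs).
# Segments are isolated by default; only whitelisted message IDs cross,
# and only in the permitted direction.

ALLOWED_TO_DRIVETRAIN = set()             # infotainment may send nothing
ALLOWED_TO_INFOTAINMENT = {0x1A0, 0x2B0}  # e.g. speed and fuel level, for display

def gateway_forward(src, msg_id):
    """Return True if a frame from segment `src` may cross the gateway."""
    if src == "infotainment":
        return msg_id in ALLOWED_TO_DRIVETRAIN
    if src == "drivetrain":
        return msg_id in ALLOWED_TO_INFOTAINMENT
    return False  # frames from unknown segments are dropped by default

# A compromised head unit cannot inject brake or steering commands:
assert gateway_forward("infotainment", 0x0F1) is False
# But the dashboard can still display the current speed:
assert gateway_forward("drivetrain", 0x1A0) is True
```

With this default-deny design, a hacker who compromises the Internet-facing head unit is contained in one segment instead of owning the entire vehicle.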
Vehicles also could have been designed with an Internet kill switch, but automaker market research has told them that most buyers want a traveling router. You'd think they'd understand that a significant minority would purchase an unhackable vehicle.
The requirements of brakes and steering for passengers in a self-driving vehicle need to be well crafted, not least because shysters will sue when an unusual accident occurs.
I forgot to mention a few things in my last post.
Each vehicle needs a specialized hammer to be used in case the power windows and locks are not functional, so that the passengers can break the glass and get out.
Each vehicle needs a manually operated fire extinguisher for obvious reasons.
And each vehicle needs a means of communication with the vehicle operator and Enhanced 911, but that's probably already in the plans.
As I wrote before, it will be a long time before self-driving vehicles are to the point where they can be trusted in all driving conditions. However, the regulators of California have "opened a pathway for the public to get self-driving cars of the future that lack a steering wheel or pedals."
This is world-class incompetence.
Self-driving vehicles must retain three mechanisms -- steering wheel, brakes, and power-off switch -- to allow a human to override the system in case of failure. This is not to say that humans will often use this capability, as most of the time they will be napping, playing games, watching video, or texting, and will not be in the right frame of mind to make last-minute decisions. However, the option must always be available in case something breaks.
The option to override the vehicle must be present in case of hacking because, by definition, the vehicle will always be connected to the Internet and therefore susceptible to malware. It is not possible to guarantee that a vehicle has impregnable defenses against malware, so humans must always be able to take manual control.
To avoid accidental use of the override, there needs to be a switch which must first be activated, with this switch being unlikely to be activated in day-to-day operations. It would be completely unacceptable to require the use of a key, password, or fingerprint because if the switch is needed, it would be needed immediately. Something as simple as a switch with a cover which hinges out of the way would suffice.
The other states should pass legislation to mandate the above to prevent being californicated via the Full Faith and Credit Clause of the U.S. Constitution.