My first project at Martin Marietta, now Lockheed Martin, was the Autonomous Land Vehicle (ALV), an eight-wheeled truck chassis controlled by artificial intelligence (AI). The government agency sponsoring the work was the Defense Advanced Research Projects Agency (DARPA), the lead R&D agency for the U.S. military. My task was to write the main real-time control loop, which I will refer to as RTCL because I don't remember its official name. All equipment on the vehicle would be controlled by RTCL in a binary sense, i.e. either on or off. After thinking about it for a while, I realized that it had to include a dead man's switch, a mechanism to stop the vehicle if things ran amok, even though the concept was not mentioned anywhere in the requirements handed down by systems engineering. My dead man's switch was actually quite simple: if RTCL ran through a small number of cycles without receiving input from Navigator, the project's name for the AI that would decide in which direction to point the vehicle, it would stop and safe the vehicle. The dead man's switch wasn't strictly necessary for the first test, because a safety officer would walk alongside ALV holding a kill switch attached to the vehicle via a long cord, but it would be essential for subsequent work when the vehicle was truly autonomous.
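I no longer have the RTCL source, but the pattern is simple enough to sketch. The snippet below is only an illustration of the dead man's switch idea, written in Python rather than anything that ran on ALV; every name in it (MAX_MISSED_CYCLES, safe_vehicle, and so on) is invented for the example.

```python
# Minimal sketch of a dead man's switch inside a control loop.
# This is NOT the original RTCL; all names and the cycle limit are hypothetical.

MAX_MISSED_CYCLES = 5   # give up after this many cycles with no Navigator input

def control_loop(read_navigator_command, apply_command, safe_vehicle):
    missed = 0
    while True:
        command = read_navigator_command()   # returns None if nothing arrived this cycle
        if command is None:
            missed += 1
            if missed >= MAX_MISSED_CYCLES:
                safe_vehicle()               # stop the vehicle and shut everything off
                break
        else:
            missed = 0
            apply_command(command)           # turn equipment on or off as requested
```

The point is that the loop fails safe by default: silence from Navigator is treated as a command to stop.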
At the design review, I noted that the dead man's switch was indispensable, as it would be the only thing preventing an out-of-control vehicle from going on a slow-speed rampage. The systems engineering manager -- a jovial fellow who once remarked that none of the women he dated in Colorado knew how to give a proper blowjob -- exclaimed, "Boy, we sure put the right person on this task," since his group had missed such an important requirement. I joked that the vehicle should talk, like the 1928 Porter touring car in My Mother the Car.
The dead man's switch was not the only surprise. The first time the vehicle traveled down the test track, which was slightly more than a mile long, it ran right off the road and onto the grass, forcing the safety officer to kill the engine. We quickly realized that there was very little contrast between the well-worn road, the dirt shoulder, and the brown grass. So management decided to oil the asphalt to make it as black as possible and paint the grass green. This gave Navigator sufficient contrast to determine where the road ended. Besides, Martin Marietta and the Air Force had already dumped so much hydrazine and other noxious chemicals into the ground that a little green paint wouldn't make any difference (the water taps in the small building next to our mobile homes up the hill from the main buildings had signs reading "not potable"; the area is a Superfund site).
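I don't recall how Navigator actually segmented its camera images, but the underlying problem is easy to illustrate with a toy calculation: if the average brightness of the road and of whatever borders it is nearly the same, no threshold will separate them. The numbers below are made up for the example.

```python
# Toy illustration of the contrast problem, not Navigator's actual algorithm.
# Grayscale pixel intensities range from 0 (black) to 255 (white).

def contrast(road_pixels, offroad_pixels):
    """Michelson-style contrast between the mean brightness of two regions."""
    road = sum(road_pixels) / len(road_pixels)
    off = sum(offroad_pixels) / len(offroad_pixels)
    return abs(road - off) / (road + off)

# Worn gray asphalt next to brown grass: nearly identical brightness.
print(contrast([118, 122, 125, 120], [112, 117, 121, 115]))   # ~0.02 -- hopeless
# Freshly oiled (near-black) road next to green-painted grass.
print(contrast([30, 35, 28, 33], [140, 150, 145, 148]))       # ~0.64 -- easy to find the edge
```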
Shortly before I was booted off the project to make room for much more important people, my manager gave me my performance review. I had the impression he wanted to give me a three-star rating, i.e. average. I thought I deserved much more because I had been the one to think of an important requirement even though I was only a software engineer and not a vaunted systems engineer. I guess he wanted me to volunteer to be the one to jump from a car onto the vehicle and kill the engine. I was later told that RTCL had been used throughout the life of the project, the only software to do so.
Perhaps others on the project did, but I never dreamed of autonomous cars the way Philip K. Dick's androids dreamed of electric sheep. But Google did. Autonomous vehicles, now often called self-driving vehicles, still have many of the same problems.
Volvo's North American CEO, Lex Kerssemakers, became angry when his company's self-driving prototype repeatedly displayed recalcitrance during a press event at the Los Angeles Auto Show, exclaiming: "It can't find the lane markings! You need to paint the bloody roads here!"
That's because its AI and associated sensors have trouble distinguishing between the road and the shoulder, and between the various yellow and white painted stripes and dashes. It's not a trivial problem. On a freshly paved and painted road, humans can easily drive down the center of the lane, but as the road wears, even they sometimes have trouble. An estimated 65% of U.S. roads are in poor condition, according to the Department of Transportation, and many parts of the country get snow, which temporarily obscures the road, not to mention snowplows, which mar the surface and scrape away lane markings and other road indicators. Ford is testing its self-drivers in snow for these reasons, while Google's next testing venue is sunny Phoenix, Arizona.
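None of these companies publish their detection code, but the flavor of the problem can be sketched: decide, pixel by pixel, whether you are looking at white paint, yellow paint, or something else, then fit lane boundaries to what survives. The thresholds and names below are invented for illustration; a production system uses calibrated cameras and far more robust models.

```python
# Toy sketch of classifying a painted stripe by color (RGB values, 0-255).
# Thresholds are invented for illustration only.

def classify_stripe(r, g, b):
    if r > 180 and g > 180 and b > 180:
        return "white"        # lane edge or dashed divider
    if r > 150 and g > 120 and b < 100:
        return "yellow"       # center line or no-passing zone
    return "unknown"          # faded paint, shadow, snow, or bare asphalt

print(classify_stripe(235, 232, 228))   # fresh white paint  -> "white"
print(classify_stripe(200, 170, 60))    # fresh yellow paint -> "yellow"
print(classify_stripe(140, 135, 125))   # worn, dirty paint  -> "unknown"
```

Worn paint lands in the "unknown" bucket, which is exactly Kerssemakers' complaint.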
Most self-driving testing has occurred in California, where snow is a rarity, so Google and the other companies have not really tested their vehicles in real-world conditions. That has not stopped some people from making outrageous comments -- for example, Chris Urmson, director of self-driving cars at Google, who wants his 11-year-old son to never have to take a driver's test, which is a lot like saying that his son should not learn to swim because he'll probably never be thrown into deep water. But then again, given the high percentage of young people who think nothing of texting while driving, perhaps it is a good idea that the next generation never drives a motor vehicle.
I want to see a self-driver running in one of Denver's summer hailstorms, when the pavement is a mix of water and ice and visibility for man, beast, and beastly inventions sometimes drops to just a few meters, not to mention the noise and damage that large ice particles can cause. A human's sensors are protected by the car's exterior, but a self-driver's sensors are at the mercy of the elements. And the LIDAR sensors would have an electronic cow, swamped by thousands of small, constantly changing reflections per second.
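I have no idea how any particular manufacturer filters its point clouds, but the obvious defensive move is to discard returns that don't persist from one scan to the next, since a hailstone will not be in the same place twice. A crude sketch, with invented names and tolerances:

```python
# Crude sketch of filtering transient LIDAR returns such as hail.
# A real system works on dense 3-D point clouds; here a "scan" is just a list
# of (x, y) points, and the grid size is invented for illustration.

def persistent_points(prev_scan, curr_scan, grid=0.2):
    """Keep only points that fall in roughly the same cell on consecutive scans."""
    def cell(point):
        return (round(point[0] / grid), round(point[1] / grid))
    prev_cells = {cell(p) for p in prev_scan}
    return [p for p in curr_scan if cell(p) in prev_cells]

scan1 = [(10.0, 2.1), (10.1, 2.0), (3.7, 8.8)]   # a wall plus one hailstone
scan2 = [(10.0, 2.1), (10.1, 2.1), (6.2, 1.4)]   # the wall plus a different hailstone
print(persistent_points(scan1, scan2))            # only the wall survives
```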
Another concern is the tires. Will self-driver developers include code and sensors to scan the tread and determine the type of tires -- highway, rain, snow, sand, mud, or bald -- mounted on the vehicle, or will the vehicle figure that out after it starts to skid?
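I don't know what any of the developers actually do about tires, but detecting the skid after the fact is straightforward in principle: compare how fast the wheels are turning with how fast the vehicle is actually moving over the ground. A toy sketch, with an invented threshold:

```python
# Toy sketch of wheel-slip detection; names, units, and threshold are invented.
# slip = |wheel speed - ground speed| / ground speed; a large value means the
# tires are spinning or sliding rather than gripping.

def slip_ratio(wheel_speed_mps, ground_speed_mps):
    return abs(wheel_speed_mps - ground_speed_mps) / max(ground_speed_mps, 0.1)

def skidding(wheel_speed_mps, ground_speed_mps, threshold=0.2):
    return slip_ratio(wheel_speed_mps, ground_speed_mps) > threshold

print(skidding(wheel_speed_mps=27.0, ground_speed_mps=26.5))   # dry pavement: False
print(skidding(wheel_speed_mps=27.0, ground_speed_mps=18.0))   # spinning on ice: True
```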
Google's vehicles are not close to being ready for prime-time driving, given the recent accident in which one of its self-drivers ran into the side of a bus. The story got even better a few days later, when Google received a patent for avoiding car-bus collisions.
An uncommon but nonetheless possible scenario: a criminal or terrorist steps in front of your car with a weapon pointed your way. With a normal car you can stomp on the gas and aim directly at him. If he moves out of the way, he'll be too busy to shoot at you. If he doesn't move out of the way, well, he knew the job was dangerous when he took it. But with a self-driver, the car will stop and allow the cretin to shoot the passengers and/or carjack the vehicle. The day the car stood still, if you will, when it becomes your ex-machina.
Google, MapQuest, and the other map services are mostly reliable, but they still contain the odd mistake. The maps used for self-driver testing are custom-made to ensure that everything goes smoothly, but that would be impossible at mass-production scale.
Some people suggest that cars should surrender control to the human in ambiguous cases, but that person will usually be taking a nap, texting, reading, watching a movie, or having sex. People riding mass transit already do these things. Good luck obtaining a quick, lucid response.
But the real issue here is money. Google wants to eliminate drivers because it intends to use passengers to feed Google AdSense. The combination of smart phone data, verbal conversations, and destinations will be a goldmine for Google's advertisers.
It's not far-fetched at all. Many people believe that Google is just as bad as the NSA.
"Surveillance is the business model of the Internet," Internet security expert Bruce Schneier said. "Corporations call it marketing."
In 2012, Google was fined $22.5 million for misrepresenting the privacy assurances it gave users of Apple's Safari Internet browser, placing tracking cookies on their computers when they stumbled into Google's DoubleClick advertising network.
Jon Leibowitz, Chairman of the FTC, said regarding the $22.5 million fine: "The record setting penalty in this matter sends a clear message to all companies under an FTC privacy order." He went on to say: "No matter how big or small, all companies must abide by FTC orders against them and keep their privacy promises to consumers, or they will end up paying many times what it would have cost to comply in the first place."
In 2012-13, Google was caught red-handed capturing Wi-Fi transmissions as its Google Street View cars traveled around the world. This was not done by accident; it was done intentionally, collecting so-called payload data, which includes names, addresses, telephone numbers, URLs, passwords, e-mail, text messages, medical records, video and audio files, and other information from Internet users. Google paid just $7 million for its actions. It was also fined $25,000 by the FCC for "willfully" ignoring subpoenas and delaying investigations.
All of these fines are a slap on the wrist given Google's 2013 revenue of $57.86 billion, the second consecutive year that Google took in over $50 billion; the $22.5 million penalty works out to roughly 0.04% of that one year's revenue. Leibowitz must not be good at math.
In 2013-14, it was revealed that Google was reading and mining user emails even before users had a chance to read them. Google's legal team desperately wanted to avoid a single class action suit because then it would be Goliath v. Goliath, instead of mere individuals taking on the 124th largest corporation in the world. Judge Lucy Koh -- the same one who handles the Apple v. Samsung lawsuits and LinkedIn lawsuits (here's one regarding excessive emails and here's one regarding blind references) -- eventually gave Google what it wanted and dismissed any notion of a class action lawsuit. Koh, who controls much of the Internet via her rulings, has been nominated to the San Francisco-based 9th U.S. Circuit Court of Appeals by Barack Obama.
Early in 2015, in a related matter, Samsung was caught recording the conversations of people sitting in front of its televisions. It was sending conversations to a third-party translator when its voice activation feature was in use. A self-driving car would use voice activation much more than any television.
Later in 2015, despite publicly promising not to do so, Google mined the emails of 40 million K-12 students, collecting browsing history and other data for use with Google AdSense. This episode proved just how low Google would sink, given that students are often required to use school-approved services. Google knew its actions were unacceptable because it had signed the Student Privacy Pledge, an agreement signed by 254 technology companies. The pledge begins with: "We Commit To: 1) Not collect, maintain, use or share student personal information beyond that needed for authorized educational/school purposes, or as authorized by the parent/student. 2) Not sell student personal information."
It's safe to say that the conversational tidbits Google collects in vehicles won't be limited to, "Are we there yet?"
Google's Sergey Brin said in a 2012 interview regarding the Great Firewall of China, "I thought there was no way to put the genie back in the bottle, but now it seems in certain areas the genie has been put back in the bottle," yet both China and North Korea have always been very successful at restricting the flow of information.
Brin also advised that the rise of Facebook and Apple, which have their own proprietary platforms and control access to their users, risked stifling innovation and balkanizing the web.
"There's a lot to be lost," he said. "For example, all the information in apps -- that data is not crawlable by web crawlers. You can't search it."
"You have to play by their rules, which are really restrictive," he said. "The kind of environment that we developed Google in, the reason that we were able to develop a search engine, is the web was so open. Once you get too many rules, that will stifle innovation."
Translation: he thinks it's really, really bad if Google cannot make a ton of loot on every part of the Internet. And he's jealous that Facebook and Apple have that kind of power.
But he is correct about one thing. Google, Facebook, and a few other companies need to be taken down via antitrust law. Google has 55% of the worldwide market for search ads. Its share of U.S. desktop search is around 65%, and its share of U.S. mobile search is close to 90%. For social logins in January 2015, Facebook had 61% of the market in total, 77% on mobile platforms, 72% on e-commerce sites, and 76% on education and non-profit sites. Pew Research Center found that clear majorities of Twitter users (63%) and Facebook users (63%) now say each platform serves as a source for news about events and issues outside the realm of friends and family.
Not to mention that Facebook could be about to [ab]use its position as one of the major purveyors of news to Americans to influence the 2016 presidential election; some Facebook employees appear to be in full agreement, having asked Mark Zuckerberg: "What responsibility does Facebook have to help prevent President Trump in 2017?" In 2014, Facebook manipulated its news feed to research how social media posts affect people's emotions, with none of the participants aware of their part in the test.
"It's one thing for Facebook to A/B test some advertising structure," said Brian Pascal, an attorney and privacy researcher at the University of California Hastings College of Law in San Francisco, referring to internal tests that websites frequently conduct to determine what resonates with visitors. "It's another to tweak their News Feed to manipulate [users'] emotional state."
Zuckerberg famously bought the four houses surrounding his Palo Alto house to expand his personal space, with the sale prices ranging from $4.8 million to more than $14 million. He has called for many more refugees to be admitted into the U.S. and Europe, though he is not going to open his palatial estate to house any of them. "I hear fearful voices calling for building walls," he said, referring to Donald Trump, who has called for a pause in immigration, while ignoring the obvious hypocrisy in his own comment. "Hate speech has no place on Facebook and in our community," he also said, referring to those who oppose unfettered immigration and making one wonder just how far he'll go to achieve his goal.
Like Google, Facebook spent some time in court. In 2011, Facebook settled with the FTC, admitting that it "deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public." EPIC has an entire website devoted to Facebook's privacy violations.
In 2013, Facebook changed its terms to reflect its stance that it has the same rights as LinkedIn and Twitter, the latter declaring "a worldwide, non-exclusive, royalty-free license (with the right to sublicense)" for photos and other property uploaded to its servers, for its own use and for the use of its partners. LinkedIn gave itself the right to "copy, prepare derivative works of, improve, distribute, publish, remove, retain, add, process, analyze, use and commercialise, in any way now known or in the future discovered." And given that LinkedIn and Facebook pages are regularly checked by employers, potential or current, and that the lack of one is treated as a grave omission, Americans are essentially forced to accept these onerous terms.
Facebook does not appear to be entering the self-driving vehicle business, but it is entering the drone business. Bringing your author's story full circle, Facebook has hired Regina Dugan, the former head of DARPA, to lead a new team called Building 8, which will manage drones, lasers, AI, and virtual reality. Facebook's presentation did not include any revelations, but it did hint at body language interpretation. The presentation included the opinion that "virtual reality has the potential to be more social than any other platform," but how can artificial images and sounds ever be as rewarding as actual person-to-person contact? As for AI, Facebook wants us to accept its technology "that allows photos to be searched for and classified by the image context rather than tags, and classifying videos in real time to help navigate the increasingly rich content people create every day," i.e. it will use its arcane algorithms to further isolate people. Slate noted that Facebook is "propelling startups like BuzzFeed and Vox to national prominence while 100-year-old newspapers wither and die."
Perhaps Facebook can use some of that gee-whiz technology to prevent photo abuse, as happened to a St. Louis family who discovered that their photo was being used in advertising in Prague without their permission or knowledge. It's possible, though highly unlikely, that the photo-thief saw the photo while browsing through millions of accounts, but it's much more likely that Facebook's algorithms found the photo because it's cute and placed it on a common page where the photo-thief saw it. In other words, Facebook is equally to blame.
The canary in the coal mine of democracy has already started to shudder. Many Americans, left, right, and quasi-libertarian, have adopted George W. Bush's mantra that everyone is either with you or against you. Reading only articles that reinforce existing viewpoints will only accelerate our decline.
Google's Brin said, "If we could wave a magic wand and not be subject to U.S. law, that would be great," even though he loves living in the U.S. with all of its benefits and protections, having his cake and eating it too. His sentiment is probably shared by management at Facebook, Twitter, LinkedIn, and Apple.
The old Russian proverb concerning corruption, "the fish rots from the head," appears to apply to the entire social media sector.