The Wi-Spy Fiasco: The Plot Sickens Part 2
[Editor Charlie sez: This post appeared on May 3 and we thought it should be reposted in light of the House Energy & Commerce Committee hearing on the FCC on July 10]
Engineer Doe: Does he know the difference between self-incrimination and immunity from prosecution?
Wardriving: Drive, Detect, Defend is a 2004 book authored by a number of engineers that includes analysis of the work of one Marius Milner, also known to the FCC as “Engineer Doe,” whom Google apparently wants us all to believe is the “rogue engineer” solely to blame for sliding Google’s Wi-Spy device past the supposedly watchful eyes of the vigilant Google Street View product team. A device that was deployed by Google for years, in hundreds of Google cars, and that sucked down gigantic amounts of data.
According to a review of the book:
Software insides section starts with coverage on the great work of Marius Milner – NetStumbler and its smaller counterpart MiniStumbler. These tools are easy to install, so the coverage is mostly concentrated on the software’s usage. If you were ever snooping around wardriving web sites, you probably know that there is some great software running on Linux platform, but that it isn’t so Windows-like easy to install and use it. I’m talking about tools like Kismet and AirSnort. To satisfy the needs for the general wardriving reader base that is keen into getting all the answers to their questions on one place, the author gives an extensive approach on installing Orinoco and PRISM2 cards for successful co-habitation with Kismet. You’ll just need to copy-and-paste the author’s steps and you’ll see that patching Orinoco drivers to be able to work in the monitor mode was never easier. Kismet is detailed with installation and usage procedures on Slackware 9.1 and Fedora Core 1….
Wardriving is extremely important for the state of wireless security, as it shows how many unprotected WLANs are out there and is therefore directly influencing wireless security awareness. The book will both teach you how to participate in wardriving projects as well as to get familiar on what kind of information outsiders can discover about your wireless networks.
Well, there you go.
According to Wired Magazine:
Back in the early 2000s, Milner wrote NetStumbler, a Windows tool that could be used to pinpoint wireless networks….In an audit made public in 2010, Google called its Wi-Fi logging software Gstumbler. In retrospect, that should have been a tip-off….
What got Google into trouble, though, was its practice of indiscriminately logging wireless packets with its Street View cars up until 2010. Google recorded any data traveling on unsecured wireless networks at the moment the car drove by, which included full e-mails and passwords.
Ironically, Milner’s NetStumbler software wasn’t up to the task of war driving at Google-scale. It simply didn’t do the packet sniffing Google needed to grab information. So instead Google and Milner based the Street View system on a more powerful Linux program, called Kismet, that could. [See Wardriving book review above.]
Google needed more powerful software to log the unique identifiers of routers — even ones with faint signals — to help it build its geolocation services. Doing so does not require logging the contents of the packets.
Street View engineers intentionally stored the content on Milner’s hunch that it could be useful to know what websites people were visiting. That’s what attracted the attention of government authorities….”
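The Wired passage draws a sharp technical line: geolocation needs only a router’s identifier (its BSSID, carried in broadcast management frames such as beacons), while payload lives in a different frame type entirely. A minimal sketch can illustrate how distinct the two operations are at the frame level. The frame-control and address-field layout below follows the 802.11 MAC format; the function names and sample frames are hypothetical, and this is an illustration of the distinction, not Google’s actual code.

```python
def frame_type(frame: bytes) -> int:
    """Extract the 802.11 type field (bits 2-3 of the first frame-control
    byte): 0 = management, 1 = control, 2 = data."""
    return (frame[0] >> 2) & 0b11

def log_access_point(frame: bytes) -> str:
    """Geolocation needs only the BSSID (address 3, bytes 16-21 of the
    MAC header in a beacon frame) -- the payload is never touched."""
    bssid = frame[16:22]
    return ":".join(f"{b:02x}" for b in bssid)

def handle(frame: bytes, capture_payload: bool = False) -> dict:
    """Management frames are logged for mapping; data-frame payloads are
    dropped unless someone deliberately sets capture_payload=True."""
    if frame_type(frame) == 0:              # management (e.g. a beacon)
        return {"bssid": log_access_point(frame)}
    if frame_type(frame) == 2 and capture_payload:
        # This branch is the Wi-Spy problem: the 24-byte MAC header is
        # skipped and the payload kept -- it only runs if written on purpose.
        return {"payload": frame[24:]}
    return {}

# Example: a minimal (hypothetical) beacon frame -- 0x80 marks a beacon.
beacon = (bytes([0x80, 0x00]) + b"\x00\x00" + b"\xff" * 6 + b"\xaa" * 6
          + bytes.fromhex("000102030405") + b"\x00\x00")
print(handle(beacon))  # {'bssid': '00:01:02:03:04:05'}
```

Note that the payload branch is a separate code path gated behind an explicit flag: nothing about collecting BSSIDs drifts into collecting content by itself.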
In a fascinating and very relevant blog post at Harvard Law School, Christian Sandvig writes in “Confessions of a Spy Car Driver”:
This is a topic close to my heart because my research group has been conducting similar surveys of wireless signals for the past five years as part of a project funded by the US National Science Foundation….This is akin to doing a survey of telephone adoption by counting telephone poles. We can do this research from public streets and sidewalks. We are looking at unencrypted information that is broadcast in the clear to everyone anyway (called the “management frame” — this information is what creates the list of available Wi-Fi access points [on your laptop]). We don’t look at the content of the transmissions….
Google was trying to do the same thing that my wireless research group was doing — again, no ethical problems there. However, they claim to have “inadvertently” also listened to the content of communications. (This is called “payload” data.) Here’s the problem with the story we’re getting from Google: the word “inadvertently.”
I see no way that this could be inadvertent. Continuing my earlier metaphor: If your plan is to count telephone poles how would you “inadvertently” tap telephone lines and transcribe everything that you hear? The two actions are quite different….However, I don’t understand how we could ever have “inadvertently” done that….Even sampling some of the payload data would start producing about 10x as much data as listening only to the management frame. Do you ever go to the store for a can of soda and “inadvertently” fill your cart with ten cans? I didn’t think so.
If you inadvertently started buying 10x as many groceries as you wanted, I bet you’d notice. I bet it would take you less than three years to notice, too.
The only interpretation I can think of is that the word “inadvertently” is being applied by the legal department….Programmers may have set it up on their own initiative and not briefed anyone else who could have seen this disaster looming, but that isn’t “inadvertent.” And it isn’t a “programming error” – another phrase that is being used in the press.
From here Google looks pretty guilty.”
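Sandvig’s “10x as much data” point survives a back-of-envelope check. Every traffic figure below is an assumption chosen purely for illustration; the only number grounded in the 802.11 standard is the default beacon interval of roughly 102.4 ms, i.e. about ten beacons per second per access point.

```python
# Back-of-envelope comparison with illustrative (assumed) numbers:
# a survey car is in range of a given access point for a few seconds.
BEACON_BYTES = 200              # typical beacon frame size (assumption)
BEACONS_PER_SEC = 10            # 802.11 default beacon interval ~102.4 ms
PAYLOAD_BYTES_PER_SEC = 25_000  # modestly active network (assumption)

def bytes_captured(seconds: float, capture_payload: bool) -> int:
    """Bytes collected from one access point during a drive-by."""
    management = int(seconds * BEACONS_PER_SEC * BEACON_BYTES)
    payload = int(seconds * PAYLOAD_BYTES_PER_SEC) if capture_payload else 0
    return management + payload

mgmt_only = bytes_captured(5, capture_payload=False)    # 10,000 bytes
with_payload = bytes_captured(5, capture_payload=True)  # 135,000 bytes
print(with_payload / mgmt_only)  # 13.5
```

Even with these conservative assumptions the payload capture swells the take by an order of magnitude, which is Sandvig’s point: a tenfold jump in collected data is not something an engineering team fails to notice for three years.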
Please read the original post; it is exactly on point, amazingly prescient, and provides excellent technical background in considerable detail. And it was written two years ago, if anyone had cared to read it.
So again—Mr. Milner seems clearly to have been a leading authority on wardriving, with deep expertise directly relevant to the Wi-Spy aspect of Google’s managed driving program. It appears that he is as he was described by a recommender in his LinkedIn profile—“a GOD” of WiFi. It also appears unlikely that anyone could have stumbled—so to speak—onto Google’s Wi-Spy business “inadvertently,” by accident, or on their “20% time.”
The more plausible explanation is that once Mr. Milner’s design document becomes public—which it almost surely will—the conclusion that anyone capable of sequential thought would draw from it is this:
Google hired Mr. Milner to design the Wi-Spy program to do exactly what it did, and it performed to perfection. I would not be surprised if the design document reveals that Mr. Milner did not tell his managers at Google what he should do; rather, he did what his managers at Google told him to do, and he did his job very, very well. And whoever told him what to do is now in a position to marshal Google’s considerable resources to hide the truth, and to enlist Google’s Palace Guard of unquestioning lawyers to do it.
I would also not be surprised to learn that everyone involved knew that what they were doing was highly illegal, and that they intentionally kept it away from everyone who could stop them—such as the lawyers. And as Nathan Muir might say, the lies that now have to be true keep getting bigger and bigger and involve a larger and larger group of people.
Paper Corporate Governance
The reason there were people who could stop the project is that Google evidently has procedures—at least on paper—that protect the corporation and its stockholders and establish the scope of authority within which employees are to act in the conduct of their jobs. Meaning that if there were such a procedure and employees of Google purposely avoided complying with it, they were arguably acting outside the scope of their authority.
As Robert Enderle wrote in an excellent Datamation op-ed that sums up the danger to Google’s customers (like, oh, say the United States Government):
The issue…is that an employee in what is an increasingly complex company could make a decision and implement it without corporate oversight [assuming Engineer Doe acted alone]. This means that decisions like this could not only land Google in hot water, as it clearly did in this instance, but it could land a company using the Google solution in hot water or do avoidable damage.
The typical definition of an enterprise-class vendor is one that has the hardware, software, and services needed to serve the largest of companies. They also have to have a track record of doing so successfully, which does create a cart and horse problem with new entries like Google. Often this is initially mitigated by working with an enterprise partner, which is what firms like Microsoft, Novell, RedHat, Citrix and even RIM did to get their stature.
Doing it this way made sure there were always experienced teams in charge of assuring the solution. …Behind this definition has always been an assumed robust process for creating, maintaining, and repairing this class of offering that takes into account the critical nature of enterprise tools and assures these massive companies don’t have catastrophic failures….[A]s a class [enterprise vendors] tend to be far more control-oriented, because the liability, both in terms of cash and in terms of image, is vastly different if a smartphone has a bug or if you shut down Boeing.
[A]nother often missed aspect of a true enterprise vendor is that their board typically has a large contingent of diverse operating enterprises represented….At Google there is only one such board member….The Google board is mostly staffed by investment bankers, not people who have current experience building or selling products in any market Google serves except through their investments.
In fact, when you look at Google’s board, you can imagine they would have governance issues because the board would be more focused on investments Google might make than in how Google was being run.
So it is not surprising that “Engineer Doe”—and, I would argue, his supervisors as well, because I think Google is lying about the “lone wolf” theory—could be personally liable for claims arising out of the Wi-Spy program, such as violation of the Wiretap Act as well as a host of other laws in other countries.
And this is the kind of situation where someone might want to invoke their privilege against self-incrimination. Which the FCC tells us is exactly what Engineer Doe chose to do.
There is, of course, a big difference between taking the 5th and being immunized by a prosecutor (or by the Congress). First of all, the 5th Amendment is just that—an amendment to the United States Constitution. As Google’s Sergey Brin recently told the Guardian, he would like to wave a magic wand and not be subject to U.S. law—but he would, of course, still be subject to the laws of the U.K., Germany, France…and so on. (Which makes me think what he really meant was he’d like to wave a magic wand and not be subject to any laws at all. That’s how Google acts, in any event.)
So if Engineer Doe thought he got somewhere by taking the 5th, he may have only made the government’s case harder in the U.S.—we’ll see how he fares overseas. By the look of things, Engineer Doe is going to be spending a lot of time in courtrooms for the foreseeable future.
Which is really a shame. This is a guy who I feel pretty certain was just doing what he was told and now it appears that he is being hung out to dry by his employer. The entire premise of the “rogue engineer” theory is so flimsy it is insulting to the intelligence, and certainly will do nothing to further the career of a clearly talented hacker. On the one hand, if it is true that a “lone wolf” was able to implement a vast program of data theft in 30 countries without corporate oversight, Google is screwed because they are incompetent and should never be trusted with another government contract. If on the other hand Google is lying, they are malevolent frauds and should never be trusted with another government contract and probably should be investigated by the Securities and Exchange Commission, if not criminally prosecuted. In either case, Google’s apologies are meaningless given—to use one of Google’s favorite words—the “scale” of the data theft.
The episode is a good example of what happens when the hacker ethos is adopted by a publicly traded multinational corporation—if the machine is capable of doing something, there is no further need for ethical inquiry by humans. Like so many of Google’s problems, they all trace back to this fundamental flaw in judgment that is often incorrectly called hubris.
It is not hubris. It seems that the Google managers really don’t think there’s anything wrong with what they did, because they believe that if they can design a machine that can do a thing, humans ought not stand in the way.
There are few societies that do not have some version of “thou shalt not steal” in their shared moral responsibilities to their citizens. Accepting these responsibilities permits society to function. The responsibility cannot be avoided by designing a machine to do the stealing.
Morality is not code—the law requires humans to take responsibility for the machines they design because humans ought not let the machine do a thing that is immoral regardless of its capabilities.
Unfortunately for Engineer Doe, he will be judged by the law of humans and not by the code of machines.