
Archive for June, 2012

Don’t be Moral: Why Does Google Consistently Deflect Questions of Machine Ethics?

June 30, 2012 Comments off

“As robots become more autonomous, the notion of computer-controlled machines facing ethical decisions is moving out of the realm of science fiction and into the real world.”

“Morals and the Machine,” The Economist

I was invited to a small dinner recently with representatives of Google and Google-backed advocacy and lobbying groups.  This was probably a well-intentioned idea, but dinner conversation soon reverted to a quick review of the canned excuses for piracy that we are all familiar with (see the MTP “Canard du Jour” series).

One canard goes to the heart of the YouTube case–the YouTube interface is “automatic” and Google has no responsibility for what is done with their technology.  This precept has always seemed faulty to me–some human created a machine that does things automatically or, in the case of YouTube, at a low level autonomously.

Therefore, the designer, and in this case the operator, of the machine is trying to shirk responsibility for the actions of the machine she created.

My reaction to that is that the operator of the machine is responsible for the acts of the machine when it is operated the way it was intended.  (Note: don’t start with “guns don’t kill people,” etc.; YouTube is acting the way it is designed and it is doing what it is supposed to do–if you doubt that, try getting an infringing file removed–not a link disabled, not “unmonetized,” but actually removed permanently from YouTube–not a design flaw but a feature.)  I wish this were something that was not taught in the first week or two of law school so I could claim some credit for having a brilliant thought.  Alas, the principle is so well-trodden and clearly stated that you would have to be extraordinarily uninformed or naive to think that you would win this argument before a judge.

Joel Goodsen: Your honor, it’s not my fault–the machine I created did it all by itself!

Judge: Take off your sunglasses and sit down, Mr. Goodsen.

Now this seems obvious, doesn’t it?  And yet these erudite digerati trotted out this red herring and paraded it around the table.  As a famous man once said before doing God’s work, “Well allow me to retort!”

How can anyone avoid responsibility for their actions by creating a machine that does what they tell it to do?  Is this really a principle that we want to elevate in society given what will surely be the onslaught of drone surveillance and drone warfare, cyberattacks and hacking by machine?  Military planners are coming to grips with these issues right now and the same companies that plan on profiting from government contracts may be trying to shape public opinion in preparation for profitable drone surveillance contracts to build and operate eyes in the sky–possibly networked to all the information that Google, Facebook et al have been collecting about us for years and that they definitely plan on collecting in the future.  And profiting from.  And sharing with God knows whom.

And remember–Google and Facebook aren’t subject to Title 10 and Title 50.  What do they do with the information they collect?  Well, it’s just machines doing it, you see.  Don’t get all moral about it.

So naturally they would not want moral judgments to cloud machine efficiency laid on the altar of the Great God Scale.

The Nevada Autonomous Vehicle Regulations

Let’s take a look at the regulations that apply to Google’s driverless cars, or as the State of Nevada refers to them, “autonomous vehicles”:

[A] person shall be deemed the operator of an autonomous vehicle which is operated in autonomous mode when the person causes the autonomous vehicle to engage, regardless of whether the person is physically present in the vehicle while it is engaged….For the purpose of enforcing the traffic laws and other laws applicable to drivers and motor vehicles operated in this State, the operator of an autonomous vehicle that is operated in autonomous mode shall be deemed the driver of the autonomous vehicle regardless of whether the person is physically present in the autonomous vehicle while it is engaged. (emphasis mine)

Well, no kidding.  Wouldn’t it be wonderful if this were some received wisdom that struck like a great epiphany, as it did Paul on the road to Damascus?  But no–it stands for the principle that you can’t avoid liability by saying that the machine did it–the machine you created was operating in the way you designed it, and the user who engaged the technology is deemed the driver.  Stunning, I know, but there it is.

Does it bear repeating that these Nevada regulations are the rules Google agreed to in order to test its driverless cars?  Or does it bear repeating that Google has no liability reserve for the product tort exposure they have just taken on?  Probably not, since the insiders control the company on a 10-to-1 voting basis, so it doesn’t really matter what any stockholder says.

The Spring Gun Cases

The use of machines to do one’s dirty work has never worked out too well.  Legend has it that the final causal link that incited the 1775 Gunpowder Incident in colonial Williamsburg was a spring gun set by the British to protect the Williamsburg magazine.  Two were wounded by the spring gun, and the ensuing riot caused Governor Dunmore to flee the city and declare Virginia to be in rebellion.

But the case that every first-year law student encounters within days of starting their Torts class (unless taught by a pamphleteer) is Bird v. Holbrook, 4 Bing. 628, 130 Eng. Rep. 911 (1828), also known as the Spring Gun Case.  Among other things, Bird stands for an important, some might say crucial, principle of the Common Law expressed in Judge Burrough’s concurrence that “No man can do indirectly that which he is forbidden to do directly.”

Don’t Be Moral

When the Googlers and near-Googlers wanted to ignore Judge Burrough’s admonishment–he is, after all, so old and never tweeted–they seemed genuinely stumped by the proposition that you can’t do indirectly that which you cannot do directly.  That this should be a surprising or difficult concept is itself really surprising.

For example, if Mom tells you that you can’t have any kale chips, the fact that you train your yellow lab Chuck to get them for you doesn’t mean that Mom will let you have them.

If Dad tells you that you have to go to Stanford instead of Cal, the fact that you train a machine to fake Stanford letterhead on your acceptance letter to Cal doesn’t mean you get to go.

You get the idea.  These examples are a bit humorous in order to call attention to what should be a simple proposition, but it’s not like I think that no one at Google knows this.  Clearly they must.  It’s that they must also think the rest of the world is either so stupid that it won’t understand or so wowed by technology that it will forget, or they have some other reason for thinking they can get away with this “the machine did it, not me” kind of reasoning.

So whether it is a root supernode, YouTube or a driverless car (which is not “autonomous,” an unfortunate term used by the Nevada statutes), the fact that the machine is designed to do indirectly that which can only be done directly with substantial liability does not get the machine’s operator out of the liability exposure if the machines go wild.

This is, of course, a simple extension of the concept of taking responsibility for acts of free will, a societal norm that starts somewhere around the burning bush on Mount Horeb and continues on a more or less straight line to the Nevada driverless car statutes.

Some have tried to pass off concern about these free will choices as “moral panics” (even some pen pals of Andrew McLaughlin).  You can understand why a company like Google would want its employees, consultants and fellow travelers to be out beating the blogs about how moral judgments should not be taken into account when considering activities online.  Or, to paraphrase Google’s corporate motto, Don’t Be Moral.  Google should not be surprised that its moral compass is taken into account in judging its actions, because the company put good and evil into the debate nearly from the first day of its existence.

By perpetuating a motto of “Don’t be Evil” while devaluing that moral judgment into “Don’t be Moral” when it comes to intellectual property theft, Google essentially devalues human interaction with the Internet into a machine-like process.  As Jaron Lanier wrote in the New York Times:

Clay Shirky, a professor at New York University’s Interactive Telecommunications Program, has suggested that when people engage in seemingly trivial activities like “re-Tweeting,” relaying on Twitter a short message from someone else, something non-trivial — real thought and creativity — takes place on a grand scale, within a global brain.  That is, people perform machine-like activity, copying and relaying information; the Internet, as a whole, is claimed to perform the creative thinking, the problem solving, the connection making. This is a devaluation of human thought.

Consider too the act of scanning a book into digital form. The historian George Dyson has written that a Google engineer once said to him: “We are not scanning all those books to be read by people. We are scanning them to be read by an A.I.”  While we have yet to see how Google’s book scanning will play out, a machine-centric vision of the project might encourage software that treats books as grist for the mill, decontextualized snippets in one big database, rather than separate expressions from individual writers.

I think it’s pretty obvious that Google would have a difficult time explaining their massive infringement of the world’s books if there were any moral component to it.  That’s probably why they have a digitizing work force kept separate and apart from other Googlers with instructions to call security if anyone speaks to them.  (See “Epsilons at the Brave New Googleplex.”)  That’s not the act of someone who is proud of what they are doing, or who feels that what they are doing is just business.

It’s the act of someone who knows that the Spring Gun Case is still good law.  Particularly because they just agreed it is when they launched their remotely operated cars in Nevada.

See also: Google refuses to rule out face recognition technology despite privacy rows; Google Acquires Facial Recognition Technology Firm PittPatt; Why Facebook’s Facial Recognition is Creepy; Google to track ships at sea including US Navy; If we could wave a magic wand and not be subject to US law that would be great–Sergey Brin.

Election Calendar: Georgia July 2, Tennessee July 3 Voter Registration Deadlines

June 29, 2012 Comments off

Remember–if you’re not registered, you can’t vote.  If you are registered, make sure you know your polling place as well as your early voting or absentee ballot deadlines if you’re going to be on the road on election day.

Georgia’s voter registration deadline is July 2 for the July 31 primary elections.

Tennessee’s voter registration deadline is July 3 for the State Primary General Election on August 2.

Check if you are registered at Can I Vote?, and ask your bandmates if they are registered, too.

Is Android the Joe Camel of Privacy?

June 29, 2012 Comments off

Author @RobertBLevine_ at the Global Forum during Canadian Music Week

June 28, 2012 Comments off

An excellent presentation by Robert Levine (author of the important book, Free Ride) on the failure of the promise of the Internet and what is to be done.  Robert spoke at this year’s Global Forum hosted by Music Canada during Canadian Music Week.

See also: The Wrong Flow: Rightsflow wants you to fix their data

In Which: @AndrewOrlowski Catches Google Calling Up the “Useful Innocents”

June 26, 2012 Comments off

Yes, when sleeping with the enemy is not enough, it’s time for fear and astroturfing in the UK.  The resilient Andrew Orlowski writes in The Register (“Google Orders Spontaneous Support for Parliamentary Motion”) that a former staffer for UK Prime Ministers Tony Blair and Gordon Brown–and, wait for it, a current Google employee–sent an email “call to action” to leaders of the UK-based “consumer rights” groups Open Rights Group and Consumer Focus.

MTP readers and followers of Google’s influence on the UK government should not confuse the former staffer, Theo Bertram, with current UK Prime Minister’s aide Steve Hilton.  No, no, not the same guy.  Steve Hilton is married to Rachel Whetstone, Google’s Senior Vice President, Communications and Public Policy.  Different, you see.  One actually works for Google and gets the stock options directly; the other is married to someone who works for Google and so gets the stock options by marriage.  Thank goodness for that, it almost sounded like a conflict of interest there for a minute.  Whew.  (MTP readers may also remember the discovery of the Creative Commons 2008 Form 990 Schedule B with an interesting list of donors–that’s the Schedule B that these groups never ever ever disclose.)

So Mr. Not Steve Hilton apparently sent an email “call to action” to certain “consumer groups,” an email that I’m sure neither Mr. Hilton nor his spouse nor anyone at No. 10 had any idea was going out:

…[T]o rally supporters behind a Parliamentary Early Day Motion in favour of the controversial Hargreaves Review into IP and Growth. Quickly dubbed the ‘Google Review’, the review was launched by Prime Minister David Cameron in November 2010 at Silicon Roundabout, where Cameron cited Google’s difficulties with UK copyright law as the cause for the review. The quote Cameron attributed to Google has never been found, and critics of the review have found the economic basis for the proposed changes to the UK’s copyright laws highly questionable.

Or as some might say complete…well, complete bunk.

So let’s see where this story goes.  Good thing the rest of the British press is focused on the influence of an American multinational corporation on the country’s legislative process and screaming it from the headlines.  You’d hate to think that only The Register was on the ball, right?

Remember, Mr. Bertram, Bobby put the cash in brown paper bags and he’s practically a saint.  If you want to play that game, which you clearly seem to want, that’s how you get it done, son.

Don’t be Deceptively Evil: Is YouTube a honeypot for data collection?

June 26, 2012 1 comment

Remember the righteous indignation of parents over the use of the deceptive cartoon figure “Joe Camel” in the Camel cigarettes advertising campaign to sell cigarettes largely targeted at kids?  The deceptive ad campaign that used a cuddly cartoon figure that was strangely infantile yet a cross between Hugh Hefner and James Bond in terms of the setting and context of the cartoon campaign?  A campaign that was designed to attract kids to smoking by deceiving them into thinking smoking was cool at a time when cigarettes were required by law to post ever more stringent health disclosures that essentially said “If you smoke, you will die.”  And that did not say, “but you can have a cute cartoon character do your dirty work.”  The government essentially said, “Don’t be deceptive.”

MTP readers will remember that we have beaten the drum about how Google will make “non-display uses” of data associated with music and videos that they use as part of YouTube, Google Music (or as it is called this week, the creepy “Google Play”–Play as in “playtime”) and any other legal or illegal music services.  This has never been more apparent than now, given the revelations about Google’s consolidation of its privacy policies–combined with a rumored home entertainment center which no doubt will try to integrate Google Music/Google Playtime with Google TV.  Do you want your mp3 player or big screen TV reporting your family’s listening and viewing habits to the Joe Camel of Privacy?  Or maybe just the last four digits of your child’s social security number?

The ever-transparent (or rather never-transparent) Google did not want to testify about its privacy policies before the House Energy & Commerce Committee but instead wanted a closed-door meeting with Members.  Google offered this up in record time after receiving a letter from a number of members of the Energy & Commerce subcommittee chaired by Congresswoman Mary Bono Mack, widow of Sonny Bono, himself a favorite whipping boy of Poker Prof Lawrence Lessig, whose organizations have received millions from Google.  Coincidentally, that letter was co-signed by Congresswoman Jackie Speier, who represents the San Mateo district near Google’s Mountain View offices, the district where Lessig once planned to run.

Of course, Google doesn’t want anyone, particularly not the government, to know what they are doing after the Google Drugs debacle, so it is not surprising that the meeting was unsatisfactory.  “‘At the end of the day, I don’t think their answers to us were very forthcoming necessarily in what this really means for the safety of our families and our children,’ Bono Mack told reporters after the closed-door briefing.”

Don’t be surprised to see a new tussle break out over whether Larry Page should testify about this, too.  And that will be interesting indeed.

However, in the meantime, everyone dealing with YouTube or Google Music (or I guess it’s called Google Play or is it Google Playtime?) should understand the real value of music to Google–as a honeypot for data collection that attracts a wide demographic group, starting with teens and kids.  This is why Google’s “Android” is the Joe Camel of privacy.  Now about that Google Playtime Home Entertainment Center….

(Connecting the dots:  This was the same Jackie Speier whom Lessig briefly planned to challenge in the Congressional primary after Tom Lantos died in office–a bid for public office backed by Beth Noveck, who was later the White House Deputy Chief Technology Officer for Open Government and who left the White House after former Google lobbyist Andrew McLaughlin was punished for having improper communications with Markham Erickson (Net Coalition lobbyist), Alan Davidson (Google lobbyist and former Center for Democracy and Technology), Ben Scott (Free Press) and others.  Jackie Speier was also an aide to Congressman Leo Ryan, who was murdered in the Jonestown massacre, and Speier herself was shot five times and had to wait nearly a day before she received medical attention.)

Send in the Clowns Part Deux: Nailing Down Privacy Issues on Google’s Government Apps

June 25, 2012 Comments off

MTP readers will remember the dynamic head of Security for Google Apps, Eran Raven a/k/a Eran Feigenbaum:

In October and November 2007, Raven was one of ten mentalist contestants on the primetime NBC series Phenomenon, which was hosted by Tim Vincent and judged by Criss Angel and Uri Geller.

Isn’t that just too awesome?  Criss Angel and Uri Geller.  Now there’s a twofer.

So what is interesting about this is that Mr. Raven-Feigenbaum’s role at Google appears to place him in direct contact with information flowing through U.S. Government agencies–through apps.  How could that be, you say?  How could Google get its hands on U.S. Government data?  Wouldn’t the government’s privacy policy be called–classified?  Requiring a security clearance, not a nailgun?

SafeGov.org has an interesting take on how Google’s new privacy policy opens up government data to…Google.

…[T]he more important question raised by [Google's] new privacy policy, in our view, was whether it is compatible with the growing adoption of Google Apps for Government (GAFG) by Federal, State and Local governments. As consenting adults, consumers arguably have the right to let corporations track their web activity and data mine their content in exchange for the privilege of using a valuable computer service at no monetary cost. But when a government agency contracts and pays for the same service, one wants to be certain that it is a safe and secure repository for government data. The idea that the cloud provider is still entitled to exploit user content and web behavior for advertising purposes – as the Google Privacy Policy explicitly allows – remains controversial.

SafeGov.org raised the issue of the privacy policy’s impact on government users in a statement issued on our web site. To its credit, Google immediately reacted by agreeing with us.  Google VP of Enterprise Amit Singh told The Washington Post and other publications that “enterprise customers” who use GAFG have individual contracts defining how Google could handle and store their data. These enterprise contracts, he insisted, “have always superseded Google’s Privacy Policy….”

Unfortunately, it now appears that Google’s assertion that its government contracts “supersede” the privacy policy may not entirely accord with the facts. We have recently discovered a certain number of published GAFG contracts not only contain no language stating that they “supersede” or in any way invalidate the privacy policy, but actually point directly to the policy on Google’s web site and explicitly incorporate it into their text.

Technewsworld highlighted the problem:

Google’s consumer privacy policies may make some users squirm, but those policies could be downright unacceptable if applied to government workers who use Google services thanks to the company’s contracts with public institutions.

So it looks like government work might end up passing through Google Apps subject to Google’s consumer privacy policy (which basically allows Google to slice and dice the data pretty much at will).  According to Computerworld, there are “…Google government contracts in [and apparently with the States of] Illinois, California and Texas that clearly appear to be governed by the general consumer privacy policy.”

Which sounds like security for any apps involved would be in the capable hands of Mr. Raven.

Thank goodness he has it all nailed down.

Proof of Life: Professor Danaher on the Success of Graduated Response

June 25, 2012 Comments off

Professor Brett Danaher, the Wellesley economist, gave an excellent presentation on France’s HADOPI graduated response program at the Global Forum that I was honored to moderate in Toronto during Canadian Music Week.  Professor Danaher’s independent econometric study shows that HADOPI has been quite effective in educating the French public, slowing piracy and increasing sales.  While there are many differences between the U.S. Copyright Alert System and HADOPI (which no doubt will be misquoted frequently in coming days), it seems clear that doing something is vastly better than doing nothing. 

As usual, the Global Forum presented some of the most thought-provoking information and discussion available at any conference and was not the usual “betcha it’s free/betcha it ain’t” mudslinging that passes for thought at so many of these things.  Music Canada and Canadian Music Week are to be commended for sustaining this dialog over the years.

Example of “New Boss” Contracts: The YouTube covenant not to sue

June 22, 2012 Comments off

In case you needed some proof, here is an excerpt from a “New Boss” contract, being Google’s indie publisher license for YouTube (Google’s heavily subsidized and dominant entry into the video and video advertising vertical).

To my knowledge, there has never been a comparable provision in an “Old Boss” music publishing license or recording agreement.  Clauses like this are the definition of onerous, overreaching misuse of bargaining position.  Why does Google require it?  Because they can.  And because they will most likely take the music anyway, requiring indie publishers to send them DMCA notice after notice after notice after notice after…

The clause seems designed to prevent indie publishers from joining the current YouTube class action–a class certification issue Google recently lost in the Google Books case.  In particular, it requires a waiver of any past infringements, prohibits the publisher from “joining in” to litigation (such as a class action or other group of joint plaintiffs), and might preclude a publisher from suing a “User” (including a YouTube “premium partner” like the shadowy Maker Studios, presumably) for any infringement if the user also posted the offending video on YouTube.  Boy, do they hate it when you organize–hence the covenant not to sue applying to “agents and representatives,” which echoes the losing argument that Google made in the Google Books case to try to force authors and photographers to sue Google work by work rather than be represented by the Authors Guild or the American Society of Media Photographers.

Quick: How many sync licenses have you ever seen with a covenant not to sue?  Waiver of injunctive relief, absolutely.  Arbitration clause, maybe.  Limitation on indemnity, definitely.  But a covenant not to sue?

Covenant Not to Sue. In consideration of Google’s entering into this Agreement, effective upon the Effective Date and through the end of the Term, Publisher covenants and agrees, for itself and its respective agents and representatives, solely with respect to the shares of Compositions owned or controlled by Publisher (and no other shares) not to bring, assert, pursue, maintain, join in or directly and/or indirectly support, assist, fund, lend resources to, or otherwise participate in any litigation, throughout the Territory, involving or asserting any claim based upon or alleging any form of copyright infringement arising from Google’s exploitation of the rights licensed by Publisher to Google herein through the operation of the Google Services, and in accordance with this Agreement, that Publisher has, had or may have against Google prior to the Effective Date or during the Term; provided, however, that such covenant not to sue shall apply to and be binding on a third party solely to the extent Publisher has the legal authority to grant the covenant not to sue herein on behalf of such third party without violation of any legal or contractual obligation to such party (such as, without limitation, an agreement with a songwriter, composer, rights collecting society, co-publisher or sub-publisher), and further provided, however, that this covenant not to sue does not extend to any legal claim arising from the public performance of any musical work on the YouTube Website or elsewhere. Subject to the following sentence, the foregoing covenant not to sue does not include any legal claim to be asserted against any party other than Google, including, without limitation, (i) any legal claim against any third party arising from the incorporation or use of a musical work in a Video or failure to pay royalties or other consideration due to any publisher or copyright owner for use of a musical work in a Video; or (ii) any legal claim against any third party (including a User) arising from the incorporation or use of a musical work in a Video (regardless of whether such Video is a User Video). Notwithstanding the preceding sentence, Publisher covenants not to sue any User who synchronizes any Publisher Composition in a Video uploaded to the YouTube Website, to the extent Publisher’s claim is based on the alleged infringement of rights granted by Publisher to Google herein solely on the YouTube Website.”

How the New Boss Treats Indie Audit Rights

June 21, 2012 Comments off

I happened to attend a panel recently about monetizing content on YouTube.  There were two panelists from YouTube and one from a digital distributor.  The distributor rep did a great job of describing how a large digital distributor dealt with monetizing video content on YouTube.

Given the back-of-the-envelope ratio I’ve developed from many conversations with rights holders, I currently estimate that “monetization” on YouTube means earning about $2,000 per million views.  So make of that what you will.
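To put that estimate in per-unit terms (this is just arithmetic on my own rough figure above, not a number from YouTube): $2,000 per 1,000,000 views works out to about $0.002 per view, or roughly $2.00 per thousand views.  A minimal sketch of the arithmetic, assuming only the estimate above:

    # Back-of-the-envelope illustration of the estimate in this post.
    # The $2,000 per 1,000,000 views figure is the author's rough estimate, not YouTube data.
    est_earnings = 2000.0      # USD earned (rough estimate)
    views = 1000000            # views assumed to generate those earnings
    per_view = est_earnings / views      # 0.002 USD per view
    per_thousand = per_view * 1000       # about 2.00 USD per 1,000 views
    print(per_view, per_thousand)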

This was an excellent opportunity to raise a burning question that has repeatedly come up about the YouTube indie publisher license–no one can find the audit clause.

Now an audit clause is a fundamental part of every license.  Once you get your statements, that’s just the first step.  The purpose of having the audit clause is so that the person trusting their records or songs to the company doing the exploiting has the opportunity to check if they’ve been paid correctly by looking at what’s behind the statements.  You know, trust but verify.

I listened to the whole presentation from the YouTube representatives; I heard one thing of interest, and I didn’t hear anything about an audit clause.

They kept referring to “monetizing.”  As in, you don’t have to monetize your “content”; you can claim it and say you don’t want to monetize it.

But what if you claim it, don’t want to monetize it, and want it removed–what then?  DMCA notices again and again and again.  So wouldn’t you want to monetize it?  (This is called the notice and shakedown.)

What that means is that you can leave it up there, not monetize it, and it will just continue to draw traffic to YouTube.  Which is pretty much the only game in town, even for Vevo, which depends on YouTube for the vast majority of its traffic.  This is because Google has subsidized YouTube since it bought the company in 2006 in order to extend its monopoly in search to videos.  That is pretty much done now.

So what about that audit clause?  Since it didn’t come up, I asked a question about it.

“I looked, and looked, and looked and looked at your indie deal for an audit clause and I didn’t find it.  Is that because it’s not there, or because I missed it?”

Nope, it’s not there.  When asked to explain why every other rights holder in the room had audit clauses in their deals but YouTube did not, I was told it was because it was too much trouble for YouTube to be audited by a bunch of little companies.

Now that’s an interesting way to say it.  Not that it was too much trouble for YouTube to be audited, but that it was too much trouble to be audited by indies.  Now where have we heard that kind of thing before?

But it’s all OK, because all they would do is give you your statements, which you could get anyway, and they won’t let you look behind them the way it has been the tradition in the music business for 100 years.

In other words, trust me.  And the main reason you should trust me is that I’ll take your music whether you trust me or not.

Cool.  Meet the new boss.  Worse than the old boss.

No major label or publisher would have the balls to look you in the eye and say, “I don’t care about your rights because you’re too small.”  You know who says that kind of thing?

A monopolist who knows nothing about the business they are trying to screw over.

See also: Example of New Boss Contracts: The YouTube Covenant Not to Sue
