Sunday, February 17, 2013

Don Quixote



Re-read Don Quixote as you follow the debate about revising Europe's privacy laws. Is it more noble to pursue the glory of fantasy over the indignities of the real world? Do we want to defend an obsolete chivalric code, while the rest of the world looks on with derision?  Do we want a strong privacy law that can be operationalized or a glorious piece of literature? 
 
American companies are starting to freak and shriek about Europe's upcoming new privacy laws. In turn, various European politicians are publicly posturing about how all this is required to rein in American companies, while feigning resentment that American companies are lobbying for their interests in Brussels. In reality, of course, the proposed new EU laws are full of flaws, in particular imposing lots of pricey new compliance-bureaucracy obligations and threatening minor compliance violations with absurdly high fines of up to 2% of a company's global turnover. But let's not let reality sully this tale. Don Quixote is defending privacy against the American-mega-corporate-privacy-slayers. Don Quixote is defending the Right to be Forgotten.
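To get a feel for the scale of a 2%-of-global-turnover fine, here is a quick back-of-the-envelope calculation. The turnover figure is invented for illustration; only the 2% rate comes from the draft rules discussed above:

```python
# Hypothetical illustration of a fine capped at 2% of global turnover.
# The turnover figure is invented; real figures vary by company.
global_turnover = 50_000_000_000  # e.g. a large multinational with $50B annual turnover
fine_rate = 0.02                  # the 2% cap in the proposed EU rules

max_fine = global_turnover * fine_rate
print(f"Maximum fine: ${max_fine:,.0f}")  # Maximum fine: $1,000,000,000
```

A single compliance violation, in other words, could in principle expose a large company to a fine on the order of a billion dollars, which is why the figure draws so much attention.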
 
Sadly, things don't end well for the noble knight. The unsettling, unsaid truth is that American companies will come out big winners compared to their European rivals. European companies face decades of innovation-paralysis under the new rules, while American companies will simply reorganize and relocate certain operations out of Europe to mitigate risk.
 
Like many people in the privacy profession, I had always thought, throughout my career, that it was sensible to apply Europe's privacy laws worldwide, in the interest of maintaining one consistent worldwide standard. I'm changing my mind now. As the proposals to revise Europe's privacy laws become wackier by the day, I am starting to believe that the "world" will have to watch Europe do its own thing in its own backyard, while maintaining a different, faster, more innovative pace in the rest of the world. Granted, Europe is a market that is too big to ignore, but that's no reason why special compliance rules for it should be exported globally. No one applies Chinese censorship rules outside of China, so this would hardly be the first time that companies apply special rules in one particular country or region.
 
Europe's proposed rules will end up costing a lot, if you care about innovation in Europe. I'm a technophile, in the sense of believing that fast innovation is the only hope of maintaining high rich-world living standards for our aging Western societies in the future. But I am troubled by how many roadblocks are being put in place to drag down the speed of innovation. Don't get me wrong: I'm all for serious privacy ethics, for privacy sensitivity, for privacy by design. But I'm not a fan of privacy-bureaucracy-drag. Europe, as one would expect, developed the world's most extreme form of bureaucracy-drag when it invented the notion of bureaucratic "prior approval" for new technologies: a new technology cannot be launched until a bureaucracy has approved it. Or prior approvals for international data transfers (how absurd, in the age of the Internet!). Or prior approvals for binding corporate rules, and a thousand other bureaucratic hills and hurdles. Reality, again, is often a rather dispiriting affair.
 
Despite all its good intentions, Europe is also giving the world hopelessly vague privacy laws, sometimes enforced with criminal penalties. For example, what does it mean to impose jail time on someone for "processing sensitive personal data without the data subject's consent"? Does that justify jail time for posting a photo to a social networking site, given that a photo will reveal a person's race and sometimes health conditions (all "sensitive" categories)? I have faced personal criminal prosecution on flimsier privacy-law grounds than that, so these are hardly hypothetical risks. In short, Europe is making it increasingly risky to pursue Big Data innovation in Europe.
 
The cynical realists will see that Europe's innovation-inhibiting privacy laws will simply drive more Big Data and Internet innovation outside of Europe. Will we see companies choose to move their research arms elsewhere, for example to the US, India or Singapore? Ask yourself whether US or European companies will turn out to be more hobbled by Europe's rules. The answer is obvious: European companies will have to swallow these new rules entirely, while non-European companies can simply ring-fence their slower, less innovative operations in Europe. Companies may end up offering a series of slower, less cutting-edge services in Europe, given the significant risk that cutting-edge data services could be smacked with massive fines.
 
I say all this with sadness, as the world moves on. Who am I to deride Don Quixote's dream? Who am I to celebrate the demise of his delusions?

Monday, February 11, 2013

Talking Privacy to the Guys in the Pool


I'm in Florida for a few days, joining the Privacy Law Salon, a chance to talk about privacy with a lot of experts in the field. But I usually think it's more fun to talk about privacy with the guys in the pool. Ft Lauderdale is the home of the International Swimming Hall of Fame, so it's a change of scene from my usual Paris pool. We don't hang on the walls long, so conversations are short.
   
Privacy is more important than security? Not true. Without security, you drown. You're either being hacked and know it or being hacked and don't know it. Imagine drowning without even realizing it. All of privacy is a wobbly edifice built on the foundations of security. If the foundations aren't solid, then the edifice will crumble.
 
Privacy is contextual: We live in Speedos, but can't wear one to the office. Online screws up context, because it takes data from one context and re-uses it in another. People peek, machines record. You can't attribute human motives to a machine, or teach it that it's rude to stare.
 
Privacy is about losing it: We never give a thought to privacy, until it's gone.  Like breathing, you don't think about it, but in a lungbuster set, breathing on stroke 3, 5, 7, by 9 you will explode if you don't breathe.
 
Privacy requires discipline: 6 am, get up, go to pool. People expect anyone who holds their data to have fault-proof privacy, in particular iron-tight security, no excuses, no days off. But in reality, nothing is perfect and people are only human. Like a cramp in the middle of your swim. You younger start-up guys are faster, but you're half my age. Sure, you can swim a 50 free faster, but can you sustain it for a lifetime?
 
Privacy requires transparency: Coach sees your stroke. Privacy should be as transparent as possible. But privacy processing on the modern Internet has become so complicated, technically and in terms of scale, that human brains can scarcely comprehend it anymore. How can I grasp machine learning algorithms, when I can barely count laps? And you're supposed to explain every aspect of online processing to the average user, like explaining a flip turn in words to a non-swimmer?

Privacy is not a team sport: Even if you swim on a team, you still swim alone. Privacy is a social construct about one individual identifiable human being. Nothing in the Age of Big Data is going to change the fact that privacy is about the individual. And conversely, if it's not about an individual, then it's not about privacy. The team doesn't have privacy; it's about each of us individually, just like a team medley is really four individual swims in a row.
 
There's no place called privacy. There's no destination in swimming either, you just go round and round until your mind or body gives up. Most of my work in the field of privacy and technology is like a sandcastle on the beach, washed into irrelevance by the next tide of technology. And yet, I never doubt its importance.

The zone is furtive. A lifetime of work and setbacks, 10K per day, and then for a fleeting moment in the pre-dawn darkness, my mind goes blank and everything disappears except the sensation of an ecstatic wave chasing a vision of the perfect fly.

Monday, February 4, 2013

Why is Bing calling me a "Google Criminal"?



It's always a good idea, from time to time, to search on your own name.  When I searched on my own name, here's what Bing suggested:

Search engines like Bing offer auto-complete and related-search suggestions.  These help people find what they're looking for faster.  Auto-completion is determined algorithmically, largely based on the search queries that the largest number of searchers have typed in the past.  If you start to search on the term "New York City", auto-complete may suggest "New York City weather" or "New York City subway".  Related-search suggestions show the query terms most likely to return content relevant to the original query.
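The frequency-based ranking described above can be sketched in a few lines. This is a toy version under invented assumptions: the query log and its counts are made up, and real engines aggregate billions of queries with far more sophisticated signals than a simple prefix match:

```python
from collections import Counter

# Toy query log with invented frequencies; a real engine aggregates
# billions of past searches and many additional ranking signals.
query_log = Counter({
    "new york city weather": 900,
    "new york city subway": 750,
    "new york city hotels": 400,
    "new orleans": 300,
})

def autocomplete(prefix, log, k=3):
    """Suggest the k most-searched past queries that start with the prefix."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix.lower())]
    matches.sort(key=lambda qn: -qn[1])  # most popular queries first
    return [q for q, _ in matches[:k]]

print(autocomplete("new york city", query_log))
# ['new york city weather', 'new york city subway', 'new york city hotels']
```

The point of the sketch is that the suggestions fall out of what other people searched for, not from any editorial judgment about the person or topic being searched.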

The algorithmic principles are the same for searches on individual names.  Use a search engine to start typing in your own name, or any name, and you'll often see auto-complete suggestions that can border on the offensive.  It's therefore a common reaction for some people to say:  I demand that the search engine block this term from searches on my name.  

Take my personal example.  I know that lots of people and sites have reported on my criminal conviction in an Italian court, handed down for my work on behalf of Google, for which I was later acquitted on appeal.  Of course, I recognize that search engines are not really calling me a "criminal".  They are not exercising editorial control over the association.  They are using algorithms to associate my name with what other people have searched for in the past, or with the related search query likely to generate the greatest number of relevant search results.  The underlying content may just as well be saying:  his criminal conviction was overturned on appeal.  So, I haven't asked Bing to block the word "criminal" from searches on my name.  I don't believe that they should, or should have to, and I'm sure Bing would refuse even if I asked.

Over and over again, especially in Europe, I see "privacy" being used as a justification to censor free speech.  The poorly-defined "right to be forgotten" is a much-discussed example.  I don't understand how we could protect notions of freedom of speech, and the neutrality of search engines, if people could decide themselves which terms they did not want associated with their names.  Practically, who would decide which terms were acceptable and which are not?  I think it's very dangerous to try to use search engines to censor search suggestions from reflecting content on the web, or to manipulate the algorithms to prevent them from objectively reflecting what users search for. 

There are a lot of people who don't want to see search engines make common suggestions after their names with terms like "Jew" or "gay" or..."criminal".  In a nutshell, that's the question:  Should some sensitive words simply be filtered from such results, or is that a step too far down the slippery slope of censorship?  

Friday, February 1, 2013

MSFT goes forum shopping to...Luxembourg



Microsoft has very large operations all over Europe, in particular in Dublin, London and Paris.  So, it came as a bit of a surprise to me to learn that Microsoft has forum-shopped its way to Luxembourg as the governing law and lead regulator for the roll-out of its new privacy policy, as reported by Bloomberg Businessweek.  Indeed, as a lawyer, I tried to decipher Microsoft's Services Agreement:  "13.3. Europe. If you live in (or, if you are a business, you are headquartered in) Europe, you are contracting with Microsoft Luxembourg S.à.r.l., 20 Rue Eugene Ruppert, Immeuble Laccolith, 1st Floor, L-2543 Luxembourg and the laws of Luxembourg govern the interpretation of this agreement and apply to claims for breach of it, regardless of conflict of laws principles,..."

This runs contrary to the entire ethical premise of a "main establishment" in Europe, built on the idea that the laws/regulators of that European Member State should govern companies where they have their main establishment.  That's why Facebook is operated in Europe under Irish laws and why the Irish regulator is leading the European privacy reviews into it.  Facebook clearly has established its main establishment in Ireland, in terms of governance, headquarters, employees, etc, in other words, in the real world, rather than just a legal mailbox fiction.  

So, could Luxembourg possibly be the "main establishment" in Europe for Microsoft?  Of course not.  Microsoft has forum shopped a tiny European country, for whatever legal, tax, or regulatory advantages it thought it could gain from "locating" there, while of course locating hardly anything there at all.

I have long supported the need to create the concept of "lead regulators" and "main establishment", in order to bring more efficiency and predictability to privacy in Europe.  But my advocacy has always been based on the belief that the selection of "main establishment" should be based on objective criteria, like having a large workforce and real-world activities located there. 

A shrewd company like Microsoft goes forum shopping and claims that its dealings with nearly half a billion people in Europe are governed by the laws and regulators of the tiny Grand Duchy of Luxembourg.  Blimey.