Today the French data protection regulator, CNIL, reaffirmed its position that Google must apply European “Right to Be Forgotten” (RTBF) law globally by removing content from its services in all countries. Europe’s RTBF laws are rooted in citizens' rights to data protection and privacy. They are inconsistent with U.S. and other countries’ free expression laws because they require suppression of information even if that information is true and causes no harm. In the U.S., the First Amendment would not allow a court to force a search engine to delist this kind of data. (Europe has its own version of this issue: European law protects speech that must be removed under Russian law, probably including Russia’s version of RTBF. It’s not clear how CNIL or France would react if Russia applied that law to restrict content for French Internet users.)
As many commentators have pointed out, CNIL’s position creates a conflict between the rights of European citizens to privacy and data protection, and the rights of people in other countries to access information. Perhaps ironically, European law typically does a much better job than U.S. law at defining and protecting “access to information” as a right distinct from the right to free expression. France is one of several countries where citizens have asserted a legal right to prevent Internet intermediaries from voluntarily removing content. In a case working its way through the courts in Paris, a French schoolteacher is suing Facebook for removing his post of a Gustave Courbet painting under its nudity policy. His claim would never survive in an American court: U.S. law could not require Facebook to host nude content posted by users if it didn’t want to. The U.S. approach suggests a greater fear of government censorship, while the EU approach suggests fear and distrust of discretionary private decision-making, perhaps especially on the part of U.S.-based companies.
How to reconcile France’s Facebook Courbet case and its Google CNIL case? Are intermediaries subject to compulsory inclusion of content, as well as compulsory removal? In fairness, one case is much further along than the other: we don’t yet know whether the court will agree that the Facebook plaintiff has a cause of action, though we know it has taken jurisdiction. But the two cases reflect profoundly different approaches to speech on the Internet, and to the power of one country’s law to regulate beyond its own borders. If the French court did decide that Facebook must reinstate the plaintiff’s post, would it order reinstatement globally? Or does French privacy/data protection law have different global reach from French free expression/access to information law?
Front and center in both of these cases is the question of who decides what information the public can see, and what parts of the world must abide by that decision. This is a hard issue. Lawmakers, courts, and companies will all make mistakes. Those mistakes will be far more consequential, and we will all bear them, if regulators succeed in compelling Internet companies to remove controversial content globally.
Date published: September 21, 2015