Google has started to remove search results in certain cases in Europe, in compliance with the European Union’s new “right to be forgotten” rules.
The EU's top court last month ruled that the company must allow individuals to request the removal of links to news articles, court judgments, and other documents that might turn up in results when searches are conducted on their names.
The ruling stipulates that Google needs to weigh a person’s right to privacy against the public interest in having specific information about that person available. National authorities can compel Google to comply if they determine that there is not sufficient public interest to justify the appearance of a link.
“The key for Google is, always, how to automate procedures,” Derek E. Bambauer, associate professor of law at the University of Arizona James E. Rogers College of Law, told the E-Commerce Times.
“The firm has extensive experience to draw upon in dealing with removals. The copyright takedown procedures under U.S. copyright law are similar in nature. Google has figured out how to automate that procedure, and I expect it will do similarly with the privacy link removal requests,” he said.
‘No Clear Rule’
“The harder question is when Google complies with removal requests,” Bambauer added. “The European Court’s decision did not provide a clear rule for when the company should delete links, and Google will have to make some difficult decisions. The easiest thing, of course, would be to remove every link that’s requested for deletion, but that would lead to massive overdeletion, and would probably make Google less useful as a search engine.”
Company engineers modified the search engine overnight to start making the removals, according to The Wall Street Journal, and Google started sending emails to users to let them know their link removal requests had been carried out. It is not clear how many requests have been fulfilled so far, but the number reportedly is small.
Google had received more than 41,000 removal requests as of a month ago, through a Web form it set up after the EU’s ruling.
‘Inadequate or Irrelevant’
Google stipulates on its website that for results to be removed, they need to be “inadequate, irrelevant, no longer relevant, or excessive.”
The company assesses each request individually, and so it will take considerable time — and, by extension, expense — to wade through them all.
“Google is doing this reluctantly, so they’ll do it as cheaply as possible to contain the cost,” Rob Enderle, principal at the Enderle Group, told the E-Commerce Times.
“They’ll find it annoying, but the cost shouldn’t be material until the EU decides they aren’t doing enough and hits them with a fine, which is a common series of events in cases like this,” he noted.
“The firm typically does the minimum — based on an interpretation of the ruling with massive pro-firm bias — and finds that the court doesn’t have a sense of humor for this behavior,” said Enderle. “Then it gets expensive.”
Google examines whether results appearing in a name search include outdated information related to a person’s private life. It also determines if there’s a public interest in keeping information in search results — for instance, information about malpractice cases, criminal convictions, financial scams — or, in the case of government officials, public conduct.
Users can contact local regulators if they disagree with Google’s decision — the company concedes that it might not be in the best position to make a final call.
Abuse of System
A key consideration in handling each request individually is the potential for abuse of the system.
“People that jumped at having their information scrubbed were folks that you’d want to know more about — pedophiles, criminals etc. — though this could be artificial, because clearly it would be in Google’s best interest for that to be the case,” Enderle noted.
“Without an independent enforcer, you are much more likely to end up with dramatic theater than with a fix for the problem,” he said.
Those requesting that content about them be removed from search results need to include a photo ID and the URL of each link they’d like removed. They must explain why a URL is irrelevant, outdated or otherwise inappropriate.
Free Speech vs. Privacy
The EU’s decision drew protests from Google and free speech activists, who claimed it would lead to a rise in censorship and force Google to determine which information about an individual should be considered pertinent to the public interest, the WSJ noted.
However, some privacy advocates said those arguments were excessive, since search results would be removed only from searches on an individual’s name and not the entirety of Google’s search results.
“I think that if one takes privacy advocates at their word, this process will help — undesirable information becomes less visible. It’s still available from the original source but harder to find. Information costs matter tremendously on the Internet,” Arizona’s Bambauer said.
“The vagueness of the court’s decision leaves Google to do a lot of work to sort out what is relevant, irrelevant or no longer accurate,” he continued. “That’s an uncomfortable position for any company to assume. Google has generally proceeded thoughtfully with this type of decision, but it’s strange to have a search engine overtly making these decisions.”
Google’s decision to act quickly in implementing a removal request form may partly be a move to appease European authorities, since it is tussling with governments and regulators in multiple countries over several issues, ranging from tax considerations to antitrust probes. Regulators from the EU’s member countries agreed to establish a policy for interpreting the ruling in collaboration with Google.
However, it appears the company might not accommodate the regulators’ request that no reference be made to information having been withheld from search results. Google often tells users when links to pirated content have been removed, and the company indicated it may treat information removed under EU requirements in a similar way.
The regulators, on the other hand, intimated that appending such disclaimers would undercut the principles of the ruling by making it known an individual wanted some personal information hidden. That potentially could prompt those searching for information about an individual to dig deeper for the suppressed content, perhaps out of curiosity.
Nevertheless, Google has added a blanket notification that appears at the bottom of name-search results on European versions of its search engine, the WSJ reported. The notification reads, “Some results may have been removed under data protection law in Europe.”
One recent high-profile case related to the removal of search results was brought by Max Mosley, the former president of the International Automobile Federation. A French court ruled in November that Google had to automatically block links to nine images of Mosley participating in an orgy. Google had argued that creation of a tool to automatically block links to the images would inherently cause “automated censorship” of the Web.
While such regulation may be enacted in Europe, it’s unlikely that Google ever will have to adopt similar practices in the U.S.
“The European court’s ruling is incompatible with the right to free expression and wouldn’t be possible on American soil because of the First Amendment’s protections,” said Lee Rowland, staff attorney with the American Civil Liberties Union Project on Speech, Privacy, and Technology.
“While the Internet creates new challenges for protecting privacy, we must face those challenges by relying on our constitutional values — like the right to report on public events — not overturning them. Requiring websites to eliminate or hide access to already public information is a troubling precedent that harms the freedom of expression without producing meaningful gains for the right to privacy,” she told the E-Commerce Times.
“There’s little chance that we will see a right to be forgotten in the U.S.,” Arizona’s Bambauer said.
“Search engine results are plainly speech — any attempts to force Google to de-list accurate content would not survive First Amendment scrutiny. In part, this derives from a different set of values: In the U.S., we prize more — and more accurate — information, rather than allowing people to selectively remove parts of the past,” he pointed out. “The EU reverses that calculus. Both approaches are reasonable, but they lead to different legal rules about informational privacy.”