French Ruling Puts Google Between a Rock and an Orgy

By Kris Holt TechNewsWorld ECT News Network
Nov 7, 2013 1:18 PM PT

A French court has ruled that Google must automatically block links to nine images of Max Mosley participating in an orgy, according to press reports. Mosley is the former president of the International Automobile Federation.

The company reportedly must find a way to prevent all links to the images from appearing in its image search results for a period of five years. The order takes effect two months after the ruling, presumably to allow Google time to build a filter tool.

The company will be fined 1,000 euros (US$1,344) every time one of the images is found through its search engine, starting next year, the court reportedly ruled.

Google previously argued such a ruling would be tantamount to "automated censorship" of the Web.

Privacy vs. Free Speech

In 2008, the British tabloid News of the World published a video and story relating to a "sick Nazi orgy" in which Mosley participated. He admitted taking part in sadomasochistic activity with five women and paying them, but brushed off accusations of there being a Nazi theme and claimed the video breached his privacy.

Later that year, a UK court ruled the News of the World had breached his privacy and said there was no public interest in printing the story. He was awarded US$94,000 in damages.

In 2011, Mosley won a similar ruling against the now-defunct newspaper's publisher, News Corp., in France.

For companies that operate in multiple countries, having to deal with privacy and free speech laws that vary in different regions "makes day-to-day operations quite complicated and difficult," Anupam Chander, professor and scholar in the law of globalization and digitization at UC Davis School of Law, told TechNewsWorld.

"Anyone who has the means, as Mosley does, will resort to court orders," he added, in order to have perhaps unflattering content about themselves removed from Google results.

The ruling would force Google to create a software filter to automatically detect and block the images, the company said. Because it cannot stop others from reposting them on the Web, it would essentially be compelled to act as gatekeeper, tracking where people were posting the images and preventing access to them through Google's services.

Not Our Job

In September, Google said it had removed hundreds of links on Mosley's behalf, following its standard process of scrubbing links to certain pages after the content has been deemed to violate the law.

The French ruling forces Google to take a more proactive approach in purging the Mosley images.

Google is merely a platform for helping people find content, and it should not be responsible for policing links, it has argued.

Banning the images from appearing in its search results would not stop people from accessing them through other means, such as on social networks or other search engines, Google has pointed out.

It reportedly plans to appeal the French court's ruling, which also ordered payment to Mosley of a token 1 euro ($1.34) in damages and 5,000 euros ($6,717) in costs.

"Google is obliged to follow the law in countries where they have boots on the ground, where they have servers or where they have employees," Eva Galperin, a global policy analyst with the Electronic Frontier Foundation, told TechNewsWorld.

"Certainly, Google does have boots on the ground in France, and so they are obliged to follow French law," she continued. However, "the thing they are being asked to do is so technically difficult and potentially expensive that there's no doubt Google will appeal this decision."
