Who Polices Virtual Worlds?
Virtual worlds represent different things to different users. Major corporations set up shop to do business in online realms, while right around the corner, other users are acting out the deepest, darkest, weirdest desires of the human id. In worlds where identity can be pure fantasy but money and intellectual property are real, who lays down the law?
07/31/08 4:00 AM PT
Law and order is one of the cornerstones of a civilized society. Establishing rules of conduct, spelling out acceptable and objectionable behavior, defining the consequences for anyone who violates those laws and deciding who will enforce them are all essential to maintaining peace and harmony.
In the real world, there exist systems of laws -- civil and criminal -- that govern people's behavior. However, in the digital world, who lays down the law? As virtual worlds such as Second Life and Eve Online continue to grow, attracting more and more residents -- and often for vastly different reasons -- who decides what goes and what doesn't? What constitutes a crime in these worlds? Do real-world laws apply, or is it up to the creators of these fantasy lands to police their own environments?
A Crime by Any Other Name
Whether it's a massively multiplayer online role-playing game (MMORPG) like "World of Warcraft" or a virtual world like Second Life, each digital environment has a fundamental set of rules and regulations users agree to obey when they sign up. However, the nature of those rules depends greatly on the type of environment one is talking about.
In virtual worlds, so-called griefers cause mischief either individually or as members of a gang of avatars, all for the joy of wreaking havoc. Others have attempted to use online realms as venues for activities like illegal online gambling. And theft of intellectual property in a virtual world can result in real-world lawsuits, as in the case of the code writers who sued a group of 11 people for allegedly ripping off their designs for unique Second Life props.
But what may be anathema in Second Life is part of the gameplay in other worlds such as "Vendetta," said Eric Goldman, an assistant professor at Santa Clara University School of Law and director of the High Tech Law Institute.
"Stealing cash, stealing economic value, is a crime. It's a crime in every system I know of. Taking property that's not yours, that's a crime, but when we're dealing with virtual property, that question might get a little more sophisticated. How do we know what's somebody's and what's not somebody else's?" he queried.
"Particularly if there is a set of algorithms or rules in the game that everything is free for the taking -- that might is right. So we have to be very careful about terms like 'stealing' because stealing might be clearly illegal, clearly criminal, and that's nothing new. We've done that for millennia. Or it might be part of the gameplay, at which point it is not illegal, it's encouraged as part of the overall interrelationships between the residents of the environment," Goldman told LinuxInsider.
Governance and control are handled in different ways in different virtual worlds, Andrew Wall, research director for Security, Risk & Privacy at Gartner, told LinuxInsider.
"There is no one, single approach taken by the providers of the virtual environments. Some virtual worlds, such as Second Life, were designed to include a minimum amount of governance. Residents of these worlds were expected to establish their community behavioral norms. Other virtual worlds include strong governance and control. In these worlds, such as Habbo Hotel, the provider of the environment sets and enforces clear rules for behavior," he pointed out.
Rules of the Game
In some regards, a virtual world's operators don't even need to establish certain rules of conduct because they effectively control reality. In some virtual worlds, it's simply impossible for one avatar to kill another, for example.
"These are things that people can or cannot do to each other based on the way the virtual world operator has designed their environment. Violating these rules is simply impossible because the gameplay doesn't allow it to occur. The code doesn't function that way," Goldman said.
In other cases, the provider may punish rule breakers by fining their virtual bank accounts.
Lastly, there are social norms. Here, no formal punishment is imposed; instead, the virtual society may simply reject and shun those who behave badly.
"Who governs virtual world environments? They are governed by [criminal laws], governed by torts, governed by the virtual world providers' rules and they are governed by social norms," Goldman stated.
Together, these elements are responsible for maintaining law and order.
"It's a joint responsibility. Everyone provides a piece of the puzzle. We have our criminal system; that provides a piece of the puzzle. Our judicial system that enforces torts; that provides a piece of the puzzle. We have our virtual world provider, and then the community itself does its own enforcement, and that is a piece of the puzzle. It's a shared responsibility, and it would be a mistake to focus on only one aspect," Goldman explained.
"Congress has said, for the most part, [virtual world providers] are not responsible for policing their own environments. It's totally up to them to decide how little or how much they want to do. They might decide they want to make their world a police state and will regulate users' interactions with each other very tightly. 47 USC 230 says they can do that and they are not liable for the way they execute that philosophy," Goldman said.
"Similarly, they can say, 'We're going to have a lawless environment,' and 47 USC 230 says if they want to do that they are still not responsible when people engage in bad behavior toward one another," he added.
The Sheriff in These Parts
In Second Life, Linden Lab has established a system to monitor the 60,000 to 70,000 "active" residents online at a given time and maintain order among its millions of subscribers.
"There are numerous gatekeepers in the virtual world, safeguarding against numerous types of inappropriate behavior," Peter Gray, a Linden Lab spokesperson, told LinuxInsider. "In Second Life, for instance, gatekeepers include our payment and credit card partners, and our fraud and risk teams, who guard against credit card fraud, money laundering and phishing. They also include our own governance team -- the 'G-team' -- which investigates abuse reports and guards against inappropriate content, against teens on the adult grid, etc. And in many cases, they include our own users, who protect the space and 'land' they inhabit," he explained.
Second Life's governance team enforces rules, both proactively and based on abuse reports submitted by the site's residents.
Because Second Life residents are often personally and financially invested in the platform, there are inherent incentives for following the Community Standards, the guidelines for appropriate behavior in-world, Gray noted.
"Likewise, our enforcement efforts deter inappropriate conduct," he added.
In general, disagreements between residents typically involve offensive language or content and breaches of Community Standards. When a resident has violated the rules in Second Life, the vast majority of the time a simple one-hour ban and an e-mail warning are enough, Gray stated.
"As with any online community, there are occasionally those who seek to disrupt others' enjoyment, but we're able to effectively deal with any repeat disruptors by increasing the severity of the ban up to and including permanently terminating accounts," he continued.
Recently, however, the site has struggled with its so-called ageplaying community. Ageplayers are adult users whose avatars bear the likenesses of children. Second Life has had to specifically ban portrayals of sexual contact between an adult and a child, as well as other portrayals of "broadly offensive content." Some ageplayers counter that sexual play is not a part of their online roleplaying and that they have a legitimate right to portray themselves as children in the virtual world.
"Virtual worlds that incorporate content that is inappropriate for minors generally require users to declare that they are old enough to participate. Most worlds that allow 'adult' content provide clear warnings concerning the content," Gartner's Wall said.
"Avatars are not people; they cannot be assaulted, sexually or otherwise. The one or two reports of sexual assault on avatars cannot be regarded as actual cases of assault any more than a description of an assault in a book or enacted in a play can be construed as being an actual assault," he added.
In addition to its other safeguards, Second Life also offers tools that help its residents 'protect' themselves in-world. For example, landowners can control specifically who is allowed to access their space and can choose to ban individuals that they find disruptive from their land. Additionally, there is an Abuse Report function built into the Second Life viewer so that residents can easily flag inappropriate content or behavior.
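The landowner tools described above amount to a simple access-control list. The sketch below is hypothetical -- the class names and behavior are invented for illustration and do not reflect Second Life's actual implementation:

```python
class Parcel:
    """A toy model of a landowner's access controls on a plot of land."""
    def __init__(self, owner):
        self.owner = owner
        self.allowed = set()   # empty allow-list = open to everyone
        self.banned = set()

    def ban(self, avatar_name):
        """Landowner bars a disruptive avatar from the parcel."""
        self.banned.add(avatar_name)

    def may_enter(self, avatar_name):
        # A ban always wins; otherwise a non-empty allow-list
        # restricts entry to the names it contains.
        if avatar_name in self.banned:
            return False
        if self.allowed and avatar_name not in self.allowed:
            return False
        return True

land = Parcel(owner="Dana")
land.ban("Griefer42")
print(land.may_enter("Visitor"))    # True: open parcel
print(land.may_enter("Griefer42"))  # False: on the ban list
```

The design choice worth noting is that enforcement here is delegated to residents themselves: the operator supplies the mechanism, and each landowner sets the policy for their own space.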
For enterprises, Linden Lab's recent release of Second Life Grid, a version of the software that can be installed and run on internal IT environments, enables organizations to construct their own worlds with whatever level of security or governance they wish, Gartner's Wall noted.
"The public version of Second Life will continue to grow and evolve. It is not clear that virtual worlds such as SL require more security than is currently provided to serve the current use cases. If virtual worlds target commercial clients for remote presence meetings and so forth, security improvements will be necessary to support business requirements for confidentiality and accountability," he said.