A patrol for the Web’s playgrounds

Published 4:00 am Monday, February 28, 2011

The dinosaurs didn’t know it, but their world might have narrowly averted upheaval this month.

For two years, all the denizens of Webosaurs, an online virtual world for children 5 to 12, could customize their dinosaur avatars with leather armor and other whimsical outfits.

Recently, though, the Webosaurs founder, Jacques Panis, decided that leather armor should be available only to premium members, who pay about $6 a month. Players with free membership would be denied that attire.

Then the Metaverse Mod Squad stepped in. The company employs moderators around the country who monitor the Webosaurs site to keep its users safe and happy.

In this instance, it told Webosaurs that if the change were made, the free users might abandon the Webosaurs world or turn on one another. In the end, the dinosaurs kept their armor, and Webosaurs avoided the possibility of alienating some of its 1.5 million registered users.

“I’m running a business, but Metaverse Mod Squad, as the moderators and community managers, is the voice of the kids,” Panis says.

Since starting Metaverse in 2007, Amy Pritchard, its chief executive, has emerged as an industry expert in creating safe, engaging online communities for both children and grown-ups.

Handling kids with care

Metaverse has a client list that includes the Cartoon Network, the National Football League, Nickelodeon and the State Department. It employs an army of workers — often stay-at-home moms — to monitor and moderate websites where children create their own characters, or avatars, and can interact with thousands of other users. Metaverse’s employees frequently create their own avatars to help maintain the peace.

Pritchard says the stakes are higher in online worlds intended for children, like Webosaurs. On more adult-oriented sites like Second Life, users must be at least 16 and are presumably better equipped to deal with the threats of online interaction.

She has found that keeping children safe has a lot to do with keeping them entertained. “If you just release kids into these online playgrounds with no one to monitor them and no rules, it’s ‘Lord of the Flies,’” she says. “But if you can balance safety with fun and engage the kids, I guarantee you’ll have a site with a great group of kids and no cyberbullying.”

She stumbled onto a business idea while exploring the virtual world of Second Life with her husband, Ron, who had taken a job at Linden Lab, Second Life’s creator. Pritchard was taken with the breathtaking landscapes, elaborate buildings and whimsical avatars — from long-legged blond bombshells to blue giraffes — that users created for themselves. But she says she noticed that few users visited some of the elaborate environments created by major corporations because the companies offered nothing to do there.

“Companies had no idea how to create relationships in 3-D,” she says.

Pritchard, however, knew exactly how to make friends online. As a side job, she had moderated message boards for the WB television network and had struck up close friendships with several other moderators.

After introducing them to Second Life, she persuaded five of her moderator friends to create avatars and join her regularly at a Second Life virtual sports bar called the Thirsty Tiger.

There, Pritchard struck up a friendship with the bar’s creator, Mike Pinkerton, a real-life lawyer in New Orleans. One night in July 2007, she ran this idea past him: What about a virtual company, providing remote moderators to staff Second Life sites for corporations, and to moderate Web forums? Pinkerton signed on as chief operating officer of the fledgling business.

Pritchard soon had a chance to test the method against a far more demanding audience: children. While Second Life thrives on a free-wheeling, anything-goes culture, a different breed of virtual world began to proliferate soon after Pritchard started her company.

The sudden growth of Club Penguin, acquired by the Walt Disney Co. in late 2007, spawned a galaxy of virtual worlds for children. Suddenly Barbie, Build-a-Bear, Webkinz and countless other toys, games and entertainment properties had their own mini-universes, where children could create avatars, play with one another, care for virtual pets and furnish virtual dream homes. The rise of social media, meanwhile, produced another explosion in social games like Zynga’s FarmVille, where players create characters and play cooperatively.

Putting children into these social environments raises risks of predators, privacy breaches, inappropriate conversations and bullying.

“Anybody can buy a profanity filter, but kids have all kinds of work-arounds,” says Anne Collier, co-director of connectsafely.org, which promotes the well-being of children. “There really is no substitute for human moderation.”

But not all companies can afford an in-house moderation team, or have the expertise to hire one; many prefer to outsource to Metaverse or one of a handful of similar firms, including LiveWorld and ICUC.

Second chances

Pritchard’s approach to child safety is more camp counselor than cop. When children misbehave, Metaverse moderators send the miscreant a private warning. Repeat offenders may receive a five-minute muted timeout or be ejected from a site.

“Our policy is firm forgiveness,” Pritchard says. “Sometimes kids, and adults, too, come into a new environment and feel nervous or scared, and get attention by saying something inappropriate. By giving a warning or turning it into a joke and saying, ‘Come join us,’ you’ve given them a second chance to be part of the community.”

Many companies have found Metaverse’s combination of surveillance and social direction appealing, both from a safety perspective and from a brand-management point of view.

In the NFL Rush Zone, the league’s virtual world, Metaverse avatars in striped referee shirts greet children with high-fives and hand out pigskins, the game’s virtual currency.

“For many of the kids, that conversation between their avatar and the referee is their first connection with the NFL,” says Peter O’Reilly, the league’s vice president for fan strategy and marketing. “We needed a safe space that promoted the values of the NFL and moderators who were passionate about the teams.”

Recently, when a live chat with Drew Brees, the quarterback of the New Orleans Saints, was delayed by 45 minutes, Metaverse referees pacified some 10,000 restless children with trivia contests and games, then rewarded them with pigskins for waiting patiently.

Still, outsourcing moderation does not work well for every company. Melissa Parrish, an analyst at Forrester Research specializing in interactive marketing, said a possible drawback of outsourced moderating was that “you have someone who’s not embedded in your company talking as if they are.”

To avoid losing touch with their users, some clients, like Webosaurs, insist on having a Metaverse manager working in the clients’ own offices, rather than managing the moderation remotely. “The manager sits right here, and is involved in our ongoing development efforts,” said Panis of Webosaurs, which is based in Dallas and owned by Reel FX. Employing an entire team of its own in-house moderators, he says, would not be cost-effective.

Many moderators

To staff a project, Metaverse assigns one of its 115 regular employees as manager. Managers then draw on a pool of 500 prescreened moderators around the country, many of whom are stay-at-home parents, students and others with flexible schedules.

The pool gives Metaverse quick access to moderators with expertise in a wide array of subjects, from the NFL to Harry Potter. For one project, the company had to find people to judge user-submitted rap videos for a contest sponsored by a major record label. “We needed people who knew specifically about East Coast and West Coast rap, and would recognize gang signs” so they would not be shown, Pritchard says.

Because she started Metaverse as a way to spend more time with her daughter, the company endorses a family-friendly culture, and does not require specific hours, even for its regular employees. Until a year ago, the company didn’t even have an office. Instead, the staff met regularly in its swanky virtual headquarters in Second Life London. Now, 35 employees work out of Metaverse’s brick-walled studio in Sacramento or from a small office in New York. The rest work from home.

In the last year, Pritchard has found that Metaverse’s approach to dealing with children also works for customer service, which companies increasingly provide via corporate Facebook pages, Twitter feeds or other social media forums.

Companies including Kabam, the social game developer, and Horizon DataSys, the data recovery firm, have hired Metaverse to provide online customer service. For social media support, interactions between moderators and customers occur in text, via instant messages, Facebook or e-mail. That makes these exchanges easy to monitor, says Charlene Li, founder of Altimeter, the technology research firm, and author of “Open Leadership: How Social Technology Can Transform the Way You Lead.”

“With a call center, you can only monitor about 5 percent of your calls,” she says. “Here you can monitor every single one, and if the tone isn’t quite right, you can correct it immediately.”

While customer service and children’s virtual play may seem worlds apart, both ultimately come down to respectful communication in a social environment.

“They hire us,” Pritchard says, “because we know how to have conversations when millions of people may be listening.”
