Club Penguin is a virtual world for children, designed from the outset as a safe online space, and now has more than 200 million members. Netopia had a chat with Club Penguin’s European director Lucy Woodward.
What is Club Penguin’s strategy for protecting children online?
– We have a positive approach to internet safety: we want children to be equipped to get the best out of the Internet. Many of our players are six- and seven-year-olds, and these are formative years. Club Penguin might be their first online club, so we are a training ground for them to test their online skills. Kids are able to test the boundaries and make mistakes. The consequences of breaking the Club Penguin rules are minor, but the lessons learned from getting a 24-hour ban for swearing, or losing some igloo items as a result of sharing their password, can be taken anywhere online. It’s like birds in a nest: we prepare the children so they can fly on their own. We want them to learn enough to understand user responsibility, for example.
– We also work with parents to provide them with conversation starters that they can build on as their child gets older. We want it to be instinctive for them to talk about what their children are doing online, what games they are playing and who they are playing with, so their children feel comfortable saying when something has gone wrong. Just as we teach our children to cross the road, we as parents need to know how to teach our children to stay safe online.
Is there a conflict of business success versus responsible conduct as a service provider?
– Child safety is at the heart of everything we do. Club Penguin was founded by three Canadian dads who wanted a safe and fun place online for their children. Safety was built into the design from the start. When kids feel safe they are more likely to have fun. We want to be a place where children want to come and play and where parents are happy for them to do so.
– Online safety should be the first priority for anyone who operates a space for children online.
What are your best methods to achieve these goals?
– We use a mix of technology and human moderation. We have developed our own bespoke whitelist moderation technology. In simple terms, this means that rather than blocking words that are not allowed, we build the community’s vocabulary from scratch: unless we’ve put a word in our dictionary, it won’t get through our filter system. On top of this, we add another layer that is a bit like predictive text, but for sentences. This allows only certain combinations of words to be said together. It means players can chat easily in game but are not able to use offensive language, be mean or give out personal information. This is all underpinned by our moderators. We have over 200 in different sites around the world who help us monitor the kids in Club Penguin and foster a fun and friendly environment, but who also make sure that as language changes, the vocabulary in our filtering system reflects this. We are always working on being one step ahead.
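The two-layer approach described here can be sketched in code. This is a hypothetical illustration only: Club Penguin’s actual system is proprietary, and the word list, allowed pairs and function name below are invented for the example.

```python
# Minimal sketch of a two-layer whitelist chat filter, loosely modelled
# on the approach described above. All vocabulary here is illustrative.

# Layer 1: only words explicitly added to the dictionary may appear.
ALLOWED_WORDS = {"hi", "hello", "lets", "play", "at", "my", "igloo", "party"}

# Layer 2: like predictive text for sentences — only certain word
# combinations (modelled here as adjacent pairs) may be said together.
ALLOWED_PAIRS = {
    ("lets", "play"), ("play", "at"), ("at", "my"),
    ("my", "igloo"), ("igloo", "party"),
}

def message_allowed(message: str) -> bool:
    """Return True only if every word and every adjacent word pair
    is on the whitelist."""
    words = message.lower().split()
    if any(w not in ALLOWED_WORDS for w in words):
        return False  # blocked: contains a word not in the dictionary
    return all(pair in ALLOWED_PAIRS for pair in zip(words, words[1:]))

print(message_allowed("lets play at my igloo"))   # allowed
print(message_allowed("my phone number is 555"))  # blocked: unknown words
```

Because the filter permits rather than forbids, anything not anticipated — slang, misspelled swearing, phone numbers — is blocked by default, which is why the moderators’ job of keeping the vocabulary current matters so much.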
– When you make children feel safe, it lets their creativity and expressiveness flourish. We have to constantly update the content and introduce new things; they get bored very fast. We also look for country-specific opportunities for Club Penguin to feel locally relevant to children and parents in what is a global game.
What if someone hacks a player profile and uses it to spread abuse?
– Such language would not get through our filters. These things happen; kids share passwords. But if we see unusual activity, we reset passwords instantly. And we’re always on the other end of the phone.
This sounds like censorship?
– It’s deemed positive by both users and parents. We have moderators playing the game; they are present in the environment and contribute to a good climate. It’s the combination of technology and people that is key to success.