Introduction
Play an online game and you can feel your blood pressure rise at the sheer frustration of bad sports.
A game is only as strong as the community that supports it. But what happens when a few bad apples spoil the fun and prevent others from enjoying it? Most players have a story of being griefed or team-killed, or worse, of another player berating them in a way that goes far beyond “trash talk.” In fact, a recent study by the anti-bullying organization Ditch the Label found that 57 percent of the teenagers surveyed had experienced bullying online while playing games. Even more alarming, 22 percent said it caused them to stop playing. Instead of bringing people together, are these unpleasant social interactions driving more people away from games?
Negative experiences in online play are nothing new. Look back to the days of early commercial MMORPGs such as EverQuest and Ultima Online and you’ll find plenty of examples. A common perception among gamers was that it simply comes with the territory if you want to play online, but that doesn’t make it okay. Play should bring people together, and as players we all know how powerful those experiences can be. Nobody should have to tolerate hate speech or threats to their safety simply to enjoy their hobby online.
The problem has intensified as more and more games become online-centric. The growing emphasis on their social aspects has forced developers to get creative in encouraging players to “play nice.” With more initiatives and efforts emerging in this area, we talked to industry leaders, from developers working on solutions to companies that specialize in moderation, to gain insight into this ever-growing and complex topic.
Riot has experimented with many different design tactics to address League of Legends’ problems
It’s about disruption
The word “toxic” has become closely associated with online games, used to describe problematic, negative players who go out of their way to make the experience unpleasant for others. Maybe it’s a player intentionally throwing a match in Dota 2, or spamming insults in League of Legends chat to make someone feel bad about their abilities. Many developers refer to this as “disruptive behavior,” the preferred term when talking about this type of player.
Regardless of the phrasing, it all boils down to one thing: these players hinder the way the game is meant to be experienced. Every developer we talked to commented on this and explained why it’s so damaging. “It’s in everyone’s best interests to make the game a fun and enjoyable experience, because people go to these games because they want to have a fun time,” said Scott Mercer, principal designer on Overwatch.
This also extends to keeping players invested in a game. If something doesn’t feel fun or enjoyable, why stay? Dave McCarthy, head of operations at Xbox, uses a simple comparison to show how important it is for these digital spaces to feel safe and welcoming: “I think it’s as simple as: would you stay in a physical space where you’re harassed or made to feel unwelcome by certain imagery or language used there? No, of course you wouldn’t. You’d physically leave that room. The same applies to the digital space.”
Overwatch
When players log into a game, they pick up on social norms to get a sense of what’s acceptable. Is it a relaxed, fun atmosphere? Is it full of serious competitors who mean business? That’s why it’s extremely important for games and services to set the tone early. Chris Priebe, founder and CEO of Two Hat Security, a company that provides moderation solutions, says a community’s identity is set from day one. That’s why it’s so important for the people behind the games to build and shape the culture. “When people launch a game, they have to think about how they’re going to build the community and bring people into it. I think too often in the games industry, a game launches and the culture is left to shape itself.”
Priebe discussed how moderation and chat features are often considered too late in development, without much thought about how the community should be shaped. He likened it to throwing a party, and how it takes shape based on the tone you set. “If you don’t set a tone, it can go very, very badly,” says Priebe. “That’s why people have bouncers at the front door. Somehow we don’t think we need to put bouncers at the front door of games, and then we wonder why things go so badly wrong.”
That may all sound daunting, but in recent years Priebe says he has seen an increased effort to change things. People in the industry are working hard to find answers, whether through more transparent policies, better moderation tools, or in-game design solutions. It all comes with time, though, and the experience of using the community as a testing ground.
Blizzard recently introduced role queue to curb disputes over team composition in Overwatch
The mission of the Fair Play Alliance
As developers search for solutions, it’s clear that collaboration will be an enormous tool going forward. Enter the Fair Play Alliance, a global coalition of gaming companies that aims to “unlock the best possible online experiences for players around the world.” More than 120 companies are represented, a strong mix of key players in the industry including Blizzard, Mixer, Roblox, and Epic Games. “The [Fair Play] Alliance is about making a lasting difference in game design,” says co-founder Carlos Figueiredo, who also serves as director of community trust and safety at Two Hat Security.
With the goal of making games “more positive and productive experiences,” the Fair Play Alliance wants to identify problems before they occur. “Something that folks can overlook is the way you design your game and how that can add to the experience,” says Figueiredo. “The in-game environment can influence negative behavior. A lot of what the alliance is about is really thinking about game design and how it can be used to facilitate and encourage these positive interactions.” The aim is to better understand players’ needs and ensure that online games are a positive experience for all. For more information, visit fairplayalliance.org.
The learning process
The more people we talked to about this topic, the clearer it became how complicated and difficult an issue it is. Most companies are experimenting with different features or tools to find out what works, and some are still deciding where to draw the line between “okay” and “not okay.” “It’s difficult to make these decisions because it’s so subjective,” says Weszt Hart, head of player dynamics at Riot Games. “What’s toxic to you may not be toxic to others. Trash talk might be toxic to some people; to others, it’s just how we are with our friends.”
Hart works on League of Legends, a team-based game with a reputation for a toxic community. He says it’s been a challenge for the team to figure out which areas to focus on to solve these problems. To find out what the community considered “good” and “bad,” Riot introduced the now-defunct Tribunal, where players could log in and review cases to determine whether an offender should be punished or pardoned. Later, Riot tried to foster more positive interactions by implementing the honor system, which let you recognize teammates you thought had done a good job. “But then we realized that all of these systems were put in place retroactively, after the games,” Hart explains. “They didn’t help prevent potential disruption, so we had to find out where the problems were actually occurring, maybe even before the games.”
Rainbow Six Siege
Enter Team Builder. “Team Builder dealt with addressing what I suppose was a flaw in our design,” says Hart. “As the community developed, the concept of a meta developed. Players were telling us how they wanted to play, and the system didn’t recognize their intent. In an effort to play the way they wanted, they’d call out their preferred role in chat. We needed to find a way for the system to help players play the way they wanted.” Riot created Team Builder so that games would get off to a better start, with fewer players entering matches already frustrated, reducing the possibility of negative interactions.
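As a rough illustration of the concept, here is a minimal sketch of role-declared matchmaking: one queue per role, so a match only forms once every role is willingly filled. The role names and structure are our own illustrative assumptions, not Riot’s actual implementation.

```python
# Illustrative sketch of role-declared matchmaking (not Riot's code).
from collections import deque

ROLES = ("top", "jungle", "mid", "bot", "support")  # assumed role names

class RoleQueue:
    def __init__(self):
        # One queue per declared role, capturing intent before the match.
        self.queues = {role: deque() for role in ROLES}

    def enqueue(self, player_id: str, role: str) -> None:
        self.queues[role].append(player_id)

    def try_form_team(self):
        # A team forms only when every role has a willing player, so nobody
        # has to fight over roles in chat after the match starts.
        if all(self.queues[role] for role in ROLES):
            return {role: self.queues[role].popleft() for role in ROLES}
        return None

queue = RoleQueue()
for i, role in enumerate(ROLES):
    queue.enqueue(f"player{i}", role)
print(queue.try_form_team())  # one player per role
```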
While Riot isn’t the first to deal with players behaving badly, the influence of its systems can be seen across the industry. Take Blizzard’s cooperative shooter Overwatch, for example. Launched in 2016, Overwatch was considered to have one of the more positive communities, but it still grappled with its share of problem players, something game director Jeff Kaplan frequently had to address in his developer update videos. Kaplan eventually put it in a nutshell: “Our philosophy at the highest level is that if you’re a bad person doing bad things in Overwatch, we don’t want you in Overwatch.”
Overwatch has seen several improvements since launch: better reporting tools, an endorsement system to encourage positive behavior, and most recently, role queue, which eliminated extra frustrations and squabbles over team composition. The latter two are reminiscent of League’s honor system and Team Builder.
Overwatch is far from Blizzard’s first foray into the world of online gaming, so the team expected some problems, but it was also breaking new ground. “I don’t think we expected exactly what happened after launch,” says senior producer Andrew Boyd. “I know we were dealing with a lot of new things. I think this was one of the first games where we really dealt with voice as an integral part of the game, and that changed the landscape a lot. That said, when we saw what was happening, it quickly became very important for us to address these issues, and we began taking steps to make the game a better place for people.”
While developers can try to spot potential problems in advance, most of the time they don’t really know what they’re facing until the game is live. Ubisoft Montreal experienced this first-hand with Rainbow Six Siege, forcing the company to combat bad behavior and get creative with its solutions. A player behavior team was formed to “promote the behavior we want to see in the game,” says community developer Karen Lee. To that end, the team worked on the Reverse Friendly Fire (RFF) system to curb team-killing. “RFF was first developed to curb the effects of players abusing the game’s friendly-fire mechanics,” Lee says.
Rainbow Six Siege’s developer updates openly discuss toxicity and upcoming solutions
With RFF, if you try to hurt an ally, the damage is reflected back onto you. Ubisoft has since iterated on the system to make sure it works across all the different operators and their gadgets. Before a new operator goes live, the player behavior team vets them and tries to figure out how the community might misuse their kit for griefing. “We also have weekly and monthly reports that go out to the entire team,” Lee explains. “These help everyone assess the health of the community, and we highlight the key concerns of the week.”
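Mechanically, the idea is simple to picture. The sketch below shows damage reflection under our own assumptions (flat damage, two teams); it is not Ubisoft’s actual implementation, which also layers on per-operator and per-gadget review.

```python
# Illustrative sketch of a reverse-friendly-fire mechanic (not Ubisoft's code).
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    team: int
    health: float = 100.0

def apply_damage(attacker: Player, target: Player, amount: float) -> None:
    """Apply damage, reflecting friendly fire back onto the attacker."""
    if attacker.team == target.team and attacker is not target:
        # Reverse friendly fire: the would-be team-killer takes the hit.
        attacker.health = max(0.0, attacker.health - amount)
    else:
        target.health = max(0.0, target.health - amount)

# Example: a team-kill attempt hurts only the attacker.
griefer = Player("griefer", team=1)
teammate = Player("teammate", team=1)
apply_damage(griefer, teammate, 40.0)
assert teammate.health == 100.0 and griefer.health == 60.0
```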
Many different gaming companies and organizations have joined forces (see the Fair Play Alliance sidebar) to share ideas and work toward change. Although developers have learned a lot about what works and what doesn’t, there is no single solution, because every game is different, whether in audience or genre. “The problem space is too big to look at a specific feature and say, ‘That’s how you do it,’” says Hart. “There are no best practices yet for what we call player dynamics, the design discipline for the interactions and motivations between players. Depending on your game and genre, some things may work better than others.”
Adam Tanielian, senior director of EA’s community team, speaks at the Building Healthy Communities Summit at this year’s EA Play
The streaming example: gaming and Mixer
Mixer has become a major force in the streaming world, boosted not least by its partnership with Tyler “Ninja” Blevins, who has already gained over 1 million followers on the platform. Gaming and streaming are now so closely linked that many people discover games through these platforms. That means the need for moderation, for both streamers and their viewers, keeps growing.
Mixer came onto the scene in 2016, back when it was called Beam, and it has taken a stand against toxicity, promising better tools to help fight it. Mixer’s whole tone is much friendlier and more welcoming than the competition’s. General manager Chad Gibson credits Mixer’s focus on community from day one, and its constant thinking about ways to foster it. “If there was one thing that probably had the most profound impact, it was transparency and consistency,” he says. That means making the rules about what’s acceptable as clear as possible and constantly striving to improve them.
Mixer recently introduced a new system called Toxicity Screen that lets streamers determine which words and types of communication are allowed in their chat. Streamers can also tweak it, making it more restrictive for new members and looser for long-time viewers with whom they’ve built trust. “It’s important that we give streamers the ability and the tools to drive the growth they want,” says Gibson.
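One way to picture such a system is as a per-tier blocklist that loosens as a viewer earns trust. The tier names and word lists below are illustrative assumptions on our part, not Mixer’s actual Toxicity Screen.

```python
# Illustrative sketch of a trust-tiered chat filter (not Mixer's code).
BLOCKED_BY_TIER = {
    # Newcomers face the strictest filter; trusted regulars the loosest.
    "new":     {"slur", "insult", "spamlink.example"},
    "regular": {"slur", "insult"},
    "trusted": {"slur"},
}

def allow_message(message: str, viewer_tier: str) -> bool:
    """Return True if the message passes the filter for this viewer's tier."""
    blocked = BLOCKED_BY_TIER[viewer_tier]
    return not any(word in blocked for word in message.lower().split())

# The same message can be blocked for a newcomer but allowed for a regular.
print(allow_message("that was an insult", "new"))      # False
print(allow_message("that was an insult", "trusted"))  # True
```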
Gibson believes that Mixer speaking openly against toxicity has only helped it reach the kind of community it wants to foster. “We want it to be really clear what’s allowed on the platform, and the more consistent that is, the better our streamers can steer their communities in that direction.”
Bring power to the community
Building healthier communities isn’t just on developers and publishers. Sure, designing different mechanics and improving moderation tools are steps in the right direction, but these efforts also need the community’s help to succeed. It makes sense: the people who play your game make it what it is and know it best. Because of this, more and more developers and companies are relying on their communities to provide feedback and to self-moderate by reporting bad player behavior. “Everyone has to be involved,” says Priebe. “Players have to say, ‘Look, I’m sick of it.’” Priebe is quick to point out that he thinks most players already feel this way, but they need to put their foot down more and be vocal to help shift the culture. “It’s going to take players saying, ‘No, that’s not cool. You can’t be in our guild unless you show good sportsmanship,’” he says.
Many believe the community should be as involved in this process as it is when giving feedback on game betas. “We need to go to our players and say, ‘What do you think?’ Just like when we develop our games,” says Adam Tanielian, senior director of the EA community team. “We think the same idea should apply to our communities. How do we keep them healthy? And what tools do we build?” EA recently hosted a summit dedicated to building healthier communities to get feedback from players and developers. The result was a “game council” that, according to Tanielian, meets regularly, similar to the councils EA runs for its various franchises, but focused on feedback about tools, policies, and how EA classifies toxicity. “We know we need to take action,” says Tanielian. “We can’t just talk about it and do nothing. Some things take longer than others, but there are always things we can do, always areas we can address.”
Most platforms, such as the Xbox One (pictured), have parental controls to help parents manage their child’s online experience
Many of the people we talked to discussed how important it is for reporting tools to be user-friendly, so that players are actually encouraged to use them. If the tools are hard to find, require visiting a separate website, or are needlessly complex, developers and moderators simply don’t get the valuable information they need. Reporting also tells developers what the community values. “The community itself does, in a way, drive what’s good and what’s not good for them in terms of communication and gaming experience,” says Mercer. “I think the most important thing about reporting is that it lets the community help police itself and decide what it does and doesn’t find acceptable.”
Players often feel more encouraged to come forward when they know it will actually lead to change. Sure, it helps to give players the option to mute or block whoever is rubbing them the wrong way, but when the Overwatch team began following up on reports and letting players know action had been taken, it noticed an increase in reporting. “It was important to build that trust,” says Mercer. “To say, ‘Hey, as a member of the Overwatch community, you’re part of the solution to the problem of players behaving badly in the game.’”
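That feedback loop is easy to sketch: track reports, and when action is taken, notify everyone who reported the offender. The data structures below are our own illustrative assumptions, not Blizzard’s actual system.

```python
# Illustrative sketch of a report-and-acknowledge loop (not Blizzard's code).
from dataclasses import dataclass

@dataclass
class Report:
    reporter: str
    offender: str
    reason: str
    resolved: bool = False

class ReportTracker:
    def __init__(self):
        self.reports: list[Report] = []
        self.outbox: list[tuple[str, str]] = []  # (recipient, message)

    def file(self, reporter: str, offender: str, reason: str) -> None:
        self.reports.append(Report(reporter, offender, reason))

    def resolve(self, offender: str, action: str) -> None:
        # Close every open report against the offender and thank each
        # reporter, closing the loop that builds trust in the system.
        for report in self.reports:
            if report.offender == offender and not report.resolved:
                report.resolved = True
                self.outbox.append((report.reporter,
                    f"Thanks for your report: action was taken ({action})."))

tracker = ReportTracker()
tracker.file("alice", "griefer42", "abusive chat")
tracker.resolve("griefer42", "chat suspension")
print(tracker.outbox)  # alice is told her report led to action
```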
Xbox lets players search for other players with similar goals to group up with in games
Self-moderation is certainly key to getting problem players out of games, but Microsoft saw an opportunity to go a step further. For those who just want to play or chat with like-minded people, Microsoft created clubs, online meeting spaces on Xbox One that bring together people with similar values, interests, and goals. According to McCarthy, Microsoft has seen great success in this area. “We’ve found that strong communities not only provide safe spaces and norms, but also a degree of self-governance,” he explains.
Microsoft has also used clubs as a testing ground for new moderation features, McCarthy says. One of Xbox’s long-term goals is to give players more choice and control over their experience. “What I mean by that is ultimately putting the knobs and controls in your hands, so you can decide: hey, I want to filter out things that count as annoying messages. Or, to be silly about it, I want to filter out the word ‘peanut butter’ and never see the word ‘peanut butter’ again.”
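A client-side version of those “knobs and controls” might look like the sketch below: each player keeps a personal blocklist that masks matching words before a message is displayed. The API here is an illustrative assumption, not Xbox’s actual feature.

```python
# Illustrative sketch of a player-configurable message filter (not Xbox's code).
import re

class PersonalFilter:
    def __init__(self, blocked_words: set[str]):
        # One compiled pattern per user; multi-word phrases work too.
        self.pattern = re.compile(
            "|".join(re.escape(w) for w in blocked_words), re.IGNORECASE
        )

    def clean(self, message: str) -> str:
        """Mask any blocked word before the message is displayed."""
        return self.pattern.sub("***", message)

my_filter = PersonalFilter({"peanut butter"})
print(my_filter.clean("Who wants a peanut butter sandwich?"))
# -> "Who wants a *** sandwich?"
```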
Involving the community and providing moderation tools are steps in the right direction, and it’s encouraging that more companies are showing their communities ways to help. After all, this problem is too big for anyone to tackle alone, and it only grows more complex as games get bigger and become ever more social.
Build a better future
The industry won’t improve unless it keeps looking for new solutions, and many companies recognize that more needs to be done as the technology grows. “This has to become a solved problem,” says Priebe. “Because games are increasingly voice-driven, and they depend on cooperation more than ever. People realize that it’s your friends being there that makes games social.”
While game developers are lagging behind in this area, there’s plenty of hope for the future. “What we’re seeing in gaming is more of a cultural shift over the past 10 years, and it’s up to us to react faster than we have in the past to stay ahead,” says Rainbow Six Siege community developer Craig Robinson. “Right now we’re catching up, and that’s not where we need to be to get toxicity under control. I expect there will be a lot of improvement across the industry over the next five to 10 years, especially as the various publishers and developers share their findings and learnings through the Fair Play Alliance.”
Minecraft, the best-selling game of all time, is available on 20 different platforms, making community moderators and parental controls essential
Many of the people tackling this topic are already thinking ahead. At present, moderation relies largely on reactive measures. The problem is that by then, the damage has already been done. Many want to be more proactive, anticipating problems before they occur. “I think one of the biggest challenges is staying stuck in the way we’ve done things before,” says Hart. “The social needs of players on the internet are growing. We have to look at our games differently. We have to be much more proactive. If we wait until a game is out to think about how people in that game can interact with each other, we’re already behind, because then we have to retrofit systems into an existing game instead of proactively designing to reduce disruption and support these successful interactions. In short, we have to go from punitive to proactive.”
Thankfully, the technology is only getting better, and many are optimistic that A.I. will be a great addition to moderation in the future. “A.I. is something that could really make a difference in how we can moderate and how we can scale that across the community,” says Tanielian, adding that companies like Microsoft have already invested in this area. “These models get really good as they’re trained on more and more data,” McCarthy says, pointing to PhotoDNA as an example, “where we tag certain images and actually share that database with a large number of other companies. In my opinion, industry collaboration is really important here, because as we start to share some of these models and learnings, they become more sophisticated and accurate, and can actually help a larger number of users overall. That’s something we need to keep focusing on: how do we properly use such powerful technologies, and train them industry-wide to do the things we expect of them?”
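The image-database idea McCarthy describes can be sketched as hash-and-share: flagged images are fingerprinted, and any participating service checks uploads against the shared set. The details below are illustrative; real systems such as PhotoDNA use robust perceptual hashes rather than the plain cryptographic hash used here.

```python
# Illustrative sketch of a shared image fingerprint database (not PhotoDNA).
import hashlib

shared_database: set[str] = set()  # fingerprints contributed by many companies

def fingerprint(image_bytes: bytes) -> str:
    # A perceptual hash would survive resizing/re-encoding; SHA-256 is a
    # stand-in here purely to keep the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def tag_image(image_bytes: bytes) -> None:
    """A moderator flags an image; its fingerprint joins the shared set."""
    shared_database.add(fingerprint(image_bytes))

def is_known_bad(image_bytes: bytes) -> bool:
    """Any participating service can screen uploads against the set."""
    return fingerprint(image_bytes) in shared_database

tag_image(b"offending-image-bytes")
print(is_known_bad(b"offending-image-bytes"))  # True
print(is_known_bad(b"harmless-image-bytes"))   # False
```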
These are big questions, but ones we can’t leave unanswered as we spend more and more of our time in these online spaces.
This article originally appeared in the November 2019 issue of Game Informer.