
Pings, Parley, and Pictures - How Players Communicate


GG Vol. 

23. 10. 10.

Games are inherently social. From MUDs (Multi-User Dungeons) in the late 1970s to MMORPGs in the early 1990s, playing games has been heralded as an opportunity to socialise and be social - antithetical to the “loner” gamer stereotype that is so pervasive in popular media. More recently, during the COVID-19 pandemic, games offered a pre-existing framework for keeping in touch and hanging out with friends when regions in Canada and the U.S. were facing mandatory lockdowns and curfews to stem infection rates. Many turned to their headsets and keyboards to play games and catch up with friends they could not see face-to-face. However, a caveat to being a social space is the potential for anti-social behaviours. These stem not from a lack of socialising, the typical tenet of being anti-social, but from deploying modes of communication to have a different kind of social “fun”.

So, how do players communicate in games? And not only how, but what do players communicate while playing? Games have encouraged socialisation through various communication channels, both inside and outside of the game world, as a way to organise, chat and, more often than not - to troll. The breadth of research into communication in games runs parallel to efforts to understand and unpack the age-old term “toxicity”. Both authors of this article have studied different gaming communities (Overwatch, DOTA 2, World of Warcraft, Lost Ark, and MtG Arena) to look at how and what they communicate.

Text Chat and Talking Back

The best place to start is text chat, the longest-standing way to communicate in games. It is a channel for conversation and information in MMORPGs like World of Warcraft, where players can recruit, sell, chat, and more. Historically, it was the only way to communicate in games until the introduction of voice chat, and since then it has been regarded as the more restrictive way to chat.1) Text chat evolved in response. Players generated and built their own game-specific lexicon and abbreviations to make text chat efficient enough for instantaneous conversation. A simple example can be found in League of Legends (Riot Games, 2009). When players load into their team screens, they typically head to the text chat to claim a “lane”, typing “mid”, “bot”, or “top” for the middle, bottom, and top lanes - a quick way to allocate yourself to a particular lane. A similar example of text chat conveying a message efficiently comes from MMORPGs like Lost Ark when a world boss appears (a large enemy that spawns on a particular schedule and needs multiple players to take down). In this case, a player might type that the world boss is “up” and which channel to join to fight it. Text chat can scale from the micro to the macro, from one-to-one up to an entire server’s worth of players. This reach comes with consequences.

Hate speech can be easily spread via text chat. Whether it is racist, sexist, or homophobic slurs, aimed at no one or everyone, they are regularly spotted in text chat. The problem has become so pervasive that many game companies use automated filters to block out hateful terms. The issue is compounded when players get creative in how they spell these words. Devin Connors, a community manager at Psyonix, discussed Rocket League’s language ban and chat filter system at GDC 2018 (Image 1).2) The team initially had a list of 20 bannable words, which has since grown exponentially to include misspelled variants and swears or slurs found in other languages.
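The basic mechanism behind such a filter can be sketched in a few lines. The word list and character-substitution map below are illustrative placeholders (using innocuous words), not Psyonix’s actual system, which also has to handle multiple languages and far more elaborate misspellings:

```python
import re

# Illustrative blocklist and character-substitution ("leetspeak") map, using
# innocuous placeholder words -- not the actual Rocket League list.
BLOCKED_WORDS = {"noob", "trash"}
LEET_MAP = str.maketrans("013457$@!", "oleastsai")

def normalise(word: str) -> str:
    """Collapse common character substitutions so 'tr@5h' matches 'trash'."""
    word = word.lower().translate(LEET_MAP)
    return re.sub(r"[^a-z]", "", word)  # drop punctuation used to dodge the filter

def filter_message(message: str) -> str:
    """Censor any word whose normalised form is on the blocklist."""
    return " ".join(
        "***" if normalise(word) in BLOCKED_WORDS else word
        for word in message.split()
    )
```

Normalising before matching is what lets a 20-word list catch thousands of spelling variants without listing each one explicitly.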

* Image 1 - Devin Connors presenting the Rocket League Language Ban system at GDC 2018 (author screenshot)

Blizzard manages poor sportsmanship in text chat with a more tongue-in-cheek approach. When players type “GG EZ” at the end of an Overwatch match, insinuating that beating the opposing team was no challenge, the acronym is swiftly replaced with one of many silly phrases (e.g. “It's past my bedtime. Please don't tell my mommy.” or “Gee whiz! That was fun. Good playing!”). This de-escalates a micro-moment of toxicity without having to bring in the threat of a ban for behaving in poor taste. However, even seeing GG EZ swapped for messages like these, players will know what was initially written, regardless of the polite veneer placed over it. Obviously, this type of behaviour is low on the threshold of toxicity compared to what the Rocket League team has to filter out, but it is equally present in Overwatch matches.
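The replacement mechanism itself is simple to sketch. The code below is our illustration of the pattern, not Blizzard’s implementation; the two replacement lines are the in-game examples quoted above:

```python
import random

# Our sketch of the mechanism, not Blizzard's code; the replacement lines
# are the two in-game examples quoted above.
EZ_REPLACEMENTS = [
    "It's past my bedtime. Please don't tell my mommy.",
    "Gee whiz! That was fun. Good playing!",
]

def soften_chat(message: str) -> str:
    """Swap a 'gg ez' taunt for a randomly chosen silly phrase."""
    if message.strip().lower() in {"gg ez", "ggez", "ez"}:
        return random.choice(EZ_REPLACEMENTS)
    return message  # everything else passes through untouched
```

The design choice worth noting is that the taunt is rewritten rather than blocked: the sender is not punished, but the intended sting never reaches the chat log verbatim.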

All of this focuses solely on text in text chat, ignoring the considerable use of emoticons, emojis, and stickers that we rely on in our day-to-day texting, let alone during gameplay. Emotes and emoticons have long been a staple of more complex communication systems in games, but they have become common as the sole method of inter-player communication in popular online card games like Legends of Runeterra (Riot Games, 2020), Magic: The Gathering Arena (Wizards of the Coast, 2018), and Marvel Snap (Nuverse, 2022). While Magic has text-based emotes, players are more likely to use any of the numerous animated stickers available in each of these games (Image 2).

* Image 2 - A Collection of Emotes from Magic: The Gathering Arena - Authors’ Screenshot

While it might at first appear that limiting communication to a reasonably small set of phrases and animated images would curb player toxicity, this is not the case. Players are able to do quite a lot with very little - often going beyond the intended function of these emotes. In Magic, for example, one way to greet an opposing player at the start of a match is with an emote that depicts one of the game’s characters, Gisa, waving at you using the hand of a zombie (Image 3). While this is a quirkier, possibly more fun alternative to the standard ‘hello’ emote, it can have other, more sinister uses.

* Image 3 - Gisa Waves in Magic: The Gathering Arena - Author’s Screenshot

Imagine a common scenario in a game of Magic: two players have been filling the board with creatures over several minutes, incrementally trying to beat the other. The game is a close one, with each player seeking to get just enough of an advantage with each new card played on the field. But then one of the players drops what is known as a ‘board wipe’ - a card that removes a massive number of cards from the battlefield that a player has spent an entire game establishing. And then the player who destroyed the board uses the Gisa emote, not to say hello, but to say “Goodbye to all your cards, and goodbye to all your fun.” This is but one way that players use these emotes. The important thing to take away from this example is that players develop their own use cases and interpretations of these emotes over time, and not all of them are positive and friendly, even if designers intend them to be. Some players will find a way to use them to troll players and these uses can pick up steam throughout a game community.

It isn’t just the players acting alone here, however. One final point on these emotes is that they are often riffs on popular memes. For example, one Magic emote depicts the character Saheeli eating from a bowl of popcorn, inspired by the gif of Michael Jackson eating popcorn in a movie theater and other related images of popcorn ingestion (Image 4).

* Image 4 - Saheeli the Emote and Michael Jackson Enjoying Popcorn.3)

The memes these emotes are inspired by often exist to poke fun at something - established use cases meant to turn a situation into a joke. The popcorn-eating Michael Jackson, for example, is often used when reading a lot of gossip in a forum thread, or when observing a social disaster or drama. Emotes based on these meme formats come preloaded with meaning, not often positive, and a player’s opponent is the only possible audience for the message an emote sends. While basing emotes on memes creates a shared language that makes them more easily readable as artifacts of in-game communication, they also skew towards antagonism because of the way memes make a joke out of most situations. As often as the silliness of these emotes might defuse hostile or negative feelings during play, they are just as likely to produce them because of how they are used and their established associations. This is not an argument for or against emotes, but is instead meant to highlight that the culture of communication games are nested within affects even the most limited forms of inter-player communication used in online games.

Pinging to Point and Pout

Sometimes, words and images just don’t cut it for conveying messages quickly during gameplay. Typically found in MOBAs like League of Legends or DOTA 2, “pinging” is where a player clicks on a map area, item, or character, and it lights up to notify other players. These pings can be signified with an exclamation point or question mark to draw the eye to the area.4) These quick signifiers can be used for strategising, planning a route as a team, pointing out important items for teammates to collect, or as a warning system to avoid certain areas.5)

An anecdotal example of the layers involved in pinging comes from one author’s experience playing DOTA 2 with real fellow players, rather than the AI, for the first time. During one of their first matches, they noticed that another player was pinging the area around them. Only because someone familiar with the game was supervising did it become evident that this other player was trying to get their attention, flagging that they were making an incorrect choice or were in the wrong spot for that moment in play. More could be said about the lack of a tutorial preparing a new player for all the nuanced ways players might communicate during a match, but these are community-constructed modes of communication, which are hard to cover within a game’s onboarding tutorial. Aggressive pinging, where a player spam-clicks the ping button, is often a signifier of frustration6) at whatever another player is doing. It can also be a way to distract a player when a teammate has opted to throw away the game and bother their teammates instead.

A New Player in Voice Chat Moderation

Voice chat is still a staple feature in many online games, from first-person shooters to multiplayer survival games to large-scale group play in numerous MMORPGs. Voice communication affords players more opportunities for complex sequences of expression, which are often necessary for fast-paced online play. The catch is that voice often produces in-game environments where players can say whatever they want to teammates and random players alike, which includes a substantial amount of toxicity7). Voice chat brought a more real-time way to communicate in games; a technological revolution in how players could coordinate and socialise. However, in doing so, voice chat removed a level of anonymity from players, exposing their identity (race, gender, sexuality) through what Kishonna Gray calls “linguistic profiling”8). Players are quickly reminded of these intersections of their identities through hateful terms and treatment from other players in response to using their voice. Compared to text chat, voice has been a difficult facet of online play to moderate9), no doubt due to the sheer amount of voice chat happening in games and the speed at which it occurs. It is no secret that players have requested some kind of integrated voice chat moderation, with some doing so since 2017.10)

Even though moderating voice is a lofty task, one company, Modulate, is at the forefront of this endeavor. Modulate are the creators of ToxMod, a voice moderation technology designed to help game companies identify, triage, and proactively manage instances of toxicity in the voice communications of their games. This past August, Modulate partnered with Activision to implement ToxMod in the upcoming Call of Duty: Modern Warfare III (Activision, 2023). In a conversation with Modulate COO Terry Chen, he expressed the importance of keeping in-game communication healthy:

“Our overall intent is not only to protect people that are suffering from this marginalization, but also to make gaming and its spaces more fun. I think fun is at the forefront of what we do. [...] Voice chat, which has become more critically important in esports, especially for games like Valorant, where you need [voice chat] for a tactical advantage against the enemy opponents, there’s just this level of toxicity that makes it impossible to enjoy the game that you’re trying to love, and also improve.”

To accomplish these goals, ToxMod uses machine learning to detect and rate toxicity, but it does not ban and punish users on its own. Instead, Terry views ToxMod as “a collaborator, kind of an additional player in the game that can listen and help out if necessary.” The toxicity ToxMod detects is flagged for human moderators, who make the final decision on what actions to take against players; ToxMod operates in partnership with developers and moderators to address the issue of toxic communication.
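ToxMod’s actual pipeline is proprietary, but the human-in-the-loop pattern described above - a model scores a clip and escalates it to a person rather than acting automatically - can be sketched as follows. The class names, threshold, and toy scoring function are all our own illustration:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class VoiceClip:
    player_id: str
    transcript: str  # stand-in for the audio features a real system would use

@dataclass
class ModerationQueue:
    """Clips flagged for human review -- the model never punishes on its own."""
    threshold: float = 0.8
    flagged: List[Tuple[VoiceClip, float]] = field(default_factory=list)

    def triage(self, clip: VoiceClip, score_fn: Callable[[VoiceClip], float]) -> None:
        score = score_fn(clip)
        if score >= self.threshold:
            # Escalate: a human moderator makes the final call.
            self.flagged.append((clip, score))

def toy_score(clip: VoiceClip) -> float:
    """Toy stand-in for the ML model's toxicity score."""
    return 1.0 if "hate" in clip.transcript else 0.1
```

The point of the pattern is the division of labour: the model filters the firehose of voice chat down to a reviewable queue, and accountability for any ban stays with a person.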

To close this article, I asked Terry, as someone on the front lines of addressing toxicity, what more could be done by companies and players alike to work towards a solution. We’ve seen how toxicity is common across each of the in-game communication mechanisms we’ve explored, so what are we to do that isn’t already being done? Terry offered two important solutions: 1) listening to players from across a game’s player base rather than focusing on the needs of the most skilled or highest-profile players, as there is valuable feedback from more than the pros. In fact, most players are not playing at the highest skill levels and can provide a lot of valuable information about what is happening throughout the most densely populated segments of a game. 2) Thinking about detecting and rewarding positivity. According to Terry, “The truest action would be implementing tools, whether it’s Modulate ToxMod, whether it’s something developed internally to detect bad behavior, but also reward positive behavior.” As players, developers, and researchers, we are so confronted with the negative aspects of in-game communication that we take our eyes off the players setting good examples. More work should go into refining and implementing systems that encourage positive interaction - not just mechanically, but in the ways we communicate with one another in-game. On the other side of the avatar, we are real people after all.

As we can see, there are two important facets to consider when discussing communication and social systems in games. The first is how to moderate them - a daunting task that requires people power, tech power, and clear guidelines to enact any form of governance in a social space. When a competitive angle is introduced to gameplay (whether as an aspiring pro player or as a player who simply enjoys competing), the stakes go up, and there is more on the line for players to care about. In the same space, we have players who are just there for the vibes: to play with their friends, regardless of the outcome (though they would like to win). The second facet is “trolling” and toxicity via all these different modes of communication. Players will find ways to get creative with any system, to subvert it to their own wishes and enact toxicity however they see fit. Ultimately, though, it goes back to the very start of this article: games are inherently social. Most players do not go online with the sole aim of being toxic, but rather to join the collective and have a good time with others who enjoy the same play space.

Turning to the future of communication in games, voice chat has recently become somewhat fragmented with the success of Discord. Many players have shifted their voice communication from dedicated game servers to their own curated community Discord servers, where they hang out with friends instead of strangers in a game lobby. Though some game-specific Discord servers exist, they are less for communicating during play and more for marketing and building a community around the game. Virtual reality could yet revolutionise how players embody communication during play, though right now players are represented by awkward half-bodied avatars, with equipment that is nausea-inducing for some. There is potential on the horizon, and yet one thing is certain - where there is a will, there is a way to subvert social spaces.


*For more on Modulate, you can visit their website.

1) 2) Wadley, G., Carter, M., & Gibbs, M. (2015). Voice in Virtual Worlds: The Design, Use, and Influence of Voice Chat in Online Play. Human–Computer Interaction, 30(3–4), 336–365.

3) Saheeli image from (accessed September 24th, 2023). Michael Jackson image from

4) Leavitt, Alex, Brian C. Keegan, and Joshua Clark. ‘Ping to Win? Non-Verbal Communication and Team Performance in Competitive Online Multiplayer Games’. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 4337–50. CHI ’16. New York, NY, USA: Association for Computing Machinery, 2016.

5) Wuertz, Jason, Scott Bateman, and Anthony Tang. ‘Why Players Use Pings and Annotations in Dota 2’. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 1978–2018. CHI ’17. New York, NY, USA: Association for Computing Machinery, 2017.

6) ibid.

7) Reid, Elizabeth, Regan L. Mandryk, Nicole A. Beres, Madison Klarkowski, and Julian Frommel. “‘Bad Vibrations’: Sensing Toxicity From In-Game Audio Features.” IEEE Transactions on Games 14, no. 4 (2022): 558-568.

8) Gray, K. L. (2014). Chapter 3 - Deviant Acts: Racism and Sexism in Virtual Gaming Communities. In K. L. Gray (Ed.), Race, Gender, and Deviance in Xbox Live (pp. 35–46). Anderson Publishing, Ltd.

9) Märtens, Marcus, Siqi Shen, Alexandru Iosup and Fernando Kuipers. “Toxicity Detection in Multiplayer Online Games.” Proceedings of the 2015 International Workshop on Network and System Support for Games (NetGames). 03-04 December, 2015, Zagreb, Croatia, 1-6.

10) Blamey, Courtney. ‘One Tricks, Hero Picks, and Player Politics: Highlighting the Casual-Competitive Divide in the Overwatch Forums’. In Modes of Esports Engagement in Overwatch, edited by Maria Ruotsalainen, Maria Törhönen, and Veli-Matti Karhulahti, 31–47. Cham: Springer International Publishing, 2022.


ping, voicechat, MTG, emote, toxicity


Courtney is a Communication PhD student and game designer at Concordia University, Montreal, Canada. Her doctoral research concentrates on the process of meaning-making in games tackling serious themes and exploring this relationship between player and designer in her own critical game design process. Her previous research unpacked Blizzard’s approach to community moderation in Overwatch by investigating both developer and community inputs on forums. She is a member of the mLab, a space dedicated to developing innovative methods for studying games and game players and TAG (Technoculture Arts and Games).


Marc is a PhD candidate in Concordia University's department of communication studies in Montreal, Canada. Marc’s research focuses on toxicity in online games. He is driven to understand toxic phenomena in order to help create more positive conditions within games with the ultimate hope that we can produce more equitable and joyful play experiences for more people. He has published on the Steam marketplace and DOTA 2, and is a co-author of the upcoming Microstreaming on Twitch (under contract with MIT Press).
