
Online Toxicity


Sea of Thieves team put anti-toxicity strategies into game's core design

Part of Rare's decision to build anti-toxicity measures directly into Sea of Thieves' core design can be seen almost immediately upon playing.

Players have a brig aboard their pirate ship, for example. Jerome Hagen, a user researcher at Microsoft, said that feature was one of many implemented to let players deal with others griefing their game. Hagen spoke about design strategy at GDC 2018 today in a panel on empowering players and combating multiplayer toxicity. The concept came down to giving players more control over their game-playing experience, while reinforcing the notion that disruptive behavior won’t be accepted by the Sea of Thieves community.

This helps alleviate the stress that players often have over reporting another player, according to research presented by Microsoft at the panel. The brig is the most interesting example of this.
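The brig boils down to a small piece of crew-vote state. Below is a minimal sketch of that idea in Python; the names (Crew, vote_to_brig) and the majority rule are illustrative assumptions, not Rare's actual implementation. What it captures is that a player is confined only when most of the rest of the crew agrees, which keeps the decision with the players rather than with an automated system.

```python
# Minimal sketch (not Rare's code) of a crew-vote "brig" mechanic:
# a disruptive crewmate is confined only when a majority of the
# remaining crew votes for it.

from dataclasses import dataclass, field

@dataclass
class Crew:
    members: set[str]
    brig_votes: dict[str, set[str]] = field(default_factory=dict)  # target -> voters
    brigged: set[str] = field(default_factory=set)

    def vote_to_brig(self, voter: str, target: str) -> bool:
        """Record a vote; confine the target once a majority of the other crew agrees."""
        if voter == target or voter not in self.members or target not in self.members:
            return False
        votes = self.brig_votes.setdefault(target, set())
        votes.add(voter)
        needed = (len(self.members) - 1) // 2 + 1  # majority of crew excluding the accused
        if len(votes) >= needed:
            self.brigged.add(target)
        return target in self.brigged

    def release(self, target: str) -> None:
        """Let the crew release a player, e.g. after behavior improves."""
        self.brigged.discard(target)
        self.brig_votes.pop(target, None)

# Example: in a 4-player crew, 2 of the 3 other members must agree.
crew = Crew(members={"ann", "bo", "cy", "griefer"})
crew.vote_to_brig("ann", "griefer")
print(crew.vote_to_brig("bo", "griefer"))  # True: majority reached, player is brigged
```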

Alphabet Has a New Tool to Weed Out 'Toxic' Online Comments

A research team tied to Google unveiled a new tool on Thursday that could have a profound effect on how we talk to each other online. It's called "Perspective," and it provides a way for news websites and blogs to moderate online discussions with the help of artificial intelligence. The researchers believe it could turn the tide against trolls on the Internet, and reestablish online comment forums (which many view as cesspools of hatred and stupidity) as a place for honest debate about current events. The Perspective tool was hatched by artificial intelligence experts at Jigsaw, a subsidiary of Google's holding company Alphabet that is devoted to policy and ideas.

The significance of the tool is that it can decide if an online comment is "toxic" without the aid of human moderators.
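In practice, a site integrates a scoring service like this by sending each comment to an endpoint and acting on the returned toxicity probability. The sketch below assumes Perspective's commentanalyzer endpoint, its TOXICITY attribute, and a placeholder API key; the exact field names should be verified against the current API documentation.

```python
# Rough sketch of scoring a comment with a Perspective-style API.
# Assumptions: the v1alpha1 comments:analyze endpoint, the TOXICITY
# attribute, and a placeholder API key -- check the current docs.

import requests

API_KEY = "YOUR_API_KEY"  # placeholder
URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def toxicity_score(text: str) -> float:
    """Return a 0..1 probability-style toxicity score for a comment."""
    body = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    resp = requests.post(URL, params={"key": API_KEY}, json=body, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    return data["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# A site could hold high-scoring comments for human review instead of
# publishing them automatically.
if toxicity_score("you are an idiot") > 0.8:
    print("queue for moderation")
```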

How One Twitch Channel Is Fighting Abuse in the Gaming Community - Waypoint

Twitch, the video game streaming service, has enjoyed a swift and seemingly unshakeable rise since its start in 2011. It has already become an established presence in online streaming and gaming communities, yet that success hasn't made it immune to the familiar toxicity problems that plague other aspects of gaming culture. Women, LGBTQ folk, and people of color are often subjected to a deluge of discriminatory abuse via Twitch chat that is at best disgusting and at worst potentially life-threatening. While Twitch is working to find a solution, current practices aren't proving terribly effective. Misscliks, a Twitch channel founded by four women with prominent backgrounds in eSports and gaming, is hoping to end the abuse with a different approach to community management.

VICE interviewed Misscliks co-founder Anna Prosser Robinson about the organization's efforts and challenges.

Google studies how to automatically track down hateful comments

A study also reveals that a small number of Internet users can be responsible for nearly 10% of the hateful comments on a given site. The Wikimedia Foundation and Jigsaw, the incubator of Alphabet (Google's parent company), published on February 7 the results of a study conducted over the past year. Its goal was to "better understand the nature and impact of harassment" in order to develop effective "technical solutions" to limit it. To that end, no fewer than 100,000 comments posted on Wikipedia in 2015 were analyzed by human reviewers, who had to flag the hateful remarks and sort them into three categories: personal attacks ("you suck"), attacks aimed at a third party ("he sucks"), and indirect attacks ("someone said Bob sucks").

Thanks to what is described as "the largest public database" on the subject, an algorithm has been developed. A third of harassers are not anonymous.
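The pipeline the study describes (human-labeled comments used to fit a classifier) can be sketched roughly as follows. The file name, column names, and model choice are illustrative assumptions, not the Wikimedia/Jigsaw code; character n-grams are a common choice here because they survive the creative misspellings harassers use to dodge word filters.

```python
# Rough sketch of training a text classifier on human-labeled comments.
# The CSV, its columns, and the model are illustrative assumptions,
# not the Wikimedia/Jigsaw study's actual setup.

import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical file: one row per comment, label 1 = contains a personal attack.
df = pd.read_csv("labeled_comments.csv")  # columns: "comment", "attack"

X_train, X_test, y_train, y_test = train_test_split(
    df["comment"], df["attack"], test_size=0.2, random_state=0
)

# Character n-grams cope better with deliberately misspelled insults.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 5)),
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```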

'Why we won't just log off': Online harassment in the game industry

A remarkable scene unfolded at the Melbourne Convention and Exhibition Centre this past Saturday. Amid a panel about online harassment where two women had told heart-wrenching stories about their own experiences, a man in street clothes--who had conferred with the moderator in whispers but moments before--took the stage and availed himself of an empty seat. "I'm Nobody," he said by way of introduction before holding up a badge that caught the light, identifying him as a member of an Australian police force (I saw it up close afterwards and can confirm its authenticity). But before I get to his remarks, it would be inappropriate if I allowed him to steal the spotlight from the excellent cast of scheduled panelists, who put Officer Nobody's appearance in some context. Both Pearce and Scheurle took plenty of control of their narratives.

That was when I noticed a man in a hat in the front row frown, lean in, and fidget.

Twitch Chat Racism Changed Hearthstone Pro Terrence Miller's Career

'League of Legends' Pro Fined $2,000 for 'Racially Insensitive Language'

League of Legends already has a reputation for having a toxic community, but fortunately that tendency rarely manifests itself in its worst forms in its professional esports matches. Yet some pro players apparently can't help themselves. Take Hankil "Road" Yoon, the Support player for the Chinese team I May.

Video: Fixing toxic online behavior in League of Legends

Courtesy of the GDC Vault, this free GDC 2013 lecture features Riot Games' Jeffrey Lin exploring how to correct toxic online behavior, and how to avoid losing League of Legends players to this bad behavior. Riot gathered a team of specialists and researchers to analyze the problem and implement several experiments designed to improve players' experiences.

Session Name: The Science Behind Shaping Player Behavior in Online Games. Speaker(s): Jeffrey Lin. Company Name(s): Riot Games.

Podcast 4: Toxic Behavior

I think most of us have been there: we join an online multiplayer game and suddenly someone is screaming all kinds of nasty things at us, telling us to die in a fire, or spamming us with some hateful string of letters or another. This sort of toxic behavior is particularly bad in some parts of the gaming scene, and it has always struck me as weird. Why are we so willing to bully, harass, and jeer at people in ways that we would never consider in real life? And perhaps a more interesting question: what can game developers do about it? My guest for this episode of the podcast is Dr. Jeffrey Lin, who heads up the Player Behavior Team at Riot Games. What's cool is that the approaches and methods used by the Player Behavior Team are firmly rooted in some very basic (and some not so basic) theories of human psychology.

To get the podcasts delivered straight to your device of choice, search for "Psychology of Games" or subscribe via the iTunes link (review and rate, please!).