
Online Hate Speech


Facebook and YouTube self-regulation fails in response to racially offensive posts

Sometime on the weekend of September 22, a Facebook page was born: “Let’s End the Māori Race”.

It didn’t get much media coverage, but the message of racist incitement sparked complaints – to the Human Rights Commission, to Māori organisations, but mostly to Facebook itself via “Report story” buttons. At first nothing happened. The page remained. Worse, it sparked Facebook responses, like the “Let’s End the ‘Let’s End the Māori Race’” page, a page almost as full of racially motivated abuse and hatred as the original. People reported that page too. Then, quietly, sometime in the afternoon of Tuesday September 25, the original page disappeared. A victory for self-regulation of social media in New Zealand? Not necessarily.

… websites, in direct contrast with many other nations’ approach to hate speech.

Banks, J. (2010) 'Regulating Hate Speech Online', International Review of Law, Computers and Technology, 24, 3: 233-239.

The case of Yahoo! demonstrates the drift towards nation states imposing geographical demarcations onto the virtual world and, more pertinently, highlights the difficulties inherent in European countries seeking to extend their jurisdiction extraterritorially, enforcing their content laws against material uploaded beyond national boundaries, as in the landmark case of Yahoo! Inc. v. La Ligue Contre Le Racisme et L’Antisemitisme.

France's censorship demands to Twitter are more dangerous than 'hate speech'

(updated below) Writing in the Guardian today, Jason Farago praises France's women's rights minister, Najat Vallaud-Belkacem, for demanding that Twitter help the French government criminalize ideas it dislikes.

Decreeing that "hateful tweets are illegal", Farago excitedly explains how the French minister is going beyond mere prosecution of those who post such tweets and now "wants Twitter to take steps to help prosecute hate speech" by "reform[ing] the whole system by which Twitter operates", including her demand that the company "put in place alerts and security measures" to prevent tweets which French officials deem hateful. This, Farago argues, is fantastic, because, using the same argument employed by censors and tyrants of every age and every culture, new technology makes free speech far too dangerous to permit: "If only this were still the 18th century!"

That's what always astounds and bothers me most about censorship advocates: their unbelievable hubris.

WORLD CONFERENCE AGAINST RACISM, RACIAL DISCRIMINATION, XENOPHOBIA AND RELATED INTOLERANCE

HARMFUL DIGITAL COMMUNICATIONS

Inquiry into Hate Speech

Twitter's Speech Problem: Hashtags and Hate

On October 19, 2012, Twitter turned censor.

In response to complaints from the Union of French Jewish Students, Twitter pulled tweets that used the hashtag #UnBonJuif, or “a good Jew,” which had been worked into slurs and jokes, some using concentration-camp photographs as illustrations. The student union asked Twitter to reveal information that could be used to identify the offending tweeters. Twitter refused; the social-media company prides itself on its protection of free speech, and argued that taking down the individual tweets was enough. The student group took the case to court. This week, Twitter lost. If Twitter does not hand over the information within two weeks, it will face a fine of a thousand euros (about thirteen hundred dollars) per day—not much for a company recently valued at over eleven billion dollars.

But far bigger than the financial question is that of the First Amendment. Illustration by Jordan Awan.

Charlotte Dawson fights back against trolls

Don't feed the trolls

By Rebecca MacKinnon and Ethan Zuckerman

An anti-Muslim video demonstrated how the politics of fear dominate the online environment.

It’s time we took action, argue Rebecca MacKinnon and Ethan Zuckerman.

In September 2012, the trailer for the film The Innocence of Muslims shot to infamy after spending the summer as a mercifully obscure video in one of YouTube’s more putrid backwaters. Protests against the Innocence of Muslims film took place around the world. Since then, there has been much handwringing amongst American intellectual, journalistic, and political elites over whether the US Constitution’s First Amendment protections of freedom of expression should protect this sort of incendiary speech, or whether Google, YouTube’s parent company, acted irresponsibly and endangered national security by failing to remove or restrict the video before provocateurs across the Islamic world could use it as an excuse to riot and even kill.

From obscurity to widespread outcry