Digital Media Concepts/White Supremacy on the Internet

White supremacy is a racist ideology which holds that white, non-Jewish people are inherently superior to people of all other races and therefore have the right to dominate society, often to the detriment of other races. White supremacy also extends to the political ideology that the white race should maintain its historical dominance in politics and societal institutions. Modern-day white supremacy frequently involves the belief that the white race is threatened by rising numbers of non-white people and that white people should take steps to ensure their survival, privilege, and dominance in society [1].

Further, modern-day white supremacy embraces the idea of reverse racism: the claim that racial inequality for people of color does not exist and that white people are equally victimized by racism. In contrast, human rights activist groups assert that reverse racism does not exist, as the concept ignores the systemic imbalances of power that are inherent to the definition of racism. Activist groups assert that racial prejudice consists of negative racially motivated stereotypes, whereas racism is racial prejudice backed by institutions of power (such as the legal and criminal justice systems, the political system, the financial system, and the educational system) [2]. Racism is also considered to be a system wherein a socially dominant race benefits from the oppression of one or more other races, whether or not its members intend to benefit from such oppression.

The development of the Internet and social media platforms has provided white supremacists with new platforms and methods for spreading racist propaganda and recruiting new members, as it offers a degree of anonymity and physical safety and allows information to spread quickly and reach large numbers of people. White supremacist activity online is a form of cyber racism [3]. Rising numbers of racially motivated mass shootings in recent years have brought increasing attention to the role of the Internet in the spread of white supremacy and white nationalism. In countries with anti-hate-speech laws, governments have struggled to enforce those laws in the online arena.

History

Neo-Nazis attack an LGBT rights pride parade in Rzeszów

White Supremacy-Specific Sites

Since the early 1990s, white supremacists have launched websites to spread racist ideology and misinformation, such as alternate versions of history, and to push the Overton window, the range of topics tolerated in public discussion [4]. Former KKK Grand Wizards David Duke and Don Black were two of the earliest adopters of the Internet as a tool for spreading white supremacist ideology. Don Black created the first prominent white supremacist website, Stormfront, in 1996; the site grew to roughly 300,000 registered users and published a podcast by David Duke. Stormfront functions as an Internet forum for neo-Nazis and focuses on Holocaust denial, antisemitism and Islamophobic topics. In 1998 David Duke wrote on his website that he "believe(s) that the internet will begin a chain reaction of racial enlightenment that will shake the world by the speed of its intellectual conquest". In 1999 Don Black purchased the domain name martinlutherking.org in order to damage the reputation of Martin Luther King Jr., creating one of the first of many white supremacist-owned cloaked websites: propaganda websites whose authorship is concealed in order to hide a political agenda [5].

In 2013 Andrew Anglin launched The Daily Stormer, a message board for neo-Nazis focused on white supremacy, Holocaust denial, and advocacy of genocide against Jewish people. The Daily Stormer's "Troll Army" mobilizes to troll people with whom Anglin disagrees on the Internet. In 2017 the social network Gab was launched, attracting a large percentage of far-right users, including several prominent alt-right figures. The suspect in the October 2018 Pittsburgh synagogue shooting posted his intention to commit a violent act on Gab immediately before the shooting [6].

Transition to Fringe Websites

The advent of social media websites during the 2000s, and the lack of regulation by social media companies, provided a new opportunity for white supremacists to spread racist ideology to increasingly large numbers of people on websites that were still considered fringe but were no longer focused solely on white supremacy. White supremacist material quickly spread through imageboards such as 8chan, 4chan and Endchan, including the manifestos of several mass shooting gunmen.

8chan The gunman responsible for the Christchurch, New Zealand mosque shootings posted an essay on 8chan along with a link to his Facebook livestream of the shooting. Weeks later, the shooter responsible for the Poway, California synagogue shooting posted a manifesto to 8chan justifying mass murder and calling for his message to be spread. The gunman responsible for the 2019 El Paso shooting, who said he was inspired by the Christchurch shooting, also posted his manifesto on 8chan; while the site took the manifesto down, it was later found on several other social media sites [7].

4chan, an imageboard website, has been increasingly used to spread white supremacy since 2015, with white supremacist posts peaking after the Unite the Right rally in Charlottesville in 2017 [8].

Transition to Mainstream Social Media

Mainstream social media sites have become increasingly utilized by white supremacists to spread racist views and recruit new members as traditional avenues such as 8chan have been taken offline. White supremacist material and videos of mass shootings have been found on multiple mainstream social media sites [9], including:

YouTube Several well-known white nationalist figures and groups run YouTube channels, including former Ku Klux Klan grand wizard David Duke.

Reddit The 2019 El Paso shooter's manifesto, which was originally posted on 8chan, quickly spread to Reddit.

Facebook The 2019 El Paso shooter's manifesto spread to Facebook and Reddit after being taken down by 8chan. The Christchurch, New Zealand mosque shootings, which resulted in the deaths of 51 people, were livestreamed by the shooter on Facebook. Facebook responded by taking the video down 45 minutes after it began, but copies of the video were shared to Facebook approximately 1.5 million times over the next 24 hours [10].

Search Engines

Search engines such as Google, Bing and Yahoo link directly to sites affiliated with white supremacy such as Gab, Stormfront and 4chan. The shooter responsible for the 2015 Charleston church shooting, which resulted in the deaths of nine African Americans, stated that he became motivated to commit violence against black people after a Google search on the topic of black-on-white crime led him to white supremacist websites. UCLA associate professor Safiya Umoja Noble's research has found that the autocomplete features of search engines frequently suggest racist material to users, and that search engine algorithms often direct users conducting searches on racial issues to white supremacist websites rather than to factually and historically accurate websites [11].

Notable Websites

Stormfront

The Daily Stormer

Gab Social Network

8chan

4chan

Legality

Map of laws against homophobic hate crime and speech in Europe

International Legality

United Nations Conventions and Treaties

The International Covenant on Civil and Political Rights (ICCPR) is an international treaty and part of the International Bill of Human Rights. It was adopted by the United Nations General Assembly on December 16, 1966. The ICCPR grants member states of the United Nations the right to limit freedom of expression in cases of hate speech under Article 20, which states that "any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence shall be prohibited by law" [12]. However, the ICCPR allows U.N. member states to form their own interpretations of which forms of expression constitute direct "incitement to discrimination, hostility or violence" [13], resulting in differing legal policies on hate speech across member states. Currently, thirty-one countries have laws in place prohibiting hate speech to varying degrees.

The International Convention on the Elimination of All Forms of Racial Discrimination (ICERD) is a United Nations Convention adopted on December 21, 1965. Article 4, paragraph (a), states that U.N. state parties: "Shall declare as an offense punishable by law all dissemination of ideas based on racial superiority or hatred, incitement to racial discrimination, as well as all acts of violence or incitement to such acts against any race or group of persons of another color or ethnic origin, and also the provision of any assistance to racist activities, including the financing thereof" [14].

The Committee on the Elimination of Racial Discrimination monitors implementation of the ICERD and has specifically addressed online hate speech in its Recommendation 29, which recommends that U.N. states "(r) take measures against any dissemination of ideas of caste superiority and inferiority or which attempt to justify violence, hatred or discrimination against descent-based communities; (s) Take strict measures against any incitement to discrimination or violence against the communities, including through the Internet" [15].

Notable International Legislation

Germany passed the Network Enforcement Act (NetzDG) in June 2017 in an effort to compel technology companies to comply with its laws against incitement to hatred. The NetzDG subjects technology companies to fines of up to 50 million euros if they do not take down law-breaking material within 24 hours of being notified [16].

France passed a law in July 2019 similar to Germany's NetzDG, which subjects technology companies to fines of up to 1.25 million euros if they do not remove hate speech from their platforms within 24 hours of being notified [17].

United States Legality

There are no laws in the United States prohibiting hate speech, as the Supreme Court of the United States has ruled that laws criminalizing hate speech would violate the First Amendment of the U.S. Constitution, which guarantees freedom of expression [18]. However, speech which calls for "imminent violence" against a person or group of people is not protected under the First Amendment. There is significant controversy within the United States regarding which forms of speech constitute "imminent violence", as well as further controversy over whether hate speech should be protected under the First Amendment.

Title V of the Telecommunications Act of 1996, otherwise known as the Communications Decency Act (CDA), was the first piece of legislation in the United States to address hate speech and defamatory language on the Internet. Section 230 states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" [19]. This grants technology companies immunity from lawsuits over content published on their platforms, unlike traditional publishers such as newspapers and book publishers. As a result, technology companies have broad leeway in developing and enforcing their own policies regarding hate speech, defamatory language, and false information shared on their platforms.

Controversy

White Supremacy Is Terrorism

A 2015 report by the Pew Research Center indicated that Americans are both the most supportive of free speech and the most tolerant of hate speech in the world. However, rising numbers of racially motivated mass shootings in the United States and around the world between 2015 and 2019 have resulted in increasing controversy over the protection of free speech, censorship of the Internet, and the liability of social media websites for the content they publish. In the 2017 Cato Free Speech and Tolerance survey, 44 percent of Generation Z respondents and 43 percent of millennials stated that they support anti-hate-speech laws [20].

Critics of Section 230 of the United States Communications Decency Act assert that the law should be repealed in order to hold technology companies legally responsible for racist and false information that their users post on their websites, whereas defenders of the law argue that repealing it would make it impossible for tech companies to vet information, which could result in a drastic increase in censorship [21].

Search engines such as Google, Bing and Yahoo have been accused of contributing to the growth of white supremacy and racism by being too hands-off in moderating what material appears in their search results. Google has defended its search algorithm, stating that it limits offensive material for users who do not intentionally seek it out, but that it does not wish to block users from finding all offensive material if they are intentionally searching for it. Critics of search engine practices also argue that search engines receive advertising revenue in exchange for placing paying clients' websites on the first page of their search results; as such, their search algorithms are not impartial and could be modified to ban or limit exposure to materials that incite racial violence [22].

The American Civil Liberties Union (ACLU), historically one of the most vehement defenders of free speech, became divided on the topic after the Virginia ACLU chapter sued the city of Charlottesville to prevent it from relocating a white nationalist rally to a location outside the city center. Two days later a white supremacist attacked the crowd of anti-racist protesters, killing one person and injuring 19 more. Dozens of staff resigned and hundreds more signed a letter to Anthony Romero, the ACLU's national executive director, stating that the ACLU was too rigid in defending white supremacists [23].

See Also

Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble

Southern Poverty Law Center "Change the Terms" Recommendations for Social Media Policies

Citations

  1. "White Supremacy". Anti-Defamation League. Retrieved 2019-10-01.
  2. "Myth of Reverse Racism". Alberta Civil Liberties Research Centre. Retrieved 2019-10-01.
  3. "Cyber racism – Semantic Scholar". www.semanticscholar.org. Retrieved 2019-10-01.
  4. Daniels, Jessie (2018). "The Algorithmic Rise of the “Alt-Right”". Contexts 17 (1): 60–65. doi:10.1177/1536504218766547. ISSN 1536-5042. http://journals.sagepub.com/doi/10.1177/1536504218766547.
  5. Daniels, Jessie (2009-07-21). "Cloaked websites: propaganda, cyber-racism and epistemology in the digital era". New Media & Society 11 (5): 659–683. doi:10.1177/1461444809105345. ISSN 1461-4448. https://journals.sagepub.com/doi/10.1177/1461444809105345. 
  6. Roose, Kevin (2018-10-28). "On Gab, an Extremist-Friendly Site, Pittsburgh Shooting Suspect Aired His Hatred in Full". The New York Times. ISSN 0362-4331. Retrieved 2019-10-01.
  7. Glaser, April (2019-08-04). "8chan Is a Normal Part of Mass Shootings Now". Slate Magazine. Retrieved 2019-10-01.
  8. Thompson, Andrew (2018-05-10). "The Measure of Hate on 4Chan". Rolling Stone. Retrieved 2019-10-01.
  9. Heath, David; Crowe, Kevin. "Sites like Facebook, Google and Twitter allowed white supremacists to flourish. Now what?". USA TODAY. Retrieved 2019-10-01.
  10. Heath, David; Crowe, Kevin. "Sites like Facebook, Google and Twitter allowed white supremacists to flourish. Now what?". USA TODAY. Retrieved 2019-10-01.
  11. McWilliams, James. "Dylann Roof's Fateful Google Search". Pacific Standard. Retrieved 2019-10-01.
  12. "OHCHR | International Covenant on Civil and Political Rights". www.ohchr.org. Retrieved 2019-10-01.
  13. https://www.religlaw.org/docs/16%20Jan%202009.pdf
  14. "OHCHR | International Convention on the Elimination of All Forms of Racial Discrimination". www.ohchr.org. Retrieved 2019-10-01.
  15. "University of Minnesota Human Rights Library". hrlibrary.umn.edu. Retrieved 2019-10-01.
  16. Eddy, Melissa; Scott, Mark (2017-06-30). "Delete Hate Speech or Pay Up, Germany Tells Social Media Companies". The New York Times. ISSN 0362-4331. Retrieved 2019-10-01.
  17. France-Presse, Agence (2019-07-09). "France online hate speech law to force social media sites to act quickly". The Guardian. ISSN 0261-3077. Retrieved 2019-10-01.
  18. Volokh, Eugene (2017-06-19). "Opinion | Supreme Court unanimously reaffirms: There is no 'hate speech' exception to the First Amendment". Washington Post. ISSN 0190-8286. Retrieved 2019-10-01.
  19. "Telecommunications Act of 1996". Federal Communications Commission. 2013-06-20. Retrieved 2019-10-01.
  20. "Skeptics I: Free Speech Attitudes Changing". Heterodox Academy. 2018-03-19. Retrieved 2019-10-01.
  21. Grant, Melissa Gira (2019-08-09). "No Law Can Ban White Supremacy From the Internet". The New Republic. ISSN 0028-6583. Retrieved 2019-10-01.
  22. McWilliams, James. "Dylann Roof's Fateful Google Search". Pacific Standard. Retrieved 2019-10-01.
  23. Blasdel, Alex (2018-05-31). "How the resurgence of white supremacy in the US sparked a war over free speech". The Guardian. ISSN 0261-3077. Retrieved 2019-10-01.