Over the past few months, we have seen how social media sites have become safe havens for far-right opinions. The New York Times explored how YouTube pushed users toward radical content via its recommendation algorithm, and ProPublica detailed a Facebook group where ICE employees shared racist memes about immigrants. Reddit has also drawn controversy for hosting extremist right-wing content. One social media site that has come out of this debate relatively unscathed is Twitter. Much of this is due to the distributed nature of the social network: users are not clustered into easily identifiable forums or subreddits; instead, each account exists as a stand-alone entity. While prominent users with many followers have been banned (such as Alex Jones), it is harder to identify networks of individuals promoting extremist content. To my knowledge, this is the first large-scale analysis of far-right extremism on Twitter.
To investigate this issue, I analyzed 297,849 Twitter accounts stemming from two far-right extremist users. My findings identified a network of approximately 19,000 users with high proximity to extremist content. Prominent right-wing figures, such as Charlie Kirk and Ryan Fournier, were found to be just as influential in this network as openly racist accounts.
The top ten most influential accounts within the group of users closest to extremist content can be categorized as unabashedly white nationalist and alt-right troll accounts. These users openly share racist memes and promote white supremacism. Four of the most notable are @NoWhiteGuilt, @Outsider__14, @esc_press, and @Urbanus_Crusade.
In the tweet below, @Urbanus_Crusade lists their “MAGA Agenda” as including “European Homogeneous Population”:
@esc_press shares vintage photos of carefree white individuals, often with the caption “press [esc] to go back”, suggesting idyllic times that have been lost. In a now-deleted tweet, they revealed their racist agenda by sharing this image:
@Outsider__14 primarily retweets other users’ content, acting as a hub for distributing extremist material. Here they retweet a user praising the white nationalist Unite the Right protests:
Together, these four white nationalist users broadcast this type of content to a combined 21,607 followers. Altogether (excluding the removed account), the top ten accounts have a total of 354,602 followers.
Surprisingly, the network contained a large number of Turkish users. The removed user in the top-ten table was a popular Turkish journalist who doesn’t appear to have much to do with right-wing extremism. This unexpected finding suggests that a sizable number of Turkish individuals are interested in right-wing content; some of them may even be behind these accounts, but at this point I can only speculate.
Once I had an understanding of the type of content in this network, I began to look at the individuals with the most followers. This revealed a number of prominent right-wing personalities:
We find Ryan Fournier, head of Students for Trump, ranked 16th by PageRank. To put this in perspective, within this network Ryan Fournier is just as influential as someone who tweeted this:
Chuck Callesto, former Congressional candidate, comes in at 46, and Charlie Kirk, head of Turning Point USA, at 153. Charlie Kirk is immediately followed by @PayYourG0yTax (now suspended), giving you a sense of the company these personalities keep.
Starting in May 2019, I began collecting data on extremist right-wing Twitter accounts. To start, I selected two accounts: @NoWhiteGuiltNWG and @Goy_Talk_USA. Both of these users run YouTube channels where they make videos promoting white nationalist theories and are also very active on Twitter. Below is a tweet from @Goy_Talk_USA to give you an idea of what they promote:
I collected data on the followers of these two accounts, then on each of their followers’ followers, and so on, until I had built a network of 297,849 users. For each user, I calculated their depth: the number of follower hops separating them from either of the two original extremist accounts. This is illustrated in the diagram below:
I categorized users in the network into three groups based on their depth. Users 0–2 hops away from the original two extremist accounts were labeled Closest, users 3 hops away were Medium, and users more than 3 hops away were Furthest. Closest can be interpreted as the users nearest to extremist content, with Medium and Furthest being removed by several layers of users. In total, I found 19,825 users in the Closest group.
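The depth calculation and grouping described above can be sketched as a breadth-first search from the two seed accounts. This is a minimal illustration, not my actual collection code; the handles in the toy graph are made up, and the real data came from Twitter’s follower API:

```python
from collections import deque

def depths_from_seeds(graph, seeds):
    """Breadth-first search from the seed accounts: a user's depth is
    the minimum number of follower hops back to either seed."""
    depth = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        user = queue.popleft()
        for follower in graph.get(user, []):
            if follower not in depth:
                depth[follower] = depth[user] + 1
                queue.append(follower)
    return depth

def bucket(d):
    """Assign a user to one of the three bands by depth."""
    if d <= 2:
        return "Closest"
    if d == 3:
        return "Medium"
    return "Furthest"

# Toy follower graph: account -> its followers (handles are invented).
graph = {
    "seed_a": ["u1", "u2"],
    "seed_b": ["u2", "u3"],
    "u1": ["u4"],
    "u4": ["u5"],
    "u5": ["u6"],
}

depths = depths_from_seeds(graph, ["seed_a", "seed_b"])
groups = {user: bucket(d) for user, d in depths.items()}
```

Here u4 sits two hops from a seed and lands in Closest, while u5 and u6 fall into Medium and Furthest respectively.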
This network is visualized in the image below. The core of the network is the green Closest users. The larger and more moderate red Medium users spread out from them. Finally, we see the teal Furthest users either isolated by themselves or at the end of tendrils of the network.
The next step was to calculate the connectedness of each user in the network using an algorithm called PageRank. PageRank assigns each user a score based on how many other accounts in the network follow them, weighted by the influence of those followers. Users followed by many influential accounts receive high scores, while sparsely followed users receive low scores. It can be interpreted as a measure of how influential someone is in the network.
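To make the idea concrete, here is a minimal power-iteration PageRank in plain Python. It is a sketch of the algorithm, not the code used for the analysis (which would run over the real follow graph, likely via a library); the follow edges below are hypothetical:

```python
def pagerank(edges, damping=0.85, iters=50):
    """Minimal power-iteration PageRank over a follow graph.
    edges: (follower, followed) pairs; a follow acts like a link
    conferring influence on the followed account."""
    nodes = {n for e in edges for n in e}
    out = {n: [] for n in nodes}
    for src, dst in edges:
        out[src].append(dst)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iters):
        # Teleportation term keeps the chain well-behaved.
        new = {node: (1 - damping) / n for node in nodes}
        for node, targets in out.items():
            if targets:
                share = damping * rank[node] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # Dangling node: spread its rank uniformly.
                for t in nodes:
                    new[t] += damping * rank[node] / n
        rank = new
    return rank

# Hypothetical follow edges: three accounts follow "hub".
edges = [("a", "hub"), ("b", "hub"), ("c", "hub"), ("hub", "a")]
ranks = pagerank(edges)
```

In this toy graph, "hub" receives follows from three accounts and ends up with the highest score, which matches the intuition that heavily followed accounts in the network are the most influential.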
These findings point to a large group of users promoting extremist right-wing content on Twitter. Other social media platforms have been held accountable for the extremist activity on their platforms and have taken action to curb this type of content. However, Twitter allows these networks to continue relatively unchecked. All of these accounts are public, meaning anyone is able to view and consume extremist rhetoric with a few clicks. The continued display of this content helps to normalize hate speech and radicalize individuals. My hope is that this research will shed light on the size and scope of extremism that exists on Twitter and contribute to the ongoing discussion of extremism on social media platforms.
This project was split into two parts: collecting the data from Twitter and then analyzing it. Below are GitHub repositories for the two steps.