Growth of terrorist groups follows mathematical pattern
Two distinct extremist groups, ISIS and the Boogaloo movement, would seem at first glance to share little in common other than a willingness to commit violence. Yet these groups emerge and grow online following a similar mathematical pattern, according to a new paper from researchers at George Washington University.
The paper proposes a “shockwave equation” that can be applied to a wide range of online groups to predict the point at which they experience sudden growth. The groups include ISIS, which comprises Islamic jihadists, and the Boogaloo movement, a loose collective of right-wing extremists advocating for a new civil war.
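The article does not reproduce the shockwave equation itself. As a rough illustration of the general idea only, the sketch below fits a simple logistic growth curve to a group’s cumulative membership counts and reads off the inflection point, where growth is fastest; the membership data, the logistic form, and the parameter names are assumptions for illustration, not the paper’s actual model.

```python
# Illustrative sketch only: fit a simple logistic growth curve to hypothetical
# cumulative membership counts for one online group, and estimate the time of
# fastest growth (the inflection point). This is NOT the paper's shockwave
# equation; the data and functional form are assumed purely for illustration.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic curve: carrying capacity K, growth rate r, inflection time t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical daily cumulative membership counts (slow start, rapid rise, plateau).
days = np.arange(60)
members = np.array(
    [5 + 0.4 * d**2 if d < 40 else 650 + 2 * (d - 40) for d in days],
    dtype=float,
)

# Fit the curve; p0 is a rough initial guess for (K, r, t0).
params, _ = curve_fit(logistic, days, members, p0=[members.max(), 0.2, 30.0])
K, r, t0 = params
print(f"Estimated onset of explosive growth: around day {t0:.1f}")
```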
“You might think that because of their very different ideologies etc., and the fact that ISIS support was very focused while Boogaloos are diverse, the two movements, ISIS and Boogaloos, would behave very differently. But what we found is that, in fact, they follow the same mathematical blueprint in terms of their growth patterns,” Neil Johnson, a physics professor at George Washington University, told Defense One.
Many extremist groups have benefited from the presence of a specific, charismatic leader. But Johnson and his colleagues’ research shows that growth depends even more on the interpersonal online dynamics of the core members and how they interact with new recruits, a factor he refers to as “collective chemistry.”
To measure the collective chemistry of the groups, the researchers examined ISIS recruitment data from the Russian social network VKontakte during the group’s key growth phase in 2014 and 2015, and collected Facebook data on Boogaloo groups during 2020.
One of the key qualities of both groups’ collective chemistry is the willingness of new group members to contribute their own content and respond to one another, as opposed to passively consuming content from one leader. In the case of Boogaloo groups, this resulted in an “eclectic mix of memes and ideas.” That, in part, gives the group an authentic, bottom-up feeling, a key factor in propelling it to explosive growth.
“Collective chemistry” is a way to understand how different group members interact and whether those interactions are complementary, like a bag of Lego toys, Johnson said. “Someone in the Boogaloos may be far-right and want to break up federal and state power toward tribes (Aryan tribes). Someone else may be completely off the spectrum, neither right nor left. Each is therefore like a piece of Lego with their own characteristics,” he said.
Having lots of different Lego types that work together lets you know what you can build. The same goes for the members of online groups. The researchers measured “the heterogeneity of each bag, based on the postings and comments that it contains, like peering into a bag of Lego. So without having to know everything about each individual piece, we can characterize the heterogeneity of each group,” Johnson said.
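The article does not specify how heterogeneity is quantified. One plausible, simplified illustration is to compute the Shannon entropy of topic labels attached to a group’s posts, as in the sketch below; the topic labels and the two example groups are hypothetical, and this is not presented as the researchers’ actual measure.

```python
# Illustrative sketch only: quantify how "heterogeneous" a group's content is via
# the Shannon entropy of topic labels assigned to its posts. The labels and the
# two example groups are hypothetical, not data from the study.
from collections import Counter
from math import log2

def topic_entropy(post_topics):
    """Shannon entropy (in bits) of the topic distribution across a group's posts."""
    counts = Counter(post_topics)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A group dominated by one narrative vs. a group posting an eclectic mix.
focused_group = ["recruitment"] * 8 + ["propaganda"] * 2
eclectic_group = ["memes", "gun rights", "anti-lockdown", "memes", "civil war",
                  "conspiracy", "gear", "memes", "anti-police", "humor"]

print(f"Focused group entropy:  {topic_entropy(focused_group):.2f} bits")
print(f"Eclectic group entropy: {topic_entropy(eclectic_group):.2f} bits")
```

Higher entropy corresponds to the more “eclectic mix of memes and ideas” described above, while a group pushing a single narrative scores low.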
Collective chemistry also plays a role in how quickly the group can react to efforts to shut it down.
“Online extremist groups can show remarkably quick growth and adaptation, particularly those focused around fresh narratives…and react quickly when they realize their content is being moderated,” Johnson and his colleagues write.
The analysis could be useful for social networking sites looking to curb online hate groups before they become too big, as well as for law enforcement or intelligence professionals looking to spot such groups when they first emerge. The “typical agency/law enforcement attempts to find the bad actor/apple etc. around which the movement forms, are misguided. Just like there is typically no single ‘bad driver’ that causes a traffic jam. Instead, it is a collective phenomenon. This explains why, when looking for the ‘leader’ in these types of extremism uprisings, there never seems to be one,” Johnson said.
Social media companies in particular could play a proactive role in limiting the size of these groups early, and not just by shutting them down.
“While sweeping shutdowns of online groups are sometimes called for, this tactic has the disadvantage of being highly visible (and thus sometimes provoking and energizing extremists), and also can be circumvented when individuals move to unmoderated platforms,” they wrote.
Instead, they propose that social media companies “nudge” behavior within the groups, by adjusting what content is shown to potential group members and how recommendation algorithms suggest new groups based on the groups a user has already joined.
“One example is by injecting extremists’ online spaces (e.g., Facebook page) with topically diverse material, such as by posting ads and banners that present content about which members of the group are likely to disagree,” they wrote.
Source: Defense One