Your Stance against Child Sex Trafficking Is Threatening the National Security of the United States

By 1st Lt Robert Stelmack, USAF, and 1st Lt Sarah Soffer, USAF


The headline of this article is intentionally provocative and jarring. Odds are, you clicked on it because you have either seen an uptick in social media posts taking a stance against child sex trafficking or have yourself posted, reposted, or shared such media. That, or you found the title incredibly far-fetched. We are here to tell you that the title is not far-fetched, and no, you are not crazy if you have noticed this strange new trend. We ask that you, the reader, read this article in its totality and approach it with an open mind. We do so with the purpose of exposing an online narrative and striking at its heart. We will hold this narrative up as a case study in why "narrative warfare" is a serious problem. Narrative warfare is defined as operations designed to determine the meaning of information, according to Dr. Ajit Maan, a specialist in narrative warfare and national security studies.1 And do not worry: this article will not attempt to minimize, apologize for, or justify the worldwide problem of child sex trafficking. In reality, the very fact that child sex trafficking is a real problem is the problem, because legitimate outrage is exactly what makes the narrative exploitable.

Here you are—a sensible person scrolling through your Instagram or Facebook feed when you come upon a post that states something to the effect of “Here are the number of children who are trafficked every year in the United States—share this to get more visibility on the problem and help us save our kids!” You—being a good-natured, common-sense individual—may just go ahead and share that. You share the post, your friends see it, and you move on having done your good deed for the day. However, there is actually a chance you did the opposite.

What you may not have noticed was that the post you shared originated from a Facebook group called "Diͥgiͥtͭaͣl Deͤfeͤnseͤ Diͥvͮiͥsiͥoͦn." A quick look at this group reveals that it is littered with posts pushing narratives well beyond the genuine worldwide problem of child trafficking: that the mainstream media is trying to silence doctors who believe masks are ineffective; that Bill Gates is developing a COVID-19 vaccine that will implant microchips into your body; and that the LGBTQ+ movement is trying to legitimize pedophilia. All of these narratives are demonstrably false, and all can directly harm the US population. It is already well established, for instance, that noncompliance with mask wearing correlates directly with upticks in COVID-19 infections and deaths.

So the question becomes—what is going on here? Is there a large overlap between people who oppose child sex trafficking and those who are vehemently antivaccination? Can one support one issue without supporting the other? Who is linking these movements together? And perhaps most importantly, how can you, the reader, take a morally appropriate stance against a clear-cut issue such as child sex trafficking while not enabling dangerous narratives that threaten the well-being of the United States?

What we have described above is a case study in what those working in this area call "narrative warfare" and in just how difficult it can be to combat. While the practice of creating, influencing, or spreading narratives with specific objectives in mind is not new, interest in it has grown substantially with the rise of social media and online communication. One of the best-known examples is "Pizzagate," a debunked conspiracy theory amplified by Russian influence operations as part of their effort to sway the 2016 US presidential election.

Past and Current Russian Influence Operations

In 2016, the phrase "Pizzagate" entered the American vernacular to describe a now debunked conspiracy theory about a child-trafficking ring being run by Hillary Clinton and her aides out of a downtown Washington pizzeria.2 Pizzagate may not have been created by Russian trolls and bots, but it is one of many narratives Russia spread to sow discord leading up to and during the 2016 presidential election. More than three million tweets published by the Kremlin-backed Internet Research Agency (IRA) were exposed and categorized. These tweets, available for viewing online, demonstrate that the IRA pushed conspiracy theories including not just Pizzagate but also chemtrails, President Barack Obama's birthplace, and other topics.3 Russian trolls sought to create an abundance of misinformation online to blur the line between fake and real, widen the divide between political groups in the US, and disrupt bipartisan governing efforts. According to BuzzFeed, "pushing conspiracy theories toward the mainstream appears to have been part of that effort, even long after the 2016 election."4
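For readers who want to examine these categorized tweets directly, the published archive can be queried with a few lines of code. The Python sketch below is a minimal illustration, assuming one CSV file from the fivethirtyeight/russian-troll-tweets GitHub repository has been downloaded locally; the file and column names follow that repository's documented schema and should be verified against its README.

```python
# Minimal sketch: filter the published IRA tweet archive for the
# conspiracy-theory keywords discussed above. Assumes IRAhandle_tweets_1.csv
# from the fivethirtyeight/russian-troll-tweets repo is in the working
# directory; verify file and column names against the repo's README.
import pandas as pd

tweets = pd.read_csv("IRAhandle_tweets_1.csv")

# Keywords tied to the narratives named in this article.
keywords = ["pizzagate", "chemtrail", "birth certificate"]
pattern = "|".join(keywords)

conspiracy_hits = tweets[
    (tweets["language"] == "English")
    & tweets["content"].str.contains(pattern, case=False, na=False)
]

print(conspiracy_hits[["author", "publish_date", "account_category", "content"]].head())
```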

Now that the US is in another presidential election year, it is fair to assume Russia is using techniques similar to those it used four years ago. For reference, during the 2016 election alone, Russian trolls used platforms such as Facebook, Twitter, and Instagram to push divisive narratives, producing more than 61,500 Facebook posts, 116,000 Instagram posts, and 10.4 million tweets.5

Most recently, COVID-19 has become a highly polarized topic in the US. Opinions vary on the seriousness of the disease, its origins, how to protect vulnerable populations, and how and when to resume normal activity. Russia has traditionally sought to widen the gaps between groups in Western countries, and the global pandemic has provided another venue for the IRA and other bad actors to do so. Russia has already been caught attempting to generate panic and sow distrust by pushing disinformation campaigns surrounding the origin and spread of COVID-19.6 US officials have stated that this disinformation campaign is part of Russia's effort to advance false narratives and cause confusion.7 For example, pushing narratives that encourage Americans to forgo wearing masks and not take the pandemic seriously has amplified distrust of government among the population; caused confusion and discord; and ultimately increased the negative effects of the pandemic, to include widespread unemployment, the closing of businesses, and deaths due to COVID-19.

Now that we have established both a modus operandi and a motive for Russian influence operations, the question becomes, what does this have to do with child trafficking? In case it has not already become evident, a child trafficking narrative is being used just like the aforementioned narratives and conspiracy theories to influence American audiences. By itself, the narrative provides several opportunities: to downplay the seriousness of a pandemic, to act as a counternarrative to the Black Lives Matter movement, and to encourage opposition to mask wearing. It plays on parents' fears of something unspeakable happening to their children, touting statistics like "a child in America is 66,667x more likely to be sold to human traffickers than to die of covid" (fig. 1). Child trafficking and abuse are a real crisis, but the crisis is being used to advocate for conspiracy theories and whip up a frenzy of outrage. The social media narratives surrounding trafficking, with hashtags like #savethechildren and #childlivesmatter, paint a picture of a world where strangers abduct your children and then get away with it because either they are one face in a masked crowd, or your children are. In reality, most children reported missing are returned safely, and of the more than 400,000 reported missing each year, only a tiny fraction (an estimated 115 children) are abducted by strangers.8

Figure 1. Meme relating to the child trafficking narrative from the Instagram account @mother_and_patriot_ (accessed July 2020).
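It is worth pausing on how a figure like "66,667x" can be manufactured in the first place. The arithmetic sketch below is a purely hypothetical reconstruction, not the meme's actual sourcing: inflating the numerator by counting every missing-child report as a trafficking case while shrinking the denominator to a handful of deaths produces an enormous but meaningless ratio.

```python
# Hypothetical reconstruction of how a "66,667x" ratio could be manufactured.
# These inputs are illustrative only and do not reflect actual trafficking risk.
missing_child_reports = 400_000   # annual reports; most children return home safely
small_denominator = 6             # a deliberately tiny comparison figure

print(missing_child_reports / small_denominator)  # 66666.67 -> "66,667x"

# Contrast with the article's cited estimate of stranger abductions:
stranger_abductions = 115
print(f"{stranger_abductions / missing_child_reports:.3%}")  # about 0.029%
```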

One specific example of a conspiracy within the child-trafficking narrative is the Wayfair conspiracy. Pushed by the far-right conspiracy community QAnon, it suggests that Wayfair is trafficking missing children, based on the "evidence" of high-priced storage cabinets sold by the online retailer. Because the cabinets were listed under girls' names, theorists claimed the listings were fronts for trafficking missing children who shared those names.9 The Wayfair conspiracy, with its obvious parallels to Pizzagate, went viral in early July and was debunked shortly after.10 Interestingly enough, suspicious social media accounts such as @little.miss.patriot on Instagram appeared around the same time, amassing more than 260,000 followers within a month. Accounts like @little.miss.patriot push narratives about Pizzagate, the Wayfair conspiracy, and antivaccination myths11 and frequently use hashtags like #childlivesmatter (see fig. 2). It is worth asking: Why do these accounts exist? Who created them? And who benefits from pushing people to focus on the emotionally charged topic of child trafficking instead of the pandemic?

Figure 2. Meme relating to the child lives matter narrative from the Instagram account @little.miss.patriot (accessed July 2020).

The origins of the accounts pushing these narratives may be unknown, but it is extremely likely that Russia would choose to amplify them to achieve its aims. In this context, the social media fight against trafficking involves encouraging people to focus on this issue, often at the expense of the pandemic, and to forgo masks, further polarizing groups. This dynamic creates more chaos among the American people during an election year and can divert attention from other important matters. Using the tragedy of child trafficking to divide people is entirely in line with Russia's previous influence operations, which pushed disinformation and sowed discord among the American population.

Lessons to Take Away

What stands out about the child trafficking narrative versus other narratives amplified by foreign influence operations is how it is being circulated. Previous influence operations, such as Pizzagate, relied on the spread of a conspiracy theory among individuals who believed it. While posts related to Pizzagate ranged from those that directly addressed the topic to less polarizing posts that merely hinted at its themes, the narrative spread primarily through individuals who believed in the conspiracy and shared it within their own social networks. In this manner, Pizzagate spread within enclaves of individuals who were most likely to believe it. Eventually, of course, these enclaves became saturated with believers, and few new believers remained to buy into and spread the information. Further, any chance Pizzagate had to extend outside these enclaves could be immediately countered by individuals who would recognize a post as conspiratorial and quickly debunk it, decreasing its efficacy.12

What sets the child trafficking narrative apart from Pizzagate is its primary mechanism of spread. Whereas Pizzagate was an easily recognizable conspiracy spread mostly among its believers, the child trafficking narrative has benefited greatly from being promulgated under the guise of good information. For example, memes on Facebook containing facts about child trafficking will pass most individuals' "smell tests" and may be shared by users who otherwise would not engage in spreading conspiracy theories. This method of spread presents three significant problems to those who would oppose bad-actor influence operations: a legitimate narrative spreads more easily among most users; the accounts from which posts are shared often link back to further conspiracy theories without the sharer realizing it; and the narrative itself is legitimate, making it harder to connect to bad-actor influence operations and discredit.

First, the narrative of child trafficking is legitimate. Thus, many users who would not otherwise share the underlying conspiracy theories may still share posts carrying the legitimate narrative on top. This, in turn, strengthens the narrative and the related subversive conspiracy theories, making them more difficult to contain. Second, many of the legitimate or near-legitimate social media posts about child trafficking are shared by pages that host conspiracy theory content of their own, such as the earlier-mentioned "Diͥgiͥtͭaͣl Deͤfeͤnseͤ Diͥvͮiͥsiͥoͦn." This tactic hides the conspiracy content behind a nonstandard-looking Facebook page. Much like the email phishing techniques employed by cybercriminals and advanced persistent threats, pages like this appear designed to attract only the population most likely to consume and believe the information within, in this case conspiracy theories concerning child trafficking and their relationship to other nefarious narratives, while discouraging scrutiny from everyone else.13 In combination, these deceptions allow bad-actor influence operations to solve the problem of conspiracy theories being confined to specific enclaves, unable to reach further target audiences.
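As a concrete illustration of that "nonstandard-looking" page name: it is built from Unicode combining characters stacked above ordinary letters, readable to a human but invisible to a naive keyword search. The Python sketch below, a minimal illustration rather than any platform's actual moderation tooling, shows how decomposing the string and stripping combining marks recovers the plain name.

```python
# Minimal sketch: strip Unicode combining marks to recover the plain text
# hidden inside an obfuscated page name. Not any platform's real tooling.
import unicodedata

obfuscated = "Diͥgiͥtͭaͣl Deͤfeͤnseͤ Diͥvͮiͥsiͥoͦn"

def strip_combining(text: str) -> str:
    # Decompose each character, then drop combining marks (the tiny
    # letters stacked above the base characters).
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(strip_combining(obfuscated))  # Digital Defense Division
```

A naive search for the plain name fails against the obfuscated form; normalizing the text first defeats the trick.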

The third problem in using a legitimate narrative as cover for unscrupulous conspiracy theories is the difficulty it presents in being countered. While it is comparatively easy to debunk a conspiracy such as Pizzagate, it is not easy to stop the spread of a legitimate narrative, even when that narrative is being used to disguise more subversive ones. It is obvious to many that Pizzagate is a conspiracy theory; it is not immediately apparent that spreading legitimate posts on child trafficking at best opens an opportunity for conspiracy theories to take root under that narrative and at worst involves sharing a post directly from a source peddling destructive conspiracy theories.

Finally, the child trafficking narrative is difficult to counter because it plays on people's fears, pushing them to think emotionally rather than logically. Much like the annual resurgence of fears about razor blades in children's Halloween candy, the narrative evokes feelings of fear and powerlessness in parents regarding their children. Additionally, any argument that the narrative is being used maliciously is liable to be read by its proponents as a claim that child trafficking is not a problem.

Ultimately, we seek to provide a bottom-up recommendation to those interested in combatting this influence operation and others like it, as opposed to a top-down solution that might come from some form of executive or congressional action. Because the overarching narrative being spread by social media users is a legitimate one, combatting it directly is too costly and difficult. Therefore, the most effective way to counter it is not to attempt to explain to social media users how the narrative cloaks other, more destructive conspiracy theories but to teach them good "social media hygiene" regarding the posts they share. In short, the best way to combat the spread of conspiracy theories is to educate users to review post sources and to opt not to share posts whose sources do not seem entirely legitimate. Because it is both difficult, and possibly even unethical, to counter the spread of the overarching legitimate narrative, the next best option is to target not the narrative itself but its conspiratorial sources. This approach allows legitimate information to circulate while combatting the spread of subversive conspiracy theories through the same social connections.
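To make that recommendation concrete, the sketch below encodes a few of the red flags described in this article as simple source checks a user might run through before resharing. It is an illustrative Python sketch only; the fields, thresholds, and hashtag list are hypothetical assumptions, not a vetted detection model.

```python
# Illustrative "social media hygiene" checks to apply before resharing a post.
# Fields, thresholds, and hashtags are hypothetical, not a vetted model.
from dataclasses import dataclass, field

@dataclass
class SourceProfile:
    account_age_days: int
    followers: int
    hashtags: set = field(default_factory=set)

KNOWN_CONSPIRACY_TAGS = {"#pizzagate", "#wayfairgate", "#plandemic"}

def red_flags(profile: SourceProfile) -> list:
    flags = []
    # Pattern described above: a brand-new account amassing roughly
    # 260,000 followers within a month of appearing.
    if profile.account_age_days < 60 and profile.followers > 100_000:
        flags.append("new account with explosive follower growth")
    if profile.hashtags & KNOWN_CONSPIRACY_TAGS:
        flags.append("mixes legitimate posts with conspiracy hashtags")
    return flags

suspect = SourceProfile(account_age_days=30, followers=260_000,
                        hashtags={"#savethechildren", "#wayfairgate"})
print(red_flags(suspect))  # both flags fire; think before sharing
```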

It is evident that a bad actor, most likely the Russian IRA, has either created or is amplifying the spread of a child trafficking narrative on American social media. Clearly, the spread of this narrative has provided an opportunity to nest seemingly connected conspiracy theories within it, influencing readers to engage in divisive and destructive behavior. Such operations would fall in line with the Russian influence operations conducted during the 2016 presidential election. Ultimately, the best way to combat this new operation from a bottom-up perspective is to educate social media users on vetting both the information they spread and its source. While we may or may not see a decrease in posts about the overall narrative, we should see a reduction in the sharing of posts whose sources carry conspiratorial information and, in turn, a decline in the number of users who believe or engage in the spread of destructive narratives.

Disclaimer

Opinions, conclusions, and recommendations expressed or implied within are solely those of the authors and do not necessarily represent the views of the Air University, the United States Air Force, the Department of Defense, or any other US government agency. Cleared for public release: distribution unlimited.

1st Lt Robert Stelmack, USAF

Lieutenant Stelmack (BS, United States Air Force Academy) primarily performed research on information and hybrid warfare during his work at the academy. He is currently an information operations officer serving as the chief of the USAF Information Operations Reachback Team at Sixteenth Air Force.

1st Lt Sarah Soffer, USAF

Lieutenant Soffer (MS, Missouri State University; MS, Purdue University) researches the role of social media in influence activities and organizational support to veterans and military members. She is currently a USAF information operations officer serving as the officer in charge, Military Information Support Operations, Air Forces Southern Command.

Notes

1 Dr. Ajit K. Maan and Paul L. Cobaugh, Narrative Warfare, a Primer and Study Guide (Scotts Valley, CA: CreateSpace Independent Publishing Platform, 2018), https://medium.com/.

2 Gregor Aisch, Jon Huang, and Cecilia Kang, “Dissecting the #PizzaGate Conspiracy Theories,” New York Times, 10 December 2016, https://www.nytimes.com/.

3 GitHub, fivethirtyeight/russian-troll-tweets, 27 August 2018, https://github.com/.

4 Salvador Hernandez, “Russian Trolls Spread Baseless Conspiracy Theories Like Pizzagate and QAnon after the Election,” BuzzFeed News, 15 August 2018, https://www.buzzfeednews.com/.

5 US Senate, Report of the Select Committee on Intelligence United States Senate on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election, accessed 8 September 2020, https://www.intelligence.senate.gov/.

6 Robin Emmott, “Russia Deploying Coronavirus Disinformation to Sow Panic in West, EU Document Says,” Reuters, 18 March 2020, https://www.reuters.com/.

7 Eric Tucker, “US Officials: Russia behind Spread of Virus Disinformation,” Associated Press, 28 July 2020, https://apnews.com/.

8 Barbara Goldberg, “Missing Children in U.S. Nearly Always Make It Home Alive,” Reuters, 26 April 2012, https://www.reuters.com/.

9 Marianna Spring, “Wayfair: The False Conspiracy about a Furniture Firm and Child Trafficking,” BBC News, 15 July 2020, https://www.bbc.com/.

10 Reuters, “Fact Check: No Evidence Linking Wayfair to Human Trafficking Operation,” 13 July 2020, https://www.reuters.com/.

11 Mel Magazine, “QAnon and Child Trafficking: Is ‘Save the Children’ a Conspiracy?,” accessed 8 September 2020, https://melmagazine.com/.

12 Amanda Robb, “Pizzagate: Anatomy of a Fake News Scandal,” Rolling Stone, 16 November 2017, https://www.rollingstone.com/.

13 Cormac Herley, “Why Do Nigerian Scammers Say They Are from Nigeria?,” Microsoft Research, June 2012, https://www.microsoft.com/.