The Misinformation-Outrage Cycle
This is Part 4. It’s generally best to follow the advice the King of Hearts gives the White Rabbit in Alice’s Adventures in Wonderland: “Begin at the beginning, and go on till you come to the end: then stop.” But if you must read out of order, here are all the links:
- Part 1: There are no Yankees here.
- Part 2: Creating the Conditions for Mainstream Conspiracy Theories.
- Part 3: The Perils of Legal Punditry.
- Part 4: Social Media Makes it Worse
- Part 5: Get the Fighters Fighting and Keep Them Fighting
- Part 6: Invented Narratives and the Outrage Industry
- Part 7: The Outrage Machine Strikes Again: The 14th Amendment Section 3 Debacle
- Conclusion: What To Expect Going Forward
Part 4: Social Media Makes it Worse
Recall from Part 2 that, beginning in the 1980s and accelerating in the 1990s, mass media fragmented. People found themselves in what we might call partisan information ecosystems, in which they were offered content tailored to appeal to them. This fragmenting made it easier for conspiracy theories and misinformation to spread. It also pushed partisans toward more extreme views, while driving people who were less engaged with politics to ignore politics altogether.
The Internet and social media then fractured audiences into even smaller units of like-minded partisans.
Historians have compared the invention of the Internet to that of the printing press. From Yale professor Timothy Snyder: “New media always cause tremendous disruptions. The printing press led to 150 years of religious wars.”
While the printing press didn’t cause the Protestant Reformation, it was the Reformation’s most important driver: it allowed the widespread dissemination of new information, including misinformation (errors), disinformation (deliberate lies), and propaganda, which people were not equipped to evaluate.
Algorithms help good jokes, cute pet videos, and clever quips go viral, which provides a lot of fun. Algorithms also allow groups of like-minded partisans to find each other. Based on who you already follow and the kind of content you engage with, the algorithm will suggest other people to follow. If you find someone you admire on social media, you can see who that person follows and follow them too. Pretty soon—because of the content the algorithms serve you—you can find yourself in a group of thousands, tens of thousands, or even millions of like-minded people who continually confirm each other’s biases.
Algorithms also allow content and news to be targeted to individuals with pinpoint precision. Here is an example of how misconceptions can spread:
- Person A sees a Misleading Headline, or a headline stating an opinion as if it were fact.
- Person A clicks on it, skims the article without reading closely (thereby missing the subtlety), and then “likes” it. (Fact: On social media, people tend to scroll quickly, taking in headlines and reacting to them without reading further.)
- The algorithm shows Person A lots of like-minded partisans talking about the opinion as if it were fact.
- The algorithm shows Person A a post in which a Legal Expert comments approvingly on the Misleading Headline. (The algorithm is smart enough to send Person A an expert who shares Person A’s partisan views.)
- Person A is then persuaded that the Misleading Headline is true.
- Over the coming days, weeks, months, or even longer, Person A’s belief is continually reinforced by members of her community.
Person A, by herself, is simply a misinformed person. Multiply Person A by hundreds of thousands of people in a partisan ecosystem, and you have a problem.
Once a person acquires a belief in this manner, it is very difficult to get that person to question the belief.
Some experts post on social media only when they have well-thought-out opinions. Some lawyers who regularly appear on television are careful to stick to the facts and make sure that, when they speculate, it’s clear that they are speculating.
But some experts (to use Peter Arenella’s phrase) fall prey to the seductive power of being anointed a ‘national expert’ on all legal issues. They discover that it doesn’t matter if they are talking about an area of law they know nothing about and didn’t bother to research. They discover that it doesn’t matter if they toss off an opinion from the top of their head. Anything they say (particularly if it confirms the biases of their followers) will get lots of engagement, and they will be heaped with praise.
They also learn that there is no collective memory. They can be wrong with impunity, which frees them to be careless.
An Example of a Twitter/Internet-Driven Left-Wing Conspiracy Theory
People tend to forget now that, during the lead-up to the 2020 election, much of left-leaning Twitter was persuaded by a few self-appointed “election experts” that the voting machines were easily hacked. In fact, an article was passed around claiming that even a child would be able to hack into the machines. The conspiracy theory was that malicious actors (Republicans) would flip thousands of votes, causing Democrats to lose. Later, when the Democrats won, the self-appointed left-wing experts suddenly became experts in something else. For some of my experiences with this particular conspiracy theory, see this post.
Get the Fighters Fighting (and Keep Them Fighting)
Before law school, I taught English and creative writing at the college and university level. It was so long ago that it feels like another lifetime, but I remember Janet Burroway’s book Writing Fiction: A Guide to Narrative Craft because, when I taught the introductory fiction-writing class as a graduate student at the University of California, Davis, I was required to use it as a text.
As everyone who has taken a literature class knows, good stories contain conflict. Burroway offered this advice to fiction writers:
- Get your fighters fighting.
- Make something (the stake) worth their fighting over.
Conflict engages readers. Recall from Part 2 that cable news shows learned to use “conflict” programming to engage viewers. Social media creates conflict by using algorithms to elevate material that promotes division and creates rage.
The Facebook whistleblower Frances Haugen told Congress that Facebook’s algorithms incentivized “angry, polarizing, divisive content.”
In a 60 Minutes interview, Haugen explained that content that gets engagement (reactions, comments, and shares) receives wider distribution, and that Facebook’s own research found “angry content” is more likely to receive engagement. She said that content producers and political parties are aware of this.
Twitter lets a person look at their “analytics,” which show which of their posts get the most engagement. (I don’t know whether this is still possible.) People driven by a desire to be popular study their analytics and then consciously keep doing whatever gets them “likes” and new followers. The material that gets the most engagement either (1) confirms the pre-existing beliefs of their audience or (2) invokes a strong emotion in their audience.
Catturd and Ben Shapiro enrage the left. Alexandria Ocasio-Cortez enrages the right. Elevating such users gets the fighters fighting and keeps them fighting, which, of course, stimulates engagement and helps bring in advertising revenue.
There is a thing on Twitter called dunking: One person says something outrageous or painfully stupid. Others retweet it and “dunk” on it by adding a clever or snarky comment intended to highlight its outrageousness or stupidity.
As a candidate in 2016 and then as president, Trump demonstrated that the way to get more media coverage is to be as outrageous as possible. This created something we might call the outrage cycle:
- Trump said something outrageous designed to excite his supporters and enrage his critics.
- His critics became enraged.
- His followers felt euphoric to see his critics become enraged.
Others now copy this method. I have seen Ted Cruz, for example, post something completely outrageous designed to enrage his critics, who then “dunk” on him to the delight of their followers.
Both sides think they win the dunking contest: the dunker (usually a Democrat) shows how clever he or she is, and the dunkee (usually a Republican) gets to be the star of a show entitled “Watch Me Trigger the Libs.”
The dunking game drives up partisanship and increases engagement on the social media platform.
Is social media making us all more authoritarian?
In Part 1, I suggested that authoritarian characteristics exist on a continuum:
- support for conventional values (for example, the concept of the traditional family of man + woman + children, with each performing traditional roles)
- authoritarian submission (submitting to perceived authorities)
- authoritarian aggression (“by any means necessary”)
- stereotypy (a tendency to repeat certain words and phrases; think of group chants)
- rigidity
- glorifying toughness and power (and despising bookish weakness)
- cynicism
- projectivity (the view that the world is a dark and dangerous place)
In Part 3, I showed how the misinformation circulating about the DOJ investigation stimulated three of the characteristics on this list.
Now let’s talk about stereotypy, another item on the list.
Stereotypy refers to behaviors that are repeated without an obvious goal. When political psychologists discuss the traits of authoritarianism, they use the word stereotypy to refer to the repetition of phrases and a tendency to think in rigid categories.
In this video, Timothy Snyder talks about “Internet Triggers”: phrases a person sees on the Internet, often because an algorithm directed the content to them. The person feels triggered and repeats the phrase to someone else, who also feels triggered and repeats it in turn. People are thus transformed into repeaters of targeted memes, and soon you have an Internet Trigger gone viral.