Saturday, August 12, 2017
“Firehose of Falsehood”
Last year, before Donald Trump was elected President, Rand Corporation authors Christopher Paul and Miriam Matthews asked themselves how the Russian propaganda machine could be so overwhelmingly effective with the Russian people when the government disseminated such a flood of obviously false information. How did they do it?
In a 2016 Rand report – The Russian “Firehose of Falsehood” Propaganda Model – based on extensive research, the authors drilled down into the answers. Their findings are chilling, not so much for the impact of this practice in Putin’s Russia (which is bad enough) as for the parallels here in the United States, which were not the subject of their report. Let’s start with Trump himself, a man who in the first six months of his term in office has tweeted well over 500 times.
I began the analysis of Donald Trump’s self-described “truthful hyperbole” in my August 2nd Believe Me! blog. The President seems to have so embraced his proclivity for embellishment and misstatement that his physical manifestations while making these false or exaggerated statements suggest that, at the moment he makes them, orally or in writing, he may actually believe them to be “true”… or “true enough.”
Just one sliver of the Trump presidency, highly reflective of his entire six-month tenure in office, says it all: “In a period of less than 26 hours — from 6:31 p.m. on July 24 to 8:09 p.m. on July 25 — President Trump made two fired-up speeches, held a news conference and tweeted with abandon, leaving a trail of misinformation in his wake. [29 false or misleading claims].” The Washington Post, July 26th, listed each of these statements in detail and presented the irrefutable facts that contradicted them. And that was well before Trump’s purported but non-existent phone calls from the Mexican President or the leader of the Boy Scouts.
“The New York Times keeps a running tally of the president’s lies since Inauguration Day, and PolitiFact has scrutinized and rated 69 percent of Trump’s statements as mostly false, false, or ‘pants on fire.’ [see above chart].” MotherJones.com, August 4th. With this basic premise – Trump’s pattern of issuing false or misleading statements – as background, the Rand report on Russia becomes that much more interesting.
Rand’s own summary of its report: “Since its 2008 incursion into Georgia (if not before), there has been a remarkable evolution in Russia's approach to propaganda. The country has effectively employed new dissemination channels and messages in support of its 2014 annexation of the Crimean peninsula, its ongoing involvement in the conflicts in Ukraine and Syria, and its antagonism of NATO allies. The Russian propaganda model is high-volume and multichannel, and it disseminates messages without regard for the truth. It is also rapid, continuous, and repetitive, and it lacks commitment to consistency. Although these techniques would seem to run counter to the received wisdom for successful information campaigns, research in psychology supports many of the most successful aspects of the model. Furthermore, the very factors that make the firehose of falsehood effective also make it difficult to counter. Traditional counterpropaganda approaches will likely be inadequate in this context. More effective solutions can be found in the same psychology literature that explains the surprising success of the Russian propaganda model and its messages.”
Based on extensive supporting research, the Rand authors conclude:
Experimental research shows that, to achieve success in disseminating propaganda, the variety of sources matters:
· Multiple sources are more persuasive than a single source, especially if those sources contain different arguments that point to the same conclusion.
· Receiving the same or similar message from multiple sources is more persuasive.
· People assume that information from multiple sources is likely to be based on different perspectives and is thus worth greater consideration.
The number and volume of sources also matter:
· Endorsement by a large number of users boosts consumer trust, reliance, and confidence in the information, often with little attention paid to the credibility of those making the endorsements.
· When consumer interest is low, the persuasiveness of a message can depend more on the number of arguments supporting it than on the quality of those arguments.
Finally, the views of others matter, especially if the message comes from a source that shares characteristics with the recipient:
· Communications from groups to which the recipient belongs are more likely to be perceived as credible. The same applies when the source is perceived as similar to the recipient. If a propaganda channel is (or purports to be) from a group the recipient identifies with, it is more likely to be persuasive.
· Credibility can be social; that is, people are more likely to perceive a source as credible if others perceive the source as credible. This effect is even stronger when there is not enough information available to assess the trustworthiness of the source.
· When information volume is low, recipients tend to favor experts, but when information volume is high, recipients tend to favor information from other users.
· In online forums, comments attacking a proponent’s expertise or trustworthiness diminish credibility and decrease the likelihood that readers will take action based on what they have read.
The experimental psychology literature suggests that, all other things being equal, messages received in greater volume and from more sources will be more persuasive. Quantity does indeed have a quality all its own. High volume can deliver other benefits that are relevant in the Russian propaganda context. First, high volume can consume the attention and other available bandwidth of potential audiences, drowning out competing messages. Second, high volume can overwhelm competing messages in a flood of disagreement. Third, multiple channels increase the chances that target audiences are exposed to the message. Fourth, receiving a message via multiple modes and from multiple sources increases the message’s perceived credibility, especially if a disseminating source is one with which an audience member identifies…
Furthermore, repetition leads to familiarity, and familiarity leads to acceptance:
· Repeated exposure to a statement has been shown to increase its acceptance as true.
· The “illusory truth effect” is well documented, whereby people rate statements as more truthful, valid, and believable when they have encountered those statements previously than when they are new statements.
· When people are less interested in a topic, they are more likely to accept familiarity brought about by repetition as an indicator that the information (repeated to the point of familiarity) is correct.
· When processing information, consumers may save time and energy by using a frequency heuristic, that is, favoring information they have heard more frequently.
· Even with preposterous stories and urban legends, those who have heard them multiple times are more likely to believe that they are true.
· If an individual is already familiar with an argument or claim (has seen it before, for example), they process it less carefully, often failing to discriminate weak arguments from strong arguments…
· In a phenomenon known as the “sleeper effect,” low-credibility sources manifest greater persuasive impact with the passage of time. While people make initial assessments of the credibility of a source, in remembering, information is often dissociated from its source. Thus, information from a questionable source may be remembered as true, with the source forgotten.
· Information that is initially assumed valid but is later retracted or proven false can continue to shape people’s memory and influence their reasoning.
· Even when people are aware that some sources (such as political campaign rhetoric) have the potential to contain misinformation, they still show a poor ability to discriminate between information that is false and information that is correct.
· Someone is more likely to accept information when it is consistent with other messages that the person believes to be true.
· People suffer from “confirmation bias”: They view news and opinions that confirm existing beliefs as more credible than other news and opinions, regardless of the quality of the arguments.
· Someone who is already misinformed (that is, believes something that is not true) is less likely to accept evidence that goes against those misinformed beliefs.
· People whose peer group is affected by an event are much more likely to accept conspiracy theories about that event.
· Stories or accounts that create emotional arousal in the recipient (e.g., disgust, fear, happiness) are much more likely to be passed on, whether they are true or not.
· Angry messages are more persuasive to angry audiences.
Does this seem shockingly familiar? Does this seem to be a rather full description of much of Trump’s messaging habits? The Trump “Firehose” even takes credit for the work of others: “‘My first order as president was to renovate and modernize our nuclear arsenal. It is now far stronger and more powerful than ever before,’ Trump tweeted Wednesday [8/9]… He did not order the modernization of the nuclear arsenal. President Obama did that in 2014, despite calling for a ‘vision of ... a world without nuclear weapons’ just five years earlier.
“The plan, expected to cost $400 billion through 2024, would upgrade nuclear weapon production facilities, refurbish warheads and build new submarines, bombers and ground-based missiles. It will likely cost more than $1 trillion over the next 30 years, according to outside estimates… Because the sprawling nuclear force will take so long to rebuild, the arsenal is more or less at the same level of strength as it was when Trump took office seven months ago.” Los Angeles Times, August 10th.
But he said… well… as the North Korean threats and counter-threats mounted… “Hours after warning North Korea that it would meet ‘fire and fury like the world has never seen’ if its leader, Kim Jong Un, continued to provoke the United States, President Trump said the U.S. nuclear arsenal was stronger ‘than ever before.’” LA Times. Exactly my point. Sometimes ignorance, provocation and “Firehose” falsehoods can actually kill you.
Trump’s book, The Art of the Deal, embraces exaggeration and factual manipulation as a viable negotiating technique he calls “truthful hyperbole.” That pattern of communication, embraced and accepted by his ultra-loyal base as gospel (“take what he says seriously but not always literally”), is now the norm for his administration. Given his rather open admiration for Vladimir Putin, has Trump knowingly patterned his communications efforts on this proven-to-be-effective Russian “firehose of falsehood” model… or are his instincts innately driven to the same result? Similar to the North Korean “Firehose”? Either way, the resulting polarization and distortion of our democracy is deeply disturbing. Could these practices eventually unravel our entire governmental structure?
I’m Peter Dekom, and while I fully understand how dictatorships control their populations with disinformation by reason of their direct control over media, I am stunned that in a nation with First Amendment protections for a free press, the same disinformation techniques can be employed with amazing effectiveness.