Monday, October 18, 2021

The Disinformation Train – Hey, It’s Not Me!


Former Facebook data scientist-turned-whistleblower Frances Haugen started the big Facebook reveal with an October 3rd tell-all on CBS’s 60 Minutes, followed shortly thereafter by testimony before a Senate committee. “The former Facebook product manager for civic misinformation told lawmakers that Facebook consistently puts its own profits over users’ health and safety, which is largely a result of its algorithms’ design that steers users toward high-engagement posts that in some cases can be more harmful.” CNBC.com, October 5th. Conspiracy theories and false narratives, including clearly mounting threats of violence, seem to draw vastly more consumer engagement – hence more exposure to advertising – than more innocent posts and content. Facebook is thus disincentivized from making a wholehearted effort to push purveyors of false narratives and conspiracy theories off its platform, yet it still has to maintain a semblance of trying… to distract Congress and federal regulators.

The use of social media to foment political disruption in Western democracies, a capability that has motivated Russia (and now a much more sophisticated China, which seems to be graduating a hundred times more computer science experts than the United States), is dramatically out of control. And while Facebook is just one major source of disruption, it does appear to be the largest. Smaller, specialized sites targeting the most extreme conspiracy theorists are also quite dangerous. It is doubtful that the January 6th insurrection attempt at the Capitol could have taken place without massive coordination across social media platforms.

The September 26th TechnologyReview.com tells us that “troll farms” reached 140 million Americans a month on Facebook in the run-up to the 2020 US election. Virtually all of the efforts to have artificial intelligence make a preliminary, filtered review of Facebook postings, followed by individual review by human “moderators” (most of whom are employees of Facebook’s outsourced contractors), are focused on English-language posts. Yet of Facebook’s 2.85 billion users around the world, Omnicore tells us, 90% of daily active users come from outside the United States and Canada. While these moderators capture only a small fraction of total posts for personal review, non-English-language posts, even those originating within the US and Canada, get no review at all. When non-English speakers get fake news from unfiltered posts, that information often gets repeated in ways that wind up heavily viewed in the United States.

But what do most Americans believe when it comes to this outpouring of dis- and misinformation across this vast, overconnected Internet? After all, in a Lisa Ling CNN Special Series on conspiracy theories, we learned that half of adult Americans follow at least one conspiracy theory (10/17). The October 8th Associated Press attempted to answer that question: “Nearly all Americans agree that the rampant spread of misinformation is a problem… Most also think social media companies, and the people who use them, bear a good deal of blame for the situation. But few are very concerned that they themselves might be responsible, according to a new poll from the Pearson Institute and the Associated Press-NORC Center for Public Affairs Research.

“Ninety-five percent of Americans identified misinformation as a problem when they’re trying to access important information. About half put a great deal of blame on the U.S. government, and about three-quarters point to social media users and tech companies. Yet only 2 in 10 Americans say they’re very concerned that they have personally spread misinformation.

“More, about 6 in 10, are at least somewhat concerned that their friends or family members have been part of the problem… The survey found that 61% of Republicans say the U.S. government has a lot of responsibility for spreading misinformation, compared with just 38% of Democrats… There’s more bipartisan agreement, however, about the role that social media companies play in the spread of misinformation… According to the poll, 79% of Republicans and 73% of Democrats said social media companies have a great deal or quite a bit of responsibility for misinformation.

“And that type of rare partisan agreement could spell trouble for tech giants like Facebook, the largest and most profitable of the social media platforms, which is under fire from Republican and Democratic lawmakers alike… ‘The AP-NORC poll is bad news for Facebook,’ said Konstantin Sonin, a professor of public policy at the University of Chicago who is affiliated with the Pearson Institute. ‘It makes clear that assaulting Facebook is popular by a large margin — even when Congress is split 50-50, and each side has its own reasons.’

“During a congressional hearing Tuesday [10/5], senators vowed to hit Facebook with new regulations after a whistleblower testified that the company’s own research shows its algorithms amplify misinformation and content that harms children… ‘It has profited off spreading misinformation and disinformation and sowing hate,’ Sen. Richard Blumenthal (D-Conn.) said during a meeting of the Senate Commerce Subcommittee on Consumer Protection. Democrats and Republicans ended the hearing with acknowledgment that regulations must be introduced to change the way Facebook amplifies its content and targets users.

“The poll also revealed that Americans are willing to blame just about everybody but themselves for spreading misinformation, with 53% saying they’re not concerned that they’ve spread misinformation… ‘We see this a lot of times where people are very worried about misinformation but they think it’s something that happens to other people — other people get fooled by it, other people spread it,’ said Lisa Fazio, a Vanderbilt University psychology professor who studies how false claims spread. ‘Most people don’t recognize their own role in it.’” 

Those charged with “moderating” Facebook postings (usually employees of outsourced contractors) are suffering serious psychological effects, often producing symptoms that mirror post-traumatic stress disorder (PTSD). Low pay and long hours make these jobs exceptionally difficult. But unless there is accountability, including a reassessment of how far such social media sites can rely on the statutory “safe harbor” that limits their liability for such postings, the ultimate sacrificial lamb may be democracy itself. How far, exactly, does the First Amendment protect these malevolent practices? How close are such misleading postings to yelling “fire” in a crowded theater just for the heck of it?

I’m Peter Dekom, and Congress is hopelessly behind the realities of the toxicity of many forms of technology; unless these dangerous trends are curtailed and contained, at what point do we simply kiss freedom and democracy goodbye?


