Saturday, June 19, 2021

Algorithmic Addiction

It’s hardly a secret that your online life, from emails to social media, tracks your every click. Some sites claim that their “cookies” and the analytics and predictive algorithms behind them still honor and protect your privacy, operating as anonymous sets of computer instructions and Web messaging. Send an email describing wedding plans, and soon you will get ads, emails and even texts with wedding-related offers. Search for a product from an online retailer and watch the spam roll in. Privacy protection? Most of us don’t believe that anymore, even as tech firms, most recently Apple, release new smartphones that, as their ads clearly tout, focus on protecting privacy. They love stories of federal and state agencies demanding smartphone access to targeted individuals, demands they simply refuse to honor even in dire criminal cases… bragging about their unhackability, even by the feds. That lasts only until a widespread malevolent hack releases your most personal information to bad actors, some of whom relish the new wave of ransomware.

But those cookies and the resulting algorithms aren’t just there to market products and services. They are there to configure your view of their sites to reflect your personal whims and preferences, to increase your use of Web-connected devices, to lure and hold your attention on designated sites, and to increase your contact with their embedded advertisers, who know you well (anonymously?). It is definitely about the money!
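
To make that engagement economy concrete, here is a minimal, purely hypothetical sketch (in Python) of the kind of scoring a feed-ranking algorithm might use: it orders posts by predicted clicks, predicted dwell time and proximity to ads rather than by any benefit to the reader. The field names, weights and the engagement_score function are invented for illustration and do not describe any actual platform’s system.

```python
# Hypothetical sketch of an engagement-maximizing feed ranker.
# All field names and weights are invented for illustration; this does not
# describe any real platform's scoring system.

from dataclasses import dataclass
from typing import List


@dataclass
class Post:
    post_id: str
    predicted_click_prob: float     # model's guess that the user will click
    predicted_dwell_seconds: float  # how long the user is expected to linger
    ad_adjacent: bool               # whether an ad slot sits next to the post


def engagement_score(post: Post) -> float:
    """Score a post by expected attention captured, not by benefit to the user."""
    score = 1.0 * post.predicted_click_prob + 0.02 * post.predicted_dwell_seconds
    if post.ad_adjacent:
        score *= 1.1  # nudge ad-adjacent content a little higher
    return score


def rank_feed(posts: List[Post]) -> List[Post]:
    """Order the feed so the most attention-grabbing items appear first."""
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("calm-news", 0.10, 20.0, False),
        Post("friend-photo", 0.30, 15.0, False),
        Post("outrage-bait", 0.45, 90.0, True),
    ]
    for post in rank_feed(feed):
        print(post.post_id, round(engagement_score(post), 3))
```

The point of the toy is simply that when the objective function is attention, the most attention-grabbing content wins by construction.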

Writing for Yale Insights on June 8th, Susie Allen points out that websites prosper with higher traffic, and if addictive behavior is what social media sites can create, that’s exactly what they get: “If you’ve ever delayed sleep to doomscroll on Twitter or checked Instagram just one more time to see if someone else liked that selfie, you know that social media can be a time suck. But is it addictive?

“A growing body of medical evidence suggests it is, economist Fiona Scott Morton [the Theodore Nierenberg Professor of Economics at the Yale School of Management] writes in a new paper, co-authored with James Niels Rosenquist of Harvard Medical School and Samuel N. Weinstein of the Benjamin N. Cardozo School of Law. That has important implications for how regulators should oversee social media platforms. And it also has surprising implications for antitrust enforcement.

“The addictive qualities of social media are compounded by a lack of competition in the industry. When air conditioners compete, the more efficient ones can gain an advantage by advertising their low running costs. But without meaningful social media competition or regulation, companies have little incentive to change the addictive quality of their content.

“‘We don’t want to ban cars because they are dangerous, nor would that be a good solution for social media,’ Scott Morton emphasizes. ‘Instead we limit the danger of cars with tools like speed limits, traffic lights, drivers’ licenses, and seatbelts—and we have lots of competition and choice. In digital media we need to find a way to control the stuff that’s harming us, and our children in particular, while keeping the healthy part.’ She believes smarter antitrust enforcement could help, making room for newer and safer social media platforms in the market as well as more competition.

“For decades, the medical community was hesitant to accept that addiction was possible without the ingestion of a physical substance. But, as Scott Morton and her co-authors write, growing understanding of so-called behavioral addiction has chipped away at that resistance. In fact, gambling addiction is now recognized in the latest edition of the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders.

“Social media and gambling can hijack the brain’s reward system in similar ways, the researchers argue. In the case of gambling, you’ll keep pulling the slot-machine lever even after you’ve lost hundreds of dollars, just in case the next one is a winner; in the case of social media, you’ll get lost in the infinite scroll, no matter what else you should be doing…

“In theory, of course, there’s nothing wrong with spending a lot of time on social media. Companies have argued that the hours we log represent positive engagement with the platform: we like what we’re seeing, and so we stay… But in practice, Scott Morton and her co-authors note, survey data finds that a large number of heavy social media users wish they used social media less because of its negative effects on their lives—a classic tug-of-war between short-term impulses and long-term goals that is a hallmark of compulsive behavior. Early data also links social media use among adolescents to mood disorders and ADHD. The dangers seem particularly acute for girls.

“So, what does this all mean for regulators trying to decide whether social media platforms are engaging in anti-competitive conduct? Baked into antitrust enforcement is the idea of increasing consumer welfare: enforcement ought to make life better for consumers by promoting competition so that goods become cheaper, better, or both…

“And economists have long argued that one especially useful way to look at consumer welfare is through what’s called output—the quantity of goods or services produced in a given market. ‘Historically, we have thought of pro-competitive things as being those that increase output and non-competitive things as those that decrease output,’ Scott Morton explains.

“If the merger of two ice cream companies results in an overall larger ice cream market, then (the basic argument goes) consumers must have benefited, either because ice cream was cheaper and they bought more, or because it was better and they bought more. If the merger reduces the size of the ice cream market, it must have been anticompetitive.

“But the logic of output maximization falls apart when it comes to any addictive product. For someone addicted to, say, OxyContin, giving them more OxyContin represents an increase in output—but it surely doesn’t represent a simple increase in consumer welfare.” 
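
To see why output is a poor welfare proxy once addiction enters the picture, here is a toy numeric sketch (mine, not the paper’s), assuming the benefit of each additional unit declines while its cost to the consumer stays constant: measured output keeps rising even after total consumer welfare has peaked and begun to fall.

```python
# Toy illustration (not from the paper) of why output growth need not mean
# welfare growth for an addictive good.
#
# Hypothetical assumption: the q-th unit consumed delivers a benefit of
# 10 - 2*q, while every unit costs the consumer a constant 3 (money, time,
# lost sleep). Output rises one unit at a time, but net welfare peaks and
# then declines.


def marginal_benefit(q: int) -> float:
    """Declining benefit of the q-th unit consumed (invented numbers)."""
    return 10.0 - 2.0 * q


UNIT_COST_TO_CONSUMER = 3.0


def total_welfare(quantity: int) -> float:
    """Sum of (benefit - cost) across all units consumed so far."""
    return sum(
        marginal_benefit(q) - UNIT_COST_TO_CONSUMER for q in range(1, quantity + 1)
    )


if __name__ == "__main__":
    for quantity in range(1, 8):
        print(f"output={quantity}  consumer welfare={total_welfare(quantity):+.1f}")
    # Output climbs from 1 to 7 units, but welfare peaks at 3 units (+9.0)
    # and then falls: a seller maximizing output is no longer serving the consumer.
```

That inversion is the crux of the authors’ argument that output maximization and consumer welfare part ways for addictive products.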

Scott Morton suggests that encouraging more social media sites (more competition being, in theory, the goal of antitrust law), diluting the impact of the big addictive players, might be the answer, especially sites with fewer or no ads. “With more companies vying for users, Scott Morton explains, they’ll have a greater incentive to differentiate in ways users value. In all kinds of markets—cars, movies, food—companies have thrived by promoting themselves as the safe option. A non-addictive social media platform could have similar consumer appeal.” Yale Insights. Seriously? People will gravitate to interesting sites because they have fewer ads and are thus less likely to apply addiction-forming algorithms? Why do I think that approach just might backfire? Thank you, professor, for underscoring the problem, but there clearly has to be a better way.

The very sustainability of a democratic form of government is obviously under attack, powered by social media. Further, as we discuss the responsibility of social media companies to review their content, to eliminate fake news and inciters while still allowing fair access to expression, as we watch foreign powers use these platforms to spread disinformation and encourage predatory insurrection, and as extremists unite scattered, distant factions and lone wolves into violent, malevolent causes, we have one more serious consideration that our legal system is unprepared to contain: serious mental health consequences from massive addiction to social media. Where does the First Amendment fit into this paradigm? Or does it? What are the answers?

I’m Peter Dekom, and just by looking at the role of social media in promoting a highly polarized society with major factions at each other’s throats, well, “Houston, we’ve got a problem,” a huge problem.


