Saturday, June 4, 2016
There has been trouble a-brewing as conservatives picture a small room of liberal Facebook editors sifting through posted or soon-to-be-posted “trending” content, suppressing right-of-center conversations and promoting left-of-center submissions. People. Not exactly. When you consider the billions of postings on that social media site, the notion of giving individual attention to even a tiny fraction of that content becomes ludicrous, and perfecting a system of absolute neutrality is equally impossible. It’s just not doable; that’s not how the world works anymore. Handling that volume requires technology: systems that scan for key words and detect where the highest volume of conversations, postings, and reactions embrace the same wording or approach – generating automated trending analysis – pushing what most people want to talk about to the fore.
These mathematically based auto-analytics are called algorithms. The original algorithms are, of course, designed by software engineers and programmers with their own inherent biases. They have to prioritize which audio-visual or wording triggers get moved up and which get ignored. These priorities can be based on preselected words, combinations of words, or simple volume of consumer usage, looking for patterns and recurring configurations. Algorithms can also be written to learn and teach themselves, which is the general practice today. Putting it mildly, it’s complicated. More on this later.
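As a rough illustration of the pattern-counting described above – a purely hypothetical sketch, not Facebook’s actual system – a trending detector might simply count how often terms appear in a recent window of posts and surface the terms rising fastest above their prior baseline:

```python
from collections import Counter

def trending_terms(recent_posts, baseline_counts, top_n=5):
    """Surface terms whose recent frequency most exceeds a prior baseline.

    Hypothetical illustration only: a real system would also weight
    engagement, recency, geography, spam signals, and much more.
    """
    recent = Counter()
    for post in recent_posts:
        recent.update(word.lower().strip(".,!?") for word in post.split())
    # Score each term by how far its recent count exceeds its baseline,
    # so perennially common words don't crowd out genuinely new spikes.
    scores = {term: count - baseline_counts.get(term, 0)
              for term, count in recent.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

posts = ["Election news everywhere", "election debate tonight",
         "cute cat photo", "Election results soon"]
baseline = Counter({"cat": 3, "photo": 2})
print(trending_terms(posts, baseline, top_n=2))
```

Even this toy version embeds the designers’ choices: which words get stripped, how the baseline window is chosen, and what counts as a “spike” are all judgment calls of exactly the kind the algorithm debate is about.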
Meanwhile, the GOP Chairman of the Senate Commerce Committee, John Thune, has initiated an inquiry based on his presumed belief in Facebook’s liberal bias – sending a letter demanding “that Facebook explain how it handles news articles in its ‘trending’ list, responding to a report that staff members had intentionally suppressed articles from conservative sources…
“Algorithms in human affairs are generally complex computer programs that crunch data and perform computations to optimize outcomes chosen by programmers. Such an algorithm isn’t some pure sifting mechanism, spitting out objective answers in response to scientific calculations. Nor is it a mere reflection of the desires of the programmers… We use these algorithms to explore questions that have no right answer to begin with, so we don’t even have a straightforward way to calibrate or correct them.
“The current discussion of bias and Facebook started this month, after some former Facebook contractors claimed that the ‘trending topics’ section on Facebook highlighted stories that were vetted by a small team of editors who had a prejudice against right-wing news sources.
“This suggestion set off a flurry of reactions, and even a letter from the chairman of the Senate Commerce Committee. However, the trending topics box is a trivial part of the site, and almost invisible on mobile, where most people use Facebook. And it is not the newsfeed, which is controlled by an algorithm.
“To defend itself against the charges of bias stemming from the ‘trending topics’ revelation, Facebook said that the process was neutral, that the stories were first ‘surfaced by an algorithm.’ Mark Zuckerberg, the chief executive, then invited the radio host Glenn Beck and other conservatives to meet with him on [May 18th]… But ‘surfaced by an algorithm’ is not a defense of neutrality, because algorithms aren’t neutral… Algorithms are often presented as an extension of natural sciences like physics or biology. While these algorithms also use data, math and computation, they are a fountain of bias and slants — of a new kind.” New York Times, May 19th.
Writing on his Website immediately after the above meeting, conservative journalist Glenn Beck remarked: “Mark Zuckerberg really impressed me with his manner, his ability to manage the room, his thoughtfulness, his directness and what seemed to be his earnest desire to ‘connect the world’... Based on our research and my personal experience with Facebook, I believe they are acting in good faith and share some very deep, fundamental principles with people who believe in the principles of liberty and freedom of speech.” Still, he noted the general distrust that he and his conservative peers still hold for the social media company, and suggested that Facebook has a strong financial stake in reversing that impression. But the analytics can still perplex those who look, sometimes even those who design the systems.
Code-writing programmers create their programs, sometimes not knowing where their algorithms will ultimately take the process, but sometimes they can be very aware of the expected results (where, for example, priorities in search move up or down depending on how much the posting party is willing to shell out for a priority). As I said, it’s complicated. “If Google shows you these 11 results instead of those 11, or if a hiring algorithm puts this person’s résumé at the top of a file and not that one, who is to definitively say what is correct, and what is wrong? Without laws of nature to anchor them, algorithms used in such subjective decision making can never be truly neutral, objective or scientific.
“Programmers do not, and often cannot, predict what their complex programs will do. Google’s Internet services are billions of lines of code. Once these algorithms with an enormous number of moving parts are set loose, they then interact with the world, and learn and react. The consequences aren’t easily predictable.
“Our computational methods are also getting more enigmatic. Machine learning is a rapidly spreading technique that allows computers to independently learn to learn — almost as we do as humans — by churning through the copious disorganized data, including data we generate in digital environments…
“With algorithms, we don’t have an engineering breakthrough that’s making life more precise, but billions of semi-savant mini-Frankensteins, often with narrow but deep expertise that we no longer understand, spitting out answers here and there to questions we can’t judge just by numbers, all under the cloak of objectivity and science.
“If these algorithms are not scientifically computing answers to questions with objective right answers, what are they doing? Mostly, they ‘optimize’ output to parameters the company chooses, crucially, under conditions also shaped by the company. On Facebook the goal is to maximize the amount of engagement you have with the site and keep the site ad-friendly. You can easily click on ‘like,’ for example, but there is not yet a ‘this was a challenging but important story’ button.
“This setup, rather than the hidden personal beliefs of programmers, is where the thorny biases creep into algorithms, and that’s why it’s perfectly plausible for Facebook’s work force to be liberal, and yet for the site to be a powerful conduit for conservative ideas as well as conspiracy theories and hoaxes — along with upbeat stories and weighty debates. Indeed, on Facebook, Donald J. Trump fares better than any other candidate, and anti-vaccination theories like those peddled by Mr. Beck easily go viral.
“The newsfeed algorithm also values comments and sharing. All this suits content designed to generate either a sense of oversize delight or righteous outrage and go viral, hoaxes and conspiracies as well as baby pictures, happy announcements (that can be liked) and important news and discussions. Facebook’s own research shows that the choices its algorithm makes can influence people’s mood and even affect elections by shaping turnout.” NY Times.
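The optimization the Times describes – ranking by predicted engagement rather than by accuracy or importance – can be caricatured in a few lines. The weights, field names, and sample posts below are invented for illustration; Facebook’s real model uses vastly more signals:

```python
# Hypothetical newsfeed ranking: score posts by predicted engagement.
# The weights are invented; shares count most because they spread content.
ENGAGEMENT_WEIGHTS = {"likes": 1.0, "comments": 4.0, "shares": 8.0}

def engagement_score(post):
    """Weighted sum of a post's engagement signals."""
    return sum(weight * post.get(signal, 0)
               for signal, weight in ENGAGEMENT_WEIGHTS.items())

def rank_feed(posts):
    # Nothing here measures truth or importance: a widely shared hoax
    # outranks a sober, lightly shared report by construction.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    {"title": "Sober policy analysis", "likes": 40, "comments": 3, "shares": 1},
    {"title": "Outrage-bait hoax", "likes": 25, "comments": 30, "shares": 20},
    {"title": "Baby photo", "likes": 120, "comments": 10, "shares": 2},
]
print([p["title"] for p in rank_feed(feed)])
```

The point of the sketch is the one the article makes: the “bias” lives in the choice of objective – maximize engagement – and in the weights the company picks, not in any editor’s personal politics.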
What exactly can this Senate inquiry produce? I just wonder how legislation could possibly be drafted, and how there is the remotest possibility of judicial enforcement, to eliminate electronic bias, human-influenced or otherwise. And then there is the question presented by the First Amendment’s protection of free speech from governmental restrictions. Are computer programs entitled to this constitutional protection? The program writers? Site owners? Hmmmm….
I’m Peter Dekom, and as our analytics pervade almost every segment of our lives, it gets really complicated in a world where laws and regulations just do not work well with these algorithms.