Sunday, February 20, 2022

The Power of the Programmers

As artificial intelligence expands – and it is fully capable of expanding itself without constant human programming – decisions that affect every aspect of our lives are increasingly automated. We are governed by sets of very complex algorithms: initiated and tweaked by master programmers (the humans), then fed swarms and masses of new data so the artificial intelligence can refine and expand those algorithms on its own (the machines) … without the necessity of human intervention. We rely on these programs and are forced to “trust” their results even if we don’t have a clue how they work.
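
What does that hands-off refinement loop actually look like? Here is a minimal, hypothetical Python sketch (the numbers and the toy “model” are invented for illustration): once the programmers seed a parameter, the system keeps adjusting it to whatever new data arrives, with no human in the loop.

```python
# A minimal, hypothetical sketch of the loop described above: a system
# that keeps refining its own parameter on incoming data, no human in
# the loop. The "model" is just a running estimate -- a stand-in for
# any statistical system that updates its own parameters.

def retrain(parameter, new_observations, learning_rate=0.1):
    """Nudge the parameter toward whatever the new data says."""
    for x in new_observations:
        parameter += learning_rate * (x - parameter)
    return parameter

threshold = 50.0                                     # seeded by the human programmers
data_feeds = [[48, 52, 55], [60, 62], [70, 71, 69]]  # invented data batches

for batch in data_feeds:                             # in practice, this runs indefinitely
    threshold = retrain(threshold, batch)
    print(round(threshold, 1))                       # drifts wherever the data drifts
```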

But what is an algorithm? Wiki summarizes: “In mathematics and computer science, an algorithm is a finite sequence of well-defined instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations, data processing, automated reasoning, automated decision-making and other tasks.” The programmers’ adage – “garbage in, garbage out” – always applies. Inherent cultural biases, discrimination and preferences sneak into the master equations and impact each and every refinement, human and mechanical. So do simple errors. And since so much that impacts us is determined by the algorithms in the system, those inherent biases also taint every decision they make and every conclusion they reach.
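
To make “garbage in, garbage out” concrete, here is a second hypothetical sketch (the scores, groups and decision rule are all invented): a toy hiring model fit to biased historical decisions doesn’t correct the old prejudice, it automates it at scale.

```python
# "Garbage in, garbage out": a toy hiring rule learned from biased
# historical decisions reproduces that bias. All data here is invented.

# Historical records: (test_score, group, was_hired). Suppose past
# managers systematically under-hired applicants from group "B".
history = [
    (90, "A", True), (75, "A", True), (60, "A", False),
    (90, "B", False), (85, "B", False), (95, "B", True),
]

def learned_bar(records, group):
    """The lowest score ever hired within a group -- a crude stand-in
    for a statistical model fit to past outcomes."""
    hired = [score for score, g, ok in records if g == group and ok]
    return min(hired) if hired else float("inf")

bars = {g: learned_bar(history, g) for g in ("A", "B")}
print(bars)  # {'A': 75, 'B': 95} -- group B now faces a higher bar

def decide(score, group):        # the "automated decision-making"
    return score >= bars[group]

print(decide(85, "A"), decide(85, "B"))  # True False: same score, different fate
```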

For malign operatives, like Russians tailoring social media, texts and emails to influence our elections and our opinions, these self-rewriting programs are particularly toxic. But even where intentions are benign, as government and business automate their messaging, decisions and responses to cope with massive volumes of new data, we get other “unintended consequences” that are often inappropriate. Sometimes dangerously so. Those supposedly benign algorithms also operate under a veil of secrecy that should concern us all.

Yale Law School’s Media Freedom and Information Access Clinic (MFIA), in collaboration with The Connecticut Foundation for Open Government (CFOG) and the Connecticut Council on Freedom of Information (CCFOI), decided to do a deep dive into how “three major Connecticut state agencies [departments of Children and Families (DCF), Education (DOE) and Administrative Services (DAS)] have used automated software programs called algorithms to make policy decisions affecting school funding, removing children from families, and hiring state workers, but those agencies are unable or unwilling to fully disclose how these programs make their decisions…”

The resulting report, ALGORITHMIC ACCOUNTABILITY: The Need for a New Approach to Transparency and Accountability When Government Functions Are Performed by Algorithms, was released to the public on January 18th, based in significant part on information provided to the researchers under Freedom of Information Act (FOIA) requests made by the MFIA Clinic. The flaws they uncovered were disturbing.

“The report documents examples outside Connecticut where algorithms produced faulty background checks, made incorrect facial recognition matches, denied benefits erroneously, and wrongfully terminated parental rights. It sought to determine the extent to which the public can know that algorithms used by Connecticut agencies do not suffer from similar flaws.

“‘The potential for real harm makes transparency surrounding the government’s use of algorithms critical,’ said MFIA Fellow Stephen Stich [Yale Law, 2017], a co-author of the report. ‘This transparency does not exist in Connecticut today.’

“Speaking to the agencies’ noncompliance with FOIA, Mitchell W. Pearlman, former Executive Director of the CT Freedom of Information Commission and a CFOG officer, added, ‘Unfortunately, government agencies have always used these same techniques of delay, denial and obfuscation to hinder access to public records that would hold those agencies accountable and shed some needed sunlight on their activities.’” So reads a summary analysis of the report released by Yale Law School on January 19th.

Governmental agencies really do not want the public to know how these algorithms were created and how they operate. The Yale Law study examines just one tiny corner of a pervasive wall of secrecy, one that shields privately controlled data processing (hi, Facebook, Google, Amazon, Netflix, etc., etc. fans!) … but also infects what should be publicly transparent governmental programs: “Among the three [Connecticut] state agencies, the most incomplete response came from DAS, according to the report. The agency is arguably one of Connecticut’s most powerful state government departments, having broad authority over services that cover employment practices, procurement, facilities management, and technology.

“FOIA requires state agencies to respond to inquiries within four business days. But the report shows that, for several months, DAS ignored the request for information about a new algorithm used in hiring state employees and contractors. Many attempts to set up meetings went unheeded, the report recounts. Ultimately, DAS provided no documentation after invoking FOIA exemptions the report’s authors found dubious and irrelevant.

“The report shows that DOE responded, in part, to an inquiry involving an algorithm used to assign students to public schools, an issue hugely important given the court cases and settlements designed to end racial segregation in Connecticut schools. However, the agency failed to disclose how its school-assignment algorithm worked. There appeared no mechanism to allow parents to challenge its decisions, according to the report.”

If there is a serious threat to our lives from artificial intelligence, it is probably not the rise of well-armed robots purging humanity from the planet. It is, far more likely, the way our lives are determined by the automated decisions these systems impose on us, and by the racial, gender and ethnic biases woven into their most essential programming fabric.

I’m Peter Dekom, and as unstable as our nation is today, and as rapidly as change is reconfiguring every aspect of our world, we cannot and should not simply trust the programmers to do what’s right.

