Why Meaningful Algorithm Auditing Is Key to Protecting Civil Rights in the Digital Age

Employers today rely on artificial intelligence and other automated tools in their hiring processes, including to advertise opportunities, screen applications, assess candidates and conduct interviews. Many of these tools carry well-documented risks of discrimination that can exacerbate existing inequities in the workplace. Employers should avoid tools that carry a high risk of discrimination based on disability, race, sex and other protected characteristics, such as personality assessments and AI-analyzed interviews. But where an employer is considering or already using an AI tool, auditing for discrimination and other harms is a critical step toward addressing the dangers these tools pose and ensuring they do not violate civil rights laws.

A rigorous, holistic audit of an automated tool -- both before and after deployment -- can help employers determine whether to adopt a tool at all, what mitigation measures may be needed and whether they need to abandon it. And public sharing of audits can provide critical information for job applicants, researchers and regulators. However, algorithm audits that aren't carefully crafted can be gamed to present a misleading picture of the system in question or can serve as a cursory box-checking exercise, potentially legitimizing discriminatory systems.

As regulators and legislators increasingly focus on addressing impacts of automated systems in hiring and employment, including creating requirements for auditing, these efforts must be crafted to ensure that they actually increase accountability. While there is no one-size-fits-all approach to algorithm auditing, the audits should:

-- Evaluate the system's performance using carefully selected metrics that consider both when the system works and when it fails.

-- Break down performance for people in different groups, including but not limited to race, sex, age and disability status, and the intersections of those groups (a minimal sketch of such a breakdown follows this list).

-- Use data that faithfully represents how the system is used in practice.

-- Be conducted by auditors who are independent from the entity that built or deployed the algorithm.
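As an illustration of the first two items, here is a minimal sketch in Python -- with entirely hypothetical data and column names, and a made-up "qualified" label standing in for whatever ground truth an audit team establishes -- of evaluating a screening tool both when it works (true-positive rate) and when it fails (false-positive rate), broken down by single groups and by their intersection:

    # Minimal sketch with hypothetical data: disaggregate an audit metric by
    # single groups and by their intersections.
    import pandas as pd

    # One row per applicant: the tool's decision ("advanced") and a
    # ground-truth label of qualification established by the audit team.
    df = pd.DataFrame({
        "race":      ["Black", "Black", "White", "White", "Black", "White"],
        "sex":       ["F", "M", "F", "M", "F", "M"],
        "advanced":  [0, 1, 1, 1, 0, 1],   # the tool's decision
        "qualified": [1, 1, 1, 0, 1, 1],   # the audit's ground-truth label
    })

    def rates(g: pd.DataFrame) -> pd.Series:
        """True- and false-positive rates for one subgroup."""
        tpr = g.loc[g["qualified"] == 1, "advanced"].mean()  # when it works
        fpr = g.loc[g["qualified"] == 0, "advanced"].mean()  # when it fails
        return pd.Series({"n": len(g), "tpr": tpr, "fpr": fpr})

    # Break performance down by each group alone, then by their intersection.
    # (NaN appears where a subgroup has no applicants with a given label.)
    for keys in (["race"], ["sex"], ["race", "sex"]):
        print(df.groupby(keys)[["advanced", "qualified"]].apply(rates), end="\n\n")

Even a toy table like this makes the list's logic concrete: a tool can look acceptable on race and sex separately while failing badly for one intersectional subgroup.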

In many cases, audits can and should be conducted by interdisciplinary teams of subject matter experts, including social scientists, lawyers and policy researchers, that consult with people who will be impacted by these tools and the users of the system itself. Researchers and practitioners have created many different resources describing how to operationalize these kinds of audits.

 

Emerging "bias audits" produced in connection with a recently enacted law in New York City demonstrate why these details are so critical. Because of this law, employers using some of these technologies are required to publish "bias audits" with statistics about how often job applicants advance in the hiring process when an automated tool is used, broken down for people of different races and sexes.

Some news coverage has described this law as requiring employers to "prove their AI hiring software isn't sexist or racist." But a closer look indicates that these audits are incomplete evaluations. First, the requirement applies to only a limited subset of the tools used in hiring today. So far, we've been able to locate only around a dozen audits -- even though 99% of Fortune 500 companies reportedly use some type of automated system in their hiring processes. The law also doesn't require the audits to assess possible biases related to many characteristics where discrimination in hiring and employment has long been a concern, including disability, age and pregnancy.

As for what's in the audits, the required statistics can provide some basic information about which tools employers are using and how many job applications these tools are evaluating. But the audits fall short of meaningful transparency in several ways. For one, some don't provide the name or vendor of the tool being assessed. The audits also don't examine whether the tools work as advertised or whether they accurately assess the skills or capabilities relevant to a job. In addition, these audits may not fully portray the experiences of candidates or the practices of employers. Several, including one of an AI-driven candidate screening tool at ADP and one of an AI-driven applicant scoring tool at Eightfold, are missing data on many of the candidates evaluated by the tool in question.

The published audits also frequently rely on data pooled from multiple employers using the same tool, even though those employers may be using it in different ways. Companies characterize these audits as designed to "ensure non-discrimination against protected groups," when in fact this data pooling may mask stark disparities or discriminatory practices.
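A deliberately contrived example shows how this can happen: suppose two employers use the same tool with opposite patterns. Each employer on its own shows a stark disparity, yet the pooled numbers look perfectly even (all figures here are hypothetical):

    # Hypothetical illustration: pooling data from two employers with opposite
    # patterns erases a stark per-employer disparity.
    employers = {
        "Employer A": {"women": (20, 100), "men": (50, 100)},  # (advanced, applied)
        "Employer B": {"women": (50, 100), "men": (20, 100)},
    }

    for name, data in employers.items():
        rates = {g: a / n for g, (a, n) in data.items()}
        print(name, {g: f"{r:.0%}" for g, r in rates.items()})
        # Employer A advances women at 0.4x the rate of men; Employer B, the reverse.

    # Pooled across both employers, the disparity vanishes: 35% vs. 35%.
    pooled = {g: sum(d[g][0] for d in employers.values()) /
                 sum(d[g][1] for d in employers.values())
              for g in ("women", "men")}
    print("pooled:", {g: f"{r:.0%}" for g, r in pooled.items()})

An audit built only on the pooled data would report an impact ratio of 1.0 and miss both employers' problems.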

More generally, algorithm audits should be publicly available and easy to access. Even though employers are required to publish the audits on their websites, we've so far found them difficult to locate. That's why we worked with the New York Civil Liberties Union to create a public tracker of all the audits we've seen to date.

As automated systems become more entrenched in our lives, audits can be crucial to identifying and preventing harm. But for that to be the case, audits must be holistic, ongoing and reflective of how these systems are really used. Technologists, civil rights advocates, policymakers and interdisciplinary researchers should work together to ensure that algorithm audits live up to their potential.

Marissa Gerchick is a data scientist and algorithmic justice specialist with the ACLU's Analytics Team. Olga Akselrod is a senior staff attorney in the ACLU's Racial Justice Program. For more than 100 years, the ACLU has worked in courts, legislatures, and communities to protect the constitutional rights of all people. With a nationwide network of offices and millions of members and supporters, the ACLU takes on the toughest civil liberties fights in pursuit of liberty and justice for all. To find out more about the ACLU and read features by other Creators Syndicate writers and cartoonists, visit the Creators website at www.creators.com.


Copyright 2023 Creators Syndicate Inc.


 
