Algorithmic Accountability Advocated in Washington, D.C. Amid Concerns of Discrimination


ORGANIZATIONS IN THIS STORY

Alan Butler Executive Director and President | Official website

Washington, D.C., was using nearly 30 automated decision-making systems to surveil, screen, and score District residents in areas such as public benefits, health care, policing, and housing, according to a 2022 investigation by the Electronic Privacy Information Center (EPIC). “Automated decision-making is threaded throughout a wide variety of public services in D.C.,” EPIC notes in its report “Screened and Scored in the District of Columbia.”

“They assign children to schools, inform medical decisions about patients, and impact policing decisions about where to patrol and whom to target,” the report states. For example, the city was using the Vulnerability Index-Service Prioritization Decision Assistance Tool (VI-SPDAT) to help caseworkers determine who receives housing assistance first, a practice that resulted in unintended racial disparities, EPIC’s investigation showed.

EPIC sent a copy of the report to D.C. councilmembers, urging them to pass the Stop Discrimination by Algorithms Act, which would prohibit the discriminatory use of algorithms in determining an individual’s eligibility for, or access to, education, housing, or employment.

