December 22, 2024
US lawmakers demand that the DOJ stop funding 'predictive' police tools, citing concerns about bias and inaccuracy. They call for higher standards and an investigation into the tools' discriminatory impact. #PredictivePolicing #DOJ #LawEnforcement

In a push to address concerns about discriminatory practices in law enforcement, US lawmakers are demanding that the Department of Justice (DOJ) cease providing federal grants for “predictive” policing tools. The lawmakers argue that the DOJ has not properly investigated whether police departments awarded grants have been using the funds to purchase software known for its inaccuracies and potential biases. They urge the DOJ to halt all such grants until the department can ensure that recipients will not use these tools in a discriminatory manner. This call for higher standards and scrutiny comes as independent investigations have revealed that predictive policing tools often replicate biases against minority communities. The lawmakers also request that an upcoming presidential report investigate the use of these tools and establish evidence standards for future funding decisions.

US Lawmakers Demand DOJ Stop Funding ‘Predictive’ Police Tools

Introduction

In a recent letter to the United States Department of Justice (DOJ), a group of lawmakers expressed their concerns about the use of federal grants to fund “predictive” policing tools. The lawmakers argue that these AI-based tools not only perpetuate biases observed in US police forces but also undermine civil rights. They are demanding higher standards for federal grants and a thorough investigation into the discriminatory impact of these tools. In this article, we will delve into the background of the issue, examine the lawmakers’ concerns, and explore the need for evidence standards and a presidential report to address the shortcomings of predictive policing tools.

Background Information

The DOJ has been providing federal grants to state and local police agencies to purchase predictive policing software. However, recent investigations have revealed that these tools are often inaccurate and have the potential to exacerbate biases in policing practices. The predictive models are trained on historical crime data, which often contains falsified reports and disproportionately represents arrests of people of color. This leads to biased predictions, which in turn inform law enforcement decisions, such as disproportionate stops and arrests in minority neighborhoods.


Lawmakers’ Concerns

Seven members of Congress, led by Senator Ron Wyden of Oregon, expressed their concerns about the DOJ’s police grant program. They argue that the government has not adequately investigated whether the departments awarded grants have purchased discriminatory policing software. The lawmakers urge the DOJ to halt all grants for predictive policing systems until the department can ensure that grant recipients will not use such systems in a discriminatory manner. They point to the DOJ’s obligation to periodically review whether grant recipients comply with Title VI of the Civil Rights Act, which prohibits funding programs shown to discriminate on the basis of race, ethnicity, or national origin.

Demand for Higher Standards

In light of the concerns raised by lawmakers, they are demanding higher standards for federal grants used to fund predictive policing tools. They argue that the DOJ should establish evidence standards to determine which predictive models are discriminatory and reject funding for those that fail to meet these standards. This would ensure that only reliable and unbiased predictive policing tools receive federal funding.


Undermining Civil Rights

Predictive policing tools, when based on biased data and flawed algorithms, have the potential to undermine civil rights. By relying on historical data that reflects biased policing practices, these tools perpetuate the over-policing of predominantly Black and Latino neighborhoods. This not only violates the civil rights of individuals in these communities but also skews statistics on where crimes occur. The lawmakers argue that funding such tools contradicts the DOJ’s mission to uphold justice and fairness.

Biased Predictions and Discrimination

Predictive policing tools have been criticized for their reliance on biased data and their tendency to discriminate against certain communities. Independent investigations have shown that these tools, trained on historical crime data, often replicate long-held biases, resulting in inaccurate predictions. Researchers have found that some predictive policing models were accurate only around 1 percent of the time, casting doubt on their effectiveness. The lawmakers argue that such biased predictions are dangerous and lead to unjustified stops and arrests in minority neighborhoods.

Request for Presidential Report

To address the shortcomings of predictive policing tools, the lawmakers have requested that an upcoming presidential report on policing and artificial intelligence investigate their use in the US. The report should assess the accuracy and precision of predictive policing models across protected classes, such as race and ethnicity, and examine their interpretability and validity. The lawmakers also emphasize the need to assess the risks posed by a lack of transparency from the companies developing these tools. By including these aspects in the report, policymakers can gain a comprehensive understanding of the impact of predictive policing and take appropriate measures to address any discriminatory practices.

Establishing Evidence Standards

To ensure that only non-discriminatory predictive models receive funding, the lawmakers contend that the DOJ should establish evidence standards for determining whether a predictive model is discriminatory. By setting clear criteria and guidelines, the DOJ can ensure that federal funds are not used to support tools that perpetuate biases and violate civil rights.

Conclusion

The concerns raised by US lawmakers highlight the need for greater scrutiny and higher standards when it comes to funding predictive policing tools. The reliance on biased data and flawed algorithms has resulted in discriminatory practices that undermine civil rights. By demanding evidence standards, the lawmakers seek to prevent the funding of tools that perpetuate biases and harm marginalized communities. The upcoming presidential report on policing and artificial intelligence presents an opportunity to thoroughly investigate these issues and make informed policy decisions. Ultimately, the goal is to ensure that predictive policing tools are fair, accurate, and unbiased, and contribute to a more just and equitable society.