ORLANDO, Fla. – A new method being used for the first time by the U.S. Census Bureau to protect people’s privacy in 2020 census data could hamper voting rights enforcement and make it harder for congressional and legislative districts to have equal populations, according to a report from two leading civil rights groups.
In test data, the method known as “differential privacy” made smaller counties appear to have more people than they actually did, at the expense of more populous counties. It also made counties where clear majorities of residents share a specific race or ethnic background appear more homogeneous than they really are, according to an analysis conducted by the civil rights groups.
The findings reinforce concerns that differential privacy will lower the quality of the data used for redrawing congressional and legislative districts. They also suggest that the census figures won’t support efforts to protect the power of minority voters or to comply with court rulings requiring districts to have equal populations.
“Our preliminary findings reveal serious concerns about the impact of differential privacy as currently envisioned by the Bureau on our communities’ ability to attain our fair share of political power,” said the report by the Mexican American Legal Defense and Educational Fund, also known as MALDEF, and Asian Americans Advancing Justice | AAJC.
The report is the latest warning about the Census Bureau’s introduction of deliberate, privacy-protecting errors into the 2020 census data that will be used for redistricting later this year.
Differential privacy adds mathematical “noise,” or errors, to the data to obscure any given individual’s identity while still providing statistically valid information. Bureau officials said the change is needed to prevent data miners from matching individuals to confidential details that have been anonymized in the massive data release expected as early as August. It will be applied to race, age and other demographic information in geographic areas within each state.
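The bureau has not published the final parameters of its approach, but the core idea of adding calibrated noise can be illustrated with the textbook Laplace mechanism: a simple count query has a sensitivity of 1 (adding or removing one person changes the count by at most 1), so adding Laplace noise with scale 1/epsilon yields an epsilon-differentially-private count. The Python sketch below is purely illustrative and is not the bureau’s algorithm; the function name, epsilon values and population figure are hypothetical.

```python
import numpy as np

def laplace_noisy_count(true_count, epsilon, rng=None):
    """Return an illustrative differentially private version of a count.

    A count query has sensitivity 1, so Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy for that query.
    This is a teaching example, not the Census Bureau's method.
    """
    rng = rng or np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: smaller epsilon means stronger privacy but noisier counts.
rng = np.random.default_rng(seed=42)
true_population = 1250  # made-up block-level count
for epsilon in (0.1, 1.0, 10.0):
    noisy = laplace_noisy_count(true_population, epsilon, rng)
    print(f"epsilon={epsilon:>4}: noisy count ~ {noisy:,.0f}")
```

Smaller values of epsilon add more noise, strengthening privacy at the cost of accuracy; that tension is the “sweet spot” bureau officials describe.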
Since it was first proposed three years ago, the methodology has been criticized by redistricting experts and demographers who fear it will create inaccurate data.
Last month, the state of Alabama and Alabama politicians sued the Census Bureau and the Commerce Department, which oversees the statistical agency, claiming differential privacy appeared to erase the Black voting-age populations in 60 communities in Alabama and made 13,000 city blocks appear to have no adults living in them. A three-judge panel was named to hear the Alabama case, which will fast-track it to the Supreme Court if there’s an appeal.
The Census Bureau says it is still formulating the details, but bureau officials have previously described trying to find “the sweet spot” between data confidentiality and data accuracy. The bureau is continuing to improve the method “to ensure that the published data for the 2020 Census meet legislative, programmatic, and data user needs,” the statistical agency said in a recent newsletter.
Final decisions on the method will be made in June, and the Census Bureau plans to release one more set of test data before then.
Previously, the Census Bureau protected privacy by swapping data on the characteristics of some households with those of other households in areas with very small populations, such as neighborhoods, where it would be easy to identify people based on gender, age, race or ethnic background.
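For comparison only, the sketch below shows what household-level record swapping might look like in the abstract: geographic codes are exchanged between randomly paired households so that a household’s characteristics can no longer be tied with certainty to its true location. It is not the bureau’s confidential swapping procedure; the field names, selection rule and toy records are hypothetical.

```python
import random

def swap_households(records, swap_rate=0.05, rng=None):
    """Illustrative record swapping on a list of household dicts.

    Exchanges the hypothetical "block" field between randomly paired
    households. The bureau's real swapping rules (which households
    qualify and which fields move) are confidential.
    """
    rng = rng or random.Random()
    swapped = [dict(r) for r in records]          # work on copies
    n_pairs = int(len(swapped) * swap_rate) // 2
    indices = rng.sample(range(len(swapped)), n_pairs * 2)
    for a, b in zip(indices[::2], indices[1::2]):
        swapped[a]["block"], swapped[b]["block"] = (
            swapped[b]["block"], swapped[a]["block"]
        )
    return swapped

# Toy data: each dict is one made-up household record.
households = [
    {"block": "A1", "size": 3, "race": "X"},
    {"block": "B2", "size": 1, "race": "Y"},
    {"block": "C3", "size": 4, "race": "Z"},
    {"block": "D4", "size": 2, "race": "X"},
]
print(swap_households(households, swap_rate=0.5, rng=random.Random(0)))
```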
The civil rights groups looked at the impact differential privacy would have on drawing congressional and legislative districts, as well as on the ability to enforce the Voting Rights Act, which protects the power of minority voters.
They found that the “noise” created jumps in population disparities among congressional districts in 30 of the 43 states that have more than one congressional district. In one Virginia district, the deliberate errors shifted 18,000 to 19,000 people into another district, the report said.
The analysis also showed that differential privacy produced data that was less accurate for determining whether a racial or ethnic minority group formed a majority in a particular community, potentially diluting that group’s local political power, the report said.
“If the proposed new methodology fails to produce data that are accurate and/or reasonably fit for use, then the new methodology should not be implemented as proposed” and other solutions to address privacy concerns should be used, the report said.