Navigating the Impact of Manipulation and Removal of Federal Data
Federal datasets have been manipulated and removed since the beginning of the Trump administration and continue to be under threat. In this piece, three data experts explain the critical consequences of this loss.
This article was republished from The Journalist’s Resource.
As the chief demographer at the nonprofit Data Center in New Orleans, Allison Plyer receives hundreds of data requests and questions each year.
The Data Center compiles and analyzes data from multiple sources, including the federal government, with a focus on Southeast Louisiana. To Plyer, this steady stream of requests is clear evidence that federal data is critical infrastructure for the media, policymakers and citizens.
Federal datasets have been manipulated and removed since the beginning of the Trump administration and continue to be under threat. Shortly after President Trump took office for the second time, his administration took down several health websites that included Diversity, Equity and Inclusion initiatives and data, although some have since been restored. Researchers have also found undocumented changes to some federal health datasets. And there are many more examples.
This loss has several critical consequences, affecting national infrastructure, accountability, data quality and public trust, explained Plyer and co-panelists Denice Ross, the U.S. chief data scientist and the deputy U.S. chief technology officer during the Biden Administration, and Erica Groshen, senior economic advisor at the Cornell University School of Industrial and Labor Relations, and the 14th commissioner of the Bureau of Labor Statistics from 2013 to 2017.
“In the information age, our federal statistical system is as important an infrastructure as roads and bridges,” Groshen said.
The panelists shared several examples of how federal data informs businesses and impacts Americans.
Disaster response leaders use the Census Bureau’s American Community Survey data to determine how many people are likely affected by a disaster and then measure the recovery in the years afterward, said Plyer, who is also co-chair of the Census Quality Reinforcement Task Force, and past chair of the U.S. Census Bureau Scientific Advisory Committee.
Companies like Zillow use the Department of Education’s Civil Rights Data Collection to help homebuyers determine which schools have the best disability resources.
And data from the Environmental Protection Agency’s Safe Drinking Water Information System helps states deploy federal funds to improve drinking water quality where it’s needed most.
Three categories of federal data attacks
Ross, a senior fellow at the National Conference on Citizenship, grouped the current attacks on federal data into three main categories:
Targeted removal of data that is not aligned with the Trump administration’s priorities. These attacks began in late January, garnering wide news coverage. The removals often include individual data elements like gender identity and race, but not complete datasets.
Collateral damage from actions like reductions in the federal workforce, cuts to contracts and the termination of scientific advisory committees. These actions do not directly target specific datasets, but they make it harder for remaining staff to collect and publish data. This kind of damage to the federal data apparatus gets less attention, yet recovering from it will take years, or even decades.
Removal of data that reflects poorly on the performance of the Trump administration’s policies.
For instance, in June, the Social Security Administration removed performance data, including information on how long people stay on hold when they call the national call center, soon after the removal of more than 5,000 staff from the agency.
In September, the administration canceled the U.S. Department of Agriculture’s Household Food Security Report, which provided details on where and how people are experiencing food insecurity across the country. The cancellation coincided with new policies that will kick millions of Americans out of the Supplemental Nutrition Assistance Program, Ross said.
Consequences of trust erosion
Declining trust in government and a potential decline in survey response rates are other consequences of attacks on federal data, the panelists said.
Government survey response rates have been declining for years, and the COVID-19 pandemic accelerated the trend. This has become a growing concern for agencies that rely on survey data, and further declines could have detrimental effects on surveys like the Census Bureau’s American Community Survey or the 2030 Decennial Census.
Groshen said erosion of trust also poses a serious threat to the Bureau of Labor Statistics, which relies on survey responses from businesses to produce documents such as the monthly jobs reports.
“It’s a perfectly reasonable thing for someone who loses trust in BLS data to say, ‘Well, then why should I bother responding to this survey?’” Groshen said. “So you could expect that survey response rates would decline even further and that would be a threat to all of us who depend on that information.”
Threat to national standards for statistical agencies
The Office of Management and Budget (OMB) provides centralized coordination for agencies and programs through the Office of Information and Regulatory Affairs and the Chief Statistician of the United States. OMB also issues regulations to implement federal laws pertaining to federal data, such as the Evidence Act of 2018. The Trust Regulation of 2024 sets out the responsibilities of recognized statistical agencies and units, as well as their parent agencies, to produce relevant, accurate and objective federal statistics while protecting the confidentiality of data providers.
A critical mechanism for enforcing the Trust Regulation is the inspectors general, government officials who conduct independent oversight of agencies through investigations and audits. Individually, they are to assess how well statistical agencies and their parent agencies comply with these responsibilities. Parent agency conformance is critical because all agencies are required to enable and support statistical agency compliance.
As a body, the Council of Inspectors General was directed by OMB in the Trust Regulation to establish professional standards by which recognized statistical agencies and units would be evaluated for their performance. In this way, they would be responsible for the birth, maturation, and even dissolution of recognized statistical agencies and units.
Without performance criteria established and applied, it is unclear how decisions about the performance of statistical agencies and units would be made objectively and transparently.
Eliminating the Council of Inspectors General removes the people who were enforcing a powerful statistical policy regulation in the federal government, “and is a threat to our statistical system as well,” Groshen said.
https://www.youtube.com/watch?v=wd4S-18No4g
Advice for journalists
1. Beware of spin by government officials.
“The simplest thing that you might see would be an injection of spin into [data] release narratives,” Groshen said.
For instance, the monthly jobs report from the Bureau of Labor Statistics requires an automated set of procedures involving computer programs that use raw data to create hundreds of tables associated with each release.
“So, it’s very hard to change those on the fly, and that means that the easier thing to change is actually the narrative around that,” Groshen said.
2. Be aware that some allegations of manipulation may not be valid.
It’s a delicate balance, but Groshen said it’s very important not to fall prey to unwarranted attacks on data, regardless of source. Be skeptical.
“Our statistical system has evolved over time,” Groshen said. “It has responsibilities to continue to evolve.”
Talk to experts to get a historical view and context.
“You have to have grounds for judging whether or not these stories [of manipulation] actually make sense, because the people who are trying to misinform you are relying on you not having enough background to be able to judge that their stories are wrong,” Groshen said.
3. Look out for unannounced changes in data methodology and formats.
Statistical agencies follow a set of practices that are aimed at creating trust in their data. In December 2024, a new federal rule took effect, further strengthening these practices. The rule includes four core duties: to produce timely and relevant statistics, ensure accuracy and credibility, remain free from political influence, and protect the confidentiality of data providers.
Agencies also must announce changes in advance, including the reasons for the changes and when they will take effect, Groshen said.
“If you start to get unannounced changes, then you can worry about why this is happening without the usual transparency,” she said.
4. Look for signs of government agency staff resistance.
Resistance could take the form of document leaks and whistleblowing. Staff being fired or transferred may also be a sign that they resisted authority.
“When you’re violating the norms within an agency, when you’re going against the culture, then you will have people speaking out,” Groshen said.
5. Keep an eye out for an increase in the number of political appointees to statistical agencies.
At BLS, the only political appointee is the commissioner, Groshen said. All other BLS positions are civil service roles.
“If the new commissioner were to bring in another group of political [appointees], then this would be, I think, at least a flashing yellow light,” Groshen said. “What’s the need for politicals when they’ve never been needed before?”
6. Keep an eye on data that shines a light on federal operations.
Journalists should also be aware of lesser-known datasets that may be at risk of manipulation or removal.
One example is USA Spending data, which shows federal spending information, including information about awards, such as contracts, grants and loans. This dataset is key for transparency and accountability. It is especially important in areas recovering from disaster where the quality of the disaster response depends on federal money flowing quickly, Ross said.
Other examples are datasets on safety net programs like SNAP and Medicaid. States are mandated to report how long it takes to process applications and how long call center wait times are.
“Those are going to be especially important given the changes in eligibility requirements for some of those safety net programs,” Ross said.
Operational data from Customs and Border Protection is also important. It has shown an uptick in officers searching travelers’ phones and devices, information that is useful for people traveling internationally.
There are more niche datasets, such as the Information Collection Request database from the Office of Management and Budget, which tracks changes to federal forms and surveys, including the removal of gender or race questions from different surveys.
“When these types of datasets disappear, it becomes harder to report on government activities and performance, and it also becomes harder for the public to participate in our democracy,” Ross said.
7. Find the local angle of federal data stories.
Federal data is everywhere.
A football coach who uses a weather app to know when to move the practice inside because the heat index is too high is using federal data.
A farmer using technology to calculate the right amount of irrigation for the crops is using federal data.
Ross and a group of data scientists have been cataloging use cases like these at essentialdata.us to show how specific federal datasets benefit everyday Americans and businesses. The website can be a resource for journalists in their storytelling.
If you see the removal of jobs data or food insecurity data, for example, then gather anecdotes from people affected by job loss or food insecurity.
“Qualitative data can be really powerful in the absence of quantitative data, and pairing the absence of the quantitative with the qualitative can make a strong story,” Plyer said.
8. Seek out diverse data experts.
Look especially for local experts who can explain why data matters to their specific community, Ross said.
“Be sure to talk to an expert who uses the specific dataset you’re interested in,” Plyer said. “They will probably know what’s happened, what’s changed, what’s disappeared.”
Reach out to advocacy groups and former federal employees.
“Make a statistician your new best friend,” Plyer said.
Monitor public comments in the Federal Register to find potential sources for your stories.
The Federal Register, published daily by the National Archives and Records Administration, is the official publication of the U.S. government that informs the public about changes to federal law. Public comments are feedback submitted by the public on proposed rules and regulations. Agencies must consider public comments before issuing a final rule. Public comments can be found within each proposed rule.
9. Remind your audiences that all statistics have limitations.
For now, there’s no evidence that data from the BLS has been manipulated, Groshen said.
But that doesn’t mean all government datasets are perfect. All statistics have limitations. They’re inherently imperfect: they can’t capture the entire picture of what they’re describing, they rely on imperfect inputs, and they risk being misinterpreted. Statistical agencies should be transparent about those limitations, but having limitations doesn’t mean the data isn’t trustworthy.
“Keeping that in mind is important, because these days, a lot of these limitations are being weaponized in a way that’s unwarranted,” Groshen said. “Don’t fall for that kind of trap.”
Reach out to data experts, and “cite your sources and explain why it is that you trust the data,” Groshen said.
Visit our Know Your Research page to learn more about the limitations of data and statistics.
Data and expert resources
Dataindex.us monitors America’s federal data infrastructure. In its newsletter, it shares proposed changes to federal datasets, opportunities for public comments and rapid response webinars about datasets at risk.
Essentialdata.us is a growing collection of short examples of how specific federal datasets benefit everyday Americans and businesses.
Association of Public Data Users is a national network that connects users, producers, and disseminators of government statistical data.
Council of Professional Associations on Federal Statistics is a nonprofit, nonpartisan organization that brings together professional associations, researchers, educators, businesses, and civic groups that rely on federal statistics.
American Economic Association is a nonprofit, nonpartisan, scholarly association committed to the advancement of economics and its contributions to society.
The Census Project is a network of national, state and local organizations that support the inclusive and accurate Decennial Census and American Community Survey.
The Friends of the Bureau of Labor Statistics is a network of users of BLS data and information to facilitate the distribution of information, opinions, and ideas about BLS among users.
Friends of NCHS is a coalition of more than 170 organizations and individuals that support the National Center for Health Statistics.
Data Foundation’s Monthly Evidence Capacity Pulse Reports provide analysis of the evolving landscape of federal evidence infrastructure.
Additional reading
“Expert Views on the Federal Statistical System,” a report published in September 2025 by the U.S. Government Accountability Office, finds that “the federal statistical system faces a critical juncture as it works to modernize and adapt to a rapidly changing data landscape, driven by increasing demand for timely, detailed, and relevant information amid declining survey response rates and rising data collection costs.”
“Principles and Practices for a Federal Statistical Agency,” a report published in 2025 by the National Academies, describes the characteristics of effective federal statistical agencies. The report’s Appendix A provides an overview of the major legislation, regulations and policy guidance that govern the operations of the federal statistical system.
“The Future of Official Statistics” is a 2021 article by Erica Groshen, published in Harvard Data Science Review, in which she shares her thoughts on the future of official statistics in the U.S.