U.S.-funded “anti-misinformation” groups persist, raising concerns over censorship and bias
- Despite previous political promises to curb government-backed censorship, several NSF-funded projects have continued under new names and broader missions, now housed within left-leaning institutions and nonprofits.
- Initially designed to persuade vaccine skeptics, the $5 million NSF-funded initiative called Chime In now promotes GMOs and sunscreen safety while discrediting raw milk. It uses an "anti-misinformation" dashboard to guide journalists in countering dissenting viewpoints via targeted social media messaging.
- The NSF-funded Analysis and Response Toolkit for Trust (ARTT) developed an AI chatbot to shape political discourse, particularly around vaccines. It has ties to global institutions like the WHO, WEF and Big Tech (Google, Meta). Now operating as Discourse Labs, its board includes figures linked to left-leaning causes.
- Expert Voices Together (EVT), a rapid-response system for journalists, was adopted by the activist group Right To Be, raising concerns about compromised media independence. Its coordinators, trained in progressive advocacy (e.g., LGBTQ+ interventions), may influence journalistic content.
- Critics warn these programs, despite claiming to combat misinformation, risk manipulating public opinion and silencing dissent. Corporate influence (Big Tech, left-wing philanthropies) and ideological bias threaten open debate, challenging democratic principles of free expression.
Despite the Trump administration's initial promises to curtail government-funded censorship, several taxpayer-supported "anti-misinformation"
projects continue to operate under new names and missions, raising concerns about the manipulation of public opinion and the suppression of dissenting voices. These initiatives, originally incubated by the National Science Foundation (NSF) through its Convergence Accelerator program, have been adopted by left-leaning institutions and nonprofits, blurring the line between public health outreach and narrative control.
Chime In: From persuading vaccine skeptics to advocating for GMOs
One such project is Chime In, formerly known as Course Correct, which was established at the University of Wisconsin-Madison with a $5 million grant from the NSF in 2022. Initially designed to persuade vaccine skeptics, Chime In has
expanded its scope to include advocacy for genetically modified (GMO) foods and sunscreen use, as well as efforts to discredit raw milk. The program uses an "anti-misinformation" dashboard to help journalists identify and counter perceived misinformation networks.
According to the NSF grant description, Chime In aims to "scale Course Correct into local, national and international newsrooms" by developing and rapidly testing messages to reduce the flow of misinformation. The tool enables users to create messaging "experiments" and target specific groups, such as "vaccine skeptics," on platforms like X (formerly Twitter), Facebook, Instagram and YouTube. While Chime In claims to avoid content moderation, it admits to
promoting "independently verified facts" using sponsored social media posts, effectively flooding targeted networks with preferred narratives.
Analysis and Response Toolkit for Trust (ARTT): AI-driven political discourse
Another ongoing project is the Analysis and Response Toolkit for Trust (ARTT), which developed an AI chatbot to guide political discussions, particularly around vaccine hesitancy. ARTT received nearly $750,000 from the NSF in 2021 and an additional $5 million to develop trust-building interventions addressing vaccine hesitancy.
ARTT has partnered with several influential organizations, including the University of Washington’s Paul G. Allen School of Computer Science and Engineering, Wikimedia DC and the Children’s Hospital of Philadelphia, which has been criticized for performing transgender surgeries on minors. The project also takes advice from the World Health Organization’s Vaccine Safety Net and is supported by Hacks/Hackers, a nonprofit that works with powerful institutions like the World Economic Forum (WEF), Google, Mozilla and Meta.
ARTT has since "graduated" from the NSF’s Convergence Accelerator and launched its own nonprofit, Discourse Labs, which continues to develop tools and resources to sway social conversations. The nonprofit’s board includes figures with ties to left-leaning organizations, raising
concerns about potential biases in their interventions.
Expert Voices Together: Compromising press independence
Expert Voices Together (EVT) is a rapid-response system for journalists and researchers, designed to provide "trained support coordinators" for one-on-one meetings. The NSF awarded EVT nearly $750,000 in 2021 and $5 million in 2022. The project, led by faculty from multiple universities, including George Washington University and Columbia University, has been
adopted by the left-wing group Right To Be.
Right To Be is known for its training on radical topics such as bystander intervention for LGBTQIA+ support and conflict de-escalation in protest spaces. Its takeover of EVT has raised concerns about compromised press independence, as EVT's support coordinators could influence the content and direction of journalistic work.
The broader implications
The persistence of these government-funded "anti-misinformation" projects under left-leaning institutions highlights the ongoing tension between public health initiatives and the protection of free speech. Critics argue that these programs, while ostensibly aimed at combating misinformation, often serve to manipulate public opinion and stifle dissenting voices.
The involvement of powerful tech companies and left-leaning philanthropies in these projects further complicates the issue, as it raises questions about the influence of corporate and ideological interests on public discourse. As these initiatives continue to operate, they pose a significant challenge to the principles of open debate and the free exchange of ideas, which are fundamental to a healthy democracy.
Conclusion
The continued operation of taxpayer-funded "anti-misinformation" projects under new names and missions underscores the need for greater transparency and accountability in how public funds are used to shape public opinion. As these initiatives expand their reach and influence, it is crucial for policymakers, journalists and the public to remain vigilant and ensure that the tools designed to combat misinformation do not become instruments of censorship and bias. The balance between protecting public health and
preserving free speech remains a delicate and ongoing challenge.
Sources include:
ReclaimTheNet.org
TheFederalist.com