"We are drowning in information, while starving for wisdom."
~ E.O. Wilson
From Falsehoods to Overload
We are living through a historic inflection point, where the volume, speed, and sophistication of information are outpacing our collective capacity to make sense of it. Misinformation and disinformation blur the boundaries between truth and falsehood, fact and opinion, and credible and manipulative sources. And the crisis is not just misinformation or disinformation—it’s also noise. We are flooded with content, and the saturation makes it harder to think clearly and act confidently.
Disinformation thrives in this noisy environment. Algorithmic feeds, social media virality, and fragmented media ecosystems amplify sensational and deceptive content. But even when falsehoods aren’t maliciously intended, the torrent of information can exhaust attention spans, dilute reliable sources, and foster public confusion or disengagement. This flood of noise and disinformation erodes trust in institutions, weakens democratic participation, undermines public responses to critical issues, and contributes to social division.

Disinformation has surged in recent years, fueled by the rise of artificial intelligence (AI), which enables the rapid creation and spread of realistic false content. Media once trusted as truth—photos, audio, and video—are now easily fabricated and used to deceive and divide. At the same time, COVID-19 pushed even more of our lives online, weakening the in-person connections that support civic and community engagement. Leaders across all sectors are now facing the consequences of these shifts.
Disentangling credible insights from content chaos has become a societal imperative. Excess content not only obscures malicious narratives but also drowns out factual and nuanced discourse. When citizens feel overwhelmed, they default to familiar or emotionally gratifying sources, which may not be reliable.
Disinformation and Noise Are Not New
While disinformation may seem like a relatively recent phenomenon, corporations, state actors, and politicians have long used false or misleading narratives to advance their ideological agendas, with examples dating back to ancient Greece. The use of disinformation has evolved alongside advances in communication technologies, from newspapers to radio to modern-day social media. In fact, the term “information overload” appears in academic literature as early as 1964, in the writings of Bertram Gross, an American social scientist.
We now live more of our lives online than ever before, supercharged by advances in technology. The advent of the internet in the 1990s, the rise of smartphones and social networks in the 2000s and 2010s, and the recent arrival of accessible generative AI are driving 24/7 content streams. It is easier than ever to produce, distribute, and consume content.
In this attention economy, the supply of information continues to grow rapidly and competes with other demands for our limited attention. Our capacity to create information has grown so great that we now generate an estimated 402,000 petabytes of data every day, with no signs of slowing down. For context, the entire written works of humanity, across all languages, are estimated to total about 50 petabytes; in other words, we now produce roughly 8,000 times that volume every single day.
Against this backdrop, algorithms and content creators are incentivized to prioritize engagement over accuracy, often amplifying polarizing or controversial content. Companies, celebrities, and politicians now compete directly for public attention, bypassing traditional media gatekeepers. Social media enables anyone to share content globally, often obscuring original sources or blurring the line between organic and paid content. In this personality-driven environment, only the most attention-grabbing content tends to survive—regardless of its truthfulness or impact.
Meanwhile, generative AI, bots, and automation flood platforms with low-quality content that drowns out meaningful discourse. AI-generated content was relatively rare until the spring of 2023, when its use began to soar. Noise and disinformation are also emerging in spaces once considered relatively neutral, such as online job sites.
Human psychology compounds the problem: we often cope with cognitive overload by relying on heuristics, gut feelings, and filter bubbles. We are choosing to spend time in an increasingly fractured alternative media environment, seeking out others with like-minded opinions. In some cases, we are choosing to avoid the news altogether: the Reuters Institute's annual survey of digital news found that four in ten respondents say they sometimes or often avoid the news, the highest level it has ever recorded.
Real-World Consequences
Disinformation, misinformation, and noise may spread online, but their consequences are deeply felt in the real world. During COVID-19, we saw how information overload and false claims fueled vaccine hesitancy and promoted medical treatments unsupported by scientific evidence, ultimately contributing to increased morbidity. Even before COVID-19, measles began to resurge globally and locally after decades of progress, fueled by vaccine hesitancy linked to disinformation.
Civic and democratic institutions are also under strain: disinformation clouds electoral messaging, discourages voter engagement, and weakens our capacity for policymaking and public service. Americans' trust in the federal government, including Congress and the Supreme Court, continues to decline, and there is growing concern about the direction of higher education.
Using disinformation, such as hoax threats against polling places and schools, to sow fear and doubt is easier and cheaper than ever. More public health, education, and election officials report facing threats and harassment, which reduces their willingness to serve or engage with the public. While the impact is not always immediately felt, the long-term consequences include acute staffing shortages, unstable leadership, and a dangerous normalization of intimidation as a political tactic.
Meanwhile, journalists and fact-checkers struggle to keep up, deepening the decline in institutional trust even as they face growing economic instability and threats of violence. At the societal level, fragmented discourse and echo chambers harden polarization, leaving communities without shared truths and leaving individuals, especially young and disenfranchised people, more vulnerable to manipulation.
Disinformation doesn’t originate solely from technology; it also feeds on deeper societal fractures—distrust, alienation, and the erosion of common narratives. Rising loneliness, particularly among young men, is driving people toward isolating online spaces, some of which foster extremism or unhealthy reliance on virtual communities, underscoring a growing mental health crisis.
Moment of Opportunity
We are at a turning point. Technological advances are reshaping how we produce, distribute, and consume information. At the same time, public frustration is mounting, and people are increasingly demanding accountability from policymakers and platforms.
Philanthropy has a critical role to play. With its independence, ability to take risks, and capacity to convene diverse stakeholders, philanthropy can fund bold, long-term solutions that prioritize the public good over the bottom line. It can also act quickly—supporting innovation, strengthening infrastructure, and amplifying trusted voices. Our focus will be practical—ensuring that people have access to credible, relevant, and actionable information.
We also face a landscape in which the federal government is reducing its support for efforts to combat disinformation, despite recognizing the risks its spread poses. Major social media platforms have likewise reduced or ended their disinformation initiatives, leaving a growing gap that philanthropy has both the responsibility and the opportunity to fill. Ongoing debates about the government's role in regulating social media and AI will continue to shape technology policy for years to come, and despite concerns about how malicious actors might use AI, there remains optimism about its possibilities.
While there’s no single solution, this moment offers a rare opportunity to shape the future of our information ecosystem—and philanthropy can help tip the balance toward trust, connection, and resilience.
Our Focus
The Battery Powered community will explore the theme of Information Paradox through the following guiding question:
How can we ensure that all people have access to reliable information to make fundamental life decisions?
We will focus on three domains:
Production. Combating disinformation requires more than scrubbing content. It demands investment in the creation of accurate, relevant, and emotionally resonant information. We will support community-based journalism, artistic and cultural expression, public-interest technologies, and media innovation labs, especially those serving communities historically excluded from mainstream media.
Distribution. Information must reach people through channels they can trust and rely on, especially during moments of crisis or confusion. We will support efforts that advocate for ethical technology policy, invest in research and monitoring, expand equitable distribution infrastructure, amplify trustworthy content through strategic campaigns, and strengthen trusted messenger networks that deliver credible information.
Consumption. Access to information isn't enough; people need the skills to evaluate and interpret what they encounter in a noisy and charged environment. We will support initiatives that build media literacy and critical thinking skills, provide practical community toolkits and workshops, train digital influencers in responsible content sharing, and counter identity-based disinformation that targets marginalized groups.
Paths not taken. To maintain focus and avoid duplicative efforts, we will not pursue the following areas:
We will not pursue solutions focused solely on countering foreign adversaries who use disinformation to destabilize American society. While we may try to inspire policymakers to take action, the scope and scale of these disinformation campaigns typically require a whole-of-government response, including diplomatic, intelligence, and national security actors.
We will not fund explicitly partisan or political organizations. While some groups may exploit disinformation for personal gain, we face challenges that span the political spectrum.
We will not seek proposals from for-profit entities. We acknowledge that much of the technological innovation is happening in the private sector, but our focus will be on helping organizations devise strategies to counter disinformation and giving them the flexibility to determine what tools are most appropriate for their goals.
We will not seek solutions that enable organizations to target activity on the dark web, where users can hide their identities and activities. In recent years, the dark web has grown as a venue for organizing and executing potentially illicit activities, but working in this arena would expose organizations to significant risks and ethical concerns.
We gratefully acknowledge Jocelyn Yin, the author of this Issue Brief. A respected expert in defense acquisition and government oversight, Jocelyn has served as a trusted advisor to senior policymakers across the federal government. She began her career in philanthropy before taking on key roles at the Government Accountability Office—Congress's nonpartisan watchdog—and most recently as chief of staff to the Assistant Secretary of the Air Force for Acquisition, Technology, and Logistics.



