"Trust is the glue of life. It's the most essential ingredient in effective communication."
~ Stephen Covey
Producing quality information isn’t enough—it has to reach people in ways that are trusted, timely, and meaningful. Disinformation often spreads more effectively than truth because it’s emotionally charged, easy to share, and deeply tailored to its audience. In contrast, credible content is frequently slower, more cautious, and harder to distribute at scale.
To strengthen distribution, we must invest in the networks that communities already trust. That includes nonprofits, faith leaders, community organizers, and family-based networks. These community-centric systems are essential in moments of urgency—whether during an election, a public health crisis, or a natural disaster—when getting accurate information out quickly can save lives or safeguard democratic processes. Simply banning disinformation is not enough, partly because of the speed at which disinformation travels, but also because COVID-19 showed us that people will seek out information from less-regulated sources, like the dark web. Strengthening community resilience is critical to withstanding future incidents.

Recovery effort at Camp Mystic along the Guadalupe River after a flash flood swept through on July 6, 2025, in Hunt, Texas. Photo: Julio Cortez/AP Photo
Distribution infrastructure must also reflect the diversity of the communities it serves. Non-native English speakers, rural populations, and low-income households are often cut off from reliable sources and are more likely to encounter misinformation. Supporting participatory media like community podcasts, civic storytelling, local documentaries, and creative projects can amplify underrepresented voices and restore a shared sense of narrative across the many ways that people now consume information.
Disinformation flourishes in environments where people feel unheard or disconnected. These same conditions give rise to broader societal issues: fragmentation, distrust, and social isolation. We are more polarized than ever, with many communities lacking shared reference points or common understanding. This fragmentation is compounded by loneliness—a growing mental health crisis, particularly among young people and especially young men, some of whom are turning to fringe online spaces that foster extremism or unhealthy dependence on virtual identities.

The path to becoming an extremist is not binary or linear. Research shows that loneliness can trigger stress responses and become self-perpetuating over time, deepening isolation and anxiety. Social media environments often worsen the issue, offering superficial connections while amplifying fear, hostility, and division—all of which reinforce lonely individuals’ vulnerabilities. Increasingly, people are also turning to AI to guide them in deeply personal matters like relationships, finances, and life decisions, blurring the line between human and machine.
How Philanthropy Can Help
Philanthropy can play a catalytic role by funding targeted, high-impact projects that strengthen the connective infrastructure of information—especially in communities that are underserved or overlooked. Nascent efforts to fight polarization through in-person interactions have shown promise. Such investments can help elevate trusted local voices, expand access to credible content, and support community-led models for information sharing that reinforce civic trust.
As people increasingly use technology and AI-driven tools for guidance on deeply personal matters—such as finances, health, and relationships—it is essential to establish ethical standards and oversight mechanisms to ensure these systems are transparent, accurate, and do not exploit user vulnerability. Key strategies include:
- Setting ethical guardrails for AI to prevent the amplification of misinformation, reinforcement of bias, or manipulation disguised as credible advice.
- Promoting platform accountability and transparency, especially around algorithmic design and content amplification mechanisms.
- Sustaining policy advocacy to shape regulatory frameworks that prioritize user safety and uphold democratic values in digital spaces.
- Investing in research and monitoring to track technological evolution, assess societal impact, and guide evidence-based policy.
Together, these strategies can shift the focus from reactive moderation to proactive system design that supports truth, trust, and user agency.
