Why I started Feed for Thought
At Meta, I tried to fix a content problem. What we really need is a culture shift in how experts participate online.
The online misinformation problem is often framed as a failure of content moderation: if platforms just removed the bad stuff from people’s feeds, the information ecosystem could stabilize.
Here’s the problem with that: content moderation is a constant game of whack-a-mole. It’s important, but not enough by itself.
Even if it could be done perfectly, it relies on a big assumption: that enough credible information exists online, waiting to show up in people’s feeds. It doesn’t.
I spent years at Meta working to fill this gap with trustworthy content from primary sources. The goal was to lighten the load on overworked moderators and fact-checkers. From 2018 to 2024, I helped hundreds of public service organizations run information campaigns — including governments, UN agencies, universities, and NGOs. Most organizations were learning to use social media for the first time. We focused on critical moments when misinformation could spread: the COVID-19 pandemic, national elections, natural disasters, and the war in Ukraine.
The program worked. Together, we reached over two billion people worldwide. We helped users better identify potential falsehoods and increased their trust in expert information.
Unfortunately, the program ended when I left Meta in 2024.
◾
At first, I tried to recreate the program independently. But without major funding, it could not be replicated. So that path closed.
But the problem isn't going anywhere. If anything, it's getting worse. Trust in institutions continues to erode. At the same time, media overload is driving people toward smaller, more private communities — which can feel safer, but makes it harder for centralized sources to reach them. And of course, AI is about to change how people find and assess information in ways we don't fully understand yet.
So I decided to dig deeper.
I started talking to experts across fields that rarely intersect — like medicine, academia, government, and NGOs. I wanted to understand the human impact of today’s changing information landscape. Not just for the public at large, but for the experts themselves.
Through these conversations, I realized that the problem goes beyond content.
Expertise itself is in crisis.
A physician providing care that’s suddenly deemed illegal. A researcher watching her work lose funding and get misrepresented in the same news cycle. A public servant starting over after the agency where he built his dream career disappeared. A humanitarian trying to guide people digging through conflicting advice in chat groups.
These aren’t isolated events. They’re downstream effects of the same underlying shift: in an open and algorithmic environment, authority is no longer owned centrally by institutions, but earned continuously in dispersed spaces across the web.
This changes how people decide what to believe, value, and act on. And those choices have real-world consequences: what gets funded, which policies get passed, and whose expertise gets taken seriously.
That’s a major cultural reset. Most experts are navigating it alone.
◾
The deeper challenge is that many experts don’t see public engagement as part of their role at all. That makes sense within institutions, where credibility stems from credentials, hierarchy, and peer review. In these settings, public engagement is often either reserved for those at the top of the food chain, or delegated to marketing and comms teams.
The problem is: for everyone else, that model is breaking down. Of course, platforms are marketing tools. But they have also become much more than that: they’re core information infrastructure.
Platforms have fundamentally changed the way people decide what — and who — is relevant to their lives. Personal familiarity and shared values matter as much for trust-building as institutional ties. And as AI flattens both knowledge and nuance, people are turning to messengers who offer lived experience and perspective.
None of that diminishes the value of earned expertise. It simply changes how it needs to show up. You can participate online without being an influencer — in fact, if you’re scrolling at all, you already are.
That’s what Feed for Thought is about.
◾
This project has been years in the making, but it’s also still very new. I don’t know where exactly it will go, but here’s what I hope to do:
Name the dynamics you sense but may not have words for.
Pull back the curtain on how tech products get built and regulated, and what’s coming next.
Have honest conversations about the line between traditional credibility and modern-day trust, and how professional identity is evolving.
And help you find clarity on what intentional participation looks like for you.
◾
The funny thing is: I find myself on a journey similar to yours. Though I built a career helping others share knowledge online, I never did it myself under my own name. I always thought my expertise applied only within the institutions where I worked.
So here I am. An ex-lurker. Speaking without the filter of my employer, sharing what I know, and joining the conversation.
That feels like a good place to start. ◾