The Price of Censorship

Hamish McKenzie, Chris Best, and Jairaj Sethi

Last year, in an interview with the New York Times, anthropologist Heidi Larson, founder of the Vaccine Confidence Project, said that efforts to silence people who doubt the efficacy of the Covid-19 vaccines won’t get us very far. 

“If you shut down Facebook tomorrow,” she said, “it’s not going to make this go away. It’ll just move.” Public health solutions, then, would have to come from a different approach. “We don’t have a misinformation problem,” Larson said. “We have a trust problem.” 

This point rings true to us. That’s why, as we face growing pressure to censor content published on Substack that some find dubious or objectionable, our answer remains the same: we make decisions based on principles, not PR; we will defend free expression; and we will stick to our hands-off approach to content moderation. While we have content guidelines that allow us to protect the platform at the extremes, we will always view censorship as a last resort, because we believe open discourse is better for writers and better for society.

This position has some uncomfortable consequences. It means we allow writers to publish what they want and readers to decide for themselves what to read, even when that content is wrong or offensive, and even when it means putting up with the presence of writers with whom we strongly disagree. But we believe this approach is a necessary precondition for building trust in the information ecosystem as a whole. The more that powerful institutions attempt to control what can and cannot be said in public, the more people there will be who are ready to create alternative narratives about what’s “true,” spurred by a belief that there’s a conspiracy to suppress important information. The data on declining trust makes clear that these effects are already in full force in society.