The law is just 26 words long. 26 apparently innocuous words. Yet Section 230 of the 1996 Communications Decency Act has been at the center of American political life in the last few years. When he was president, Donald Trump called for its repeal, issued an executive order to curb some of its protections, and even threatened to veto the annual defense bill if Congress didn’t revoke Section 230. As a candidate, President Biden called for the law to be “revoked, immediately.” Summarizing the issue in The New York Times, Shira Ovide asks: “What [is] the fight…about, really[?] Everything. Our anxieties are now projected on those 26 words.”

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230).

In this article, we’ll walk you through the controversy over Section 230—explaining how politicians like Trump and Biden, who agree on little else, both ended up criticizing the law. And we’ll tell you how Biograph approaches Section 230 differently, reflecting our commitment to responsible free speech. Biograph is a new kind of space on the Internet: we give authors (not users) the connection that social media promises, while protecting their privacy and their dignity.


Section 230 seems like a relatively dull legal matter. But the law does two crucial things. First, it protects sites like Twitter and Yelp from being sued for things their users post. So if I post something illegal on Twitter—an incitement to violence, say, or something libelous—I can be prosecuted or sued. But Twitter can’t be.

Second, the law gives websites the power to moderate their own platforms. They curate your feed with their algorithms and can decide what to take down. If Twitter removes my tweet, I can’t sue them for taking it down.

Section 230 went into effect in the mid-1990s, just as the Internet was starting to blossom. It effectively created the rules of the road for Internet companies—and it helped shape their business models. Facebook and Twitter probably wouldn’t exist without it. It gave companies the legal shield they needed to grow.

It also opened the door to abuse, disincentivizing Old Social Media companies from creating healthy, respectful communities on their platforms. Why would they? It’s a lose-lose situation for a site like Twitter. They spend time and money removing users’ posts; in return, all they get are unhappy users. As if there isn’t enough unhappiness on Twitter already.

Facebook and Twitter have moderated content more aggressively in the wake of the 2016 U.S. presidential election. Their users are demanding safer, less abusive, more factual Internet spaces. That response has earned the ire of some powerful users—like Donald Trump, after Twitter flagged two of his tweets about mail-in voting.

Conservatives cried censorship—and want to revoke the law so that social media companies no longer have the power to regulate their users’ speech. Democrats also dislike what they see, though for almost opposite reasons: an Internet crawling with abuse, misinformation, and hate speech. They want to repeal the law and make companies liable for what people say on their platforms.

The disagreement is fierce. But there’s fundamental consensus. No one is happy with the Internet that Section 230 spawned. Whether you think the Internet needs more free speech or less hate speech, we all agree that things need to change. But how?


Let’s step back from the food fight in Washington and take a closer look at the law itself. Why aren’t social media companies liable for what people post on their sites? The law draws an important distinction: “No provider or user of an interactive computer service shall be treated as a publisher…of any information…” In other words, Twitter isn’t like The New York Times—and doesn’t have the same legal obligations—because Twitter isn’t a publisher.

Many social media sites have reveled in the freedom that Section 230 has afforded them. But Biograph sees things differently. We didn’t start as a platform. We started as a publisher. Before we built our app, we spent years publishing memoirs, family histories, self-help books, and academic treatises.

As book publishers, we take our responsibilities seriously. When someone gives you their book to publish, they’re entrusting you with more than a manuscript. A book is part of the person who writes it, as intimate to them as an organ. Our authors trust us with that part of themselves. In turn, we earn their trust—editing their work carefully, designing a beautiful book, and zealously promoting it.

While building the Biograph App, we leveraged our experience and values as publishers, even as we officially evolved into a platform. Old Social Media companies, beholden to their ad-based business models, want to have their cake and eat it too: they claim to be platforms while precisely controlling everything your eyeballs see.

In contrast, Biograph has created a true platform by shunning the toxic algorithms that create echo chambers for hate speech and fake news. We don’t rank content or manipulate what you see. Instead, we designed a simple chronological feed and features like search, so that authors on the Biograph platform control their own experience.

To be clear, while we offer publishing tools for authors on our platform, Biograph’s role on the App is decidedly that of a platform facilitator, not a publisher. One of our reasons for creating the App in the first place was to put the power we exercised as publishers into the hands of authors, to empower them to create their own narratives and publish for themselves, on their own terms—rather than depending on us or anyone else.


We provide the tools and experience for you to create and publish your best stuff. We’ll trust you to use these tools responsibly, as an author and self-publisher. In turn, you can trust Biograph to safeguard your stories and defend your rights as an author, publisher, and data subject.

We mean to heal the broken relationship between social media “users”—we prefer to call them authors—and the platforms they use to represent themselves. For instance, unlike Twitter and Facebook, we never sell our users’ data. Everything you create on our app stays private until you decide to publish it. Old Social Media sites like Twitter and Facebook make everything public by default; the burden falls on you to limit who sees your tweets. Users constantly demand that the platforms hold bad actors accountable, and are frustrated when, once again, the platforms fail to do so.

Consider Spotify’s deceptive public-by-default settings, which require you to find and re-select “Private Session” every time you open the app, as if your values and preferences change from one hour to the next. Such practices bypass your informed consent: instead of trusting you to decide how and when to publicize your listening choices and habits, they dispossess you of your rights as the author and publisher of your own life. Spotify’s tagline is “listening is everything”—though it would be more accurate for them to admit that on their platform, “everything is listening.”

The Biograph difference is clear: everything is private by default. You decide who joins your community. You can use the app in perfect privacy or invite a few friends to collaborate with you. That’s key to establishing mutual trust, in our opinion.

On Biograph, you have the power—though not the obligation—to publish your stuff on the public feed. Either way, you deliberately select the audience for every story, because not all your followers need to see everything you post. Knowing precisely who will see everything you create empowers you to grow your community on your own terms.


Blockchain is the future of decentralized, democratic publishing on platforms that recognize the rights and responsibilities of authors to represent themselves. Rather than trust a central authority like the New York Times to be the “Newspaper of Record,” or Twitter to adjudicate the facts, we envision a world with a distributed and decentralized ledger of truth, where authority is shared among everyone.

No single company or piece of legislation can solve the fight over Section 230 or create a less toxic Internet. But we can build a model for what that Internet might look like—an Internet built on mutual trust and respect. An Internet that doesn’t rely on the legal technicalities of Section 230. Instead of waiting for the heavy-handed government to punish Old Social Media, join Biograph in our pursuit to redefine New Social Media and preserve our democratic values.
