The 'Digital Town Square' Problem

Douglas Yeung | 26 January 2023

The idea of social media platforms like Twitter functioning as “digital town squares”—that is, public places for free speech and civic discourse—has long been ingrained in the public consciousness. That is, at least in part, because tech leaders have long encouraged us to view them that way.

Twitter's new owner, Elon Musk, who tweeted last fall that he had acquired the platform “because it is important to the future of civilization to have a common digital town square,” is only the most recent example.

But while a privately owned platform like Twitter might look like a public space for discourse—after all, just about anyone can tweet or read tweets—it is not.

Whenever a private company has to make choices about the content of the conversation it hosts—whether it is Facebook or Twitter, or a bank that owns a plaza and must decide which protests to allow there—the company and its leaders determine who gets to say what. Our access to these privately owned public spaces can be—and often is—curtailed, without our ever knowing exactly why. Just look at what happened on Twitter last month, when dozens of journalists critical of Musk were suddenly banned from the platform.

The journalist ban is a paradoxical move for Musk, who, along with others in the tech world, has long pitched a hyper-libertarian view of free speech as an essential ingredient of an ideal society. According to this worldview, only by removing communication guardrails can a truly free, accessible and open public space emerge. It's not surprising, then, that within days of his takeover of Twitter in October, Musk fired most of the company's content moderators and others in “trust and safety” roles. Last week, he cut even more workers who oversee global content moderation and handle misinformation.

Exactly how private companies' influence on the so-called public spaces they control affects actual civic outcomes—like voter turnout or civic literacy—isn't well understood. In truly public physical spaces, like city parks or libraries, we know that measures such as anti-discrimination rules and accessibility infrastructure like wheelchair-friendly ramps help ensure that all who enter the space feel comfortable engaging with others.

Twitter is a far cry from that.

Removing all barriers to entry may seem like the way to increase equitable access to public spaces, but in practice, that's not what happens. Musk's dismantling of Twitter's trust and safety team, for example, has undermined the platform's capacity to support productive civic discourse. Many users whom Twitter had previously banned for toxic speech began to return, while others appealed directly to Musk to be reinstated. And as conversation quality has suffered, misinformation and hate speech on the platform have proliferated.

Unshackling online toxicity also has global ramifications. This week, election deniers in Brazil stormed top government buildings, echoing the Jan. 6 insurrection at the U.S. Capitol. In the lead-up to the violence, far-right social media activity surged after Musk fired most of Twitter Brazil's content moderation team, having publicly questioned and even personally overseen some of its decisions.

It turns out that the laws, policies and people that control access and enforce rules of behavior play a crucial role in protecting the spaces that undergird strong communities and foster civic engagement. Firing all the rangers might let anyone walk into national parks, but trails would go unmaintained, trash would pile up, traffic would snarl and the majestic landscapes that attracted everyone in the first place would suffer. At Twitter, the effect has been similar: Removing content moderators has increased toxicity and turned the platform into a place where fewer and fewer users care to spend their time. More than a million new users decided to try Mastodon, an alternative social media platform, with signup surges corresponding to many of Twitter's controversies. Yet far fewer of those people have stayed—leaving them without some of the virtual community they had previously enjoyed, the ability to educate themselves on civic issues or the opportunity simply to make their voices heard.

This is the risk of ceding so much public space—digital and otherwise—to ownership by private companies. Public stewardship over the spaces that fulfill specific civic needs is a crucial aspect of our democracy. By better defining and measuring access to the spaces that claim to serve this function, citizens may decide to assert more public control over digital town squares. There might be a movement to encourage smaller public spaces or to designate online content moderation as a public function, best filled by public servants.

Striking the right balance for access to public spaces—somewhere between unfettered and overly constrained—can be contentious. And for good reason: It is extraordinarily hard to get right. But if privately owned online platforms are going to operate as part of the civic infrastructure that underpins society, we need to ask ourselves: Who is making these decisions? And on whose behalf?

Douglas Yeung is a senior behavioral scientist at the nonprofit, nonpartisan RAND Corporation and a member of the Pardee RAND Graduate School faculty.

This article was originally published by RAND.
Views in this article are the author's own and do not necessarily reflect CGS policy.
