Ruud Schilders, admin of mastodon.world, had about 100 people on the server before the Twitter acquisition in 2022. New signups saw the number of active users peak at around 120,000 in November, Schilders says. But with all of that new traffic came extra hate speech and obscene content. “I’ve learned of things I didn’t want to know,” Schilders says. By early February, the active user count had dropped to around 49,000—still many more than the server had before.
Schilders has recruited content moderators and has funding from donations in the bank to cover monthly server costs. But he says running the server now comes with added pressure. “You’re kind of a public person suddenly,” he says. He plans to separate his personal account from mastodon.world so he can post more freely without being connected to his admin work.
Part of Mastodon’s appeal is that users have more power to block content they see than on conventional social networks. Server admins make rules for their own instances, and they can boot users who post hate speech, porn, and spam or troll other users. People can block entire servers. But the decentralized nature of Mastodon makes each instance its own network, placing legal responsibility on the people running it.
Admins must adhere to laws governing internet service providers wherever their servers can be accessed. In the US, these include the Digital Millennium Copyright Act, which puts the onus on platforms to register themselves and take down copyrighted material, and the Children’s Online Privacy Protection Rule, which covers the handling of children’s data. In Europe, there’s the GDPR privacy law and the new Digital Services Act.
The legal burden on Mastodon server admins could soon increase. The US Supreme Court will consider cases that center on Section 230 of the Communications Decency Act. The provision has allowed tech companies to flourish by absolving them of responsibility for much of what their users post on their platforms. If the court were to rule in a way that altered, weakened, or eliminated the provision, tech platforms and smaller entities like Mastodon admins could be on the hook for what their users post.
“Someone running a Mastodon instance could have dramatically more liability than they did,” says Corey Silverstein, an attorney who specializes in internet law. “It’s a huge issue.”
Mastodon was just one of several platforms that garnered new attention as some Twitter users looked for alternatives. There’s also Post.news, Hive Social, and Spill. Casey Fiesler, an associate professor of information science at the University of Colorado Boulder, says many new social platforms experience fleeting popularity, spurred by a catalyst like the Twitter saga. Some disappear, but others gradually grow into larger networks.
“They’re very difficult to get off the ground because what makes social media work is that’s where your friends are,” Fiesler says. “This is one of the reasons why platform migrations tend to happen more gradually. As more people you know join a platform, you’re more likely to join.”