If you suspend your transcription on amara.org, please add a timestamp below to indicate how far you progressed! This will help others resume where you left off!
Please do not press “publish” on amara.org to save your progress; use “save draft” instead. Only press “publish” when you're done with quality control.
Decentralised social media services, after spending decades in obscurity, are finally seeing mainstream adoption. While they offer the promise of a communication platform without a corporate overlord, they struggle to be a safe place for many users. In this talk, I explore the ways that decentralised social platforms struggle with moderation and harassment in ways that many previous platforms didn't.
This talk explores what moderation looks like on centralised social services like forums, mailing lists, and corporate websites, then shows the additional challenges that come with managing a decentralised service, particularly protecting users from spam and harassment.
I explore how past platforms tried to address these problems, and the ways those approaches do and do not still apply.
I focus on the ways platforms like Mastodon struggle to protect their users from harassment, though the criticism applies equally to any other server implementing a similarly decentralised protocol with similar affordances. I'll discuss what people are doing now to deal with these problems, focusing on both the technical and the social conventions that have evolved over the years.
Finally, I'll give a broad overview of the possible solutions currently being discussed and explored. These range from secondary services that federate moderation decisions between instance operators to whole new protocols designed to address these concerns directly.