TY - JOUR
T1 - Do Platform Migrations Compromise Content Moderation? Evidence from r/The_Donald and r/Incels
AU - Horta Ribeiro, Manoel
AU - Jhaver, Shagun
AU - Zannettou, Savvas
AU - Blackburn, Jeremy
AU - Stringhini, Gianluca
AU - De Cristofaro, Emiliano
AU - West, Robert
N1 - Funding Information:
Manoel Horta Ribeiro is supported by a Facebook Fellowship Award. Jeremy Blackburn is supported by NSF grants CNS-2114411 and IIS-2046590. Gianluca Stringhini is supported by NSF grants CNS-1942610 and CNS-2114407. Emiliano De Cristofaro is supported by the UK’s National Research Centre on Privacy, Harm Reduction, and Adversarial Influence Online (REPHRAIN, UKRI grant: EP/V011189/1). Robert West is partly supported by a grant from the EPFL/UNIL Collaborative Research on Science and Society (CROSS) Program, the Swiss National Science Foundation (grant 200021_185043), the European Union (TAILOR, grant 952215), and gifts from Google, Facebook, and Microsoft.
Publisher Copyright:
© 2021 ACM.
PY - 2021/10/18
Y1 - 2021/10/18
N2 - When toxic online communities on mainstream platforms face moderation measures, such as bans, they may migrate to other platforms with laxer policies or set up their own dedicated websites. Previous work suggests that within mainstream platforms, community-level moderation is effective in mitigating the harm caused by the moderated communities. It is, however, unclear whether these results also hold when considering the broader Web ecosystem. Do toxic communities continue to grow in terms of their user base and activity on the new platforms? Do their members become more toxic and ideologically radicalized? In this paper, we report the results of a large-scale observational study of how problematic online communities progress following community-level moderation measures. We analyze data from r/The_Donald and r/Incels, two communities that were banned from Reddit and subsequently migrated to their own standalone websites. Our results suggest that, in both cases, moderation measures significantly decreased posting activity on the new platform, reducing the number of posts, active users, and newcomers. In spite of that, users in one of the studied communities (r/The_Donald) showed increases in signals associated with toxicity and radicalization, which justifies concerns that the reduction in activity may come at the expense of a more toxic and radical community. Overall, our results paint a nuanced portrait of the consequences of community-level moderation and can inform their design and deployment.
KW - content moderation
KW - deplatforming
KW - fringe online communities
KW - online communities
KW - online radicalization
KW - social networks
UR - http://www.scopus.com/inward/record.url?scp=85117961053&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85117961053&partnerID=8YFLogxK
U2 - 10.1145/3476057
DO - 10.1145/3476057
M3 - Article
AN - SCOPUS:85117961053
SN - 2573-0142
VL - 5
JO - Proceedings of the ACM on Human-Computer Interaction
JF - Proceedings of the ACM on Human-Computer Interaction
IS - CSCW2
M1 - 316
ER -