Towards Intersectional Moderation: An Alternative Model of Moderation Built on Care and Power

Published: 04 October 2023

Abstract

Shortcomings of current models of moderation have driven policy makers, scholars, and technologists to speculate about alternative models of content moderation. While alternative models provide hope for the future of online spaces, they can fail without proper scaffolding. Community moderators are routinely confronted with similar issues and have therefore found creative ways to navigate these challenges. Learning more about the decisions these moderators make, the challenges they face, and where they are successful can provide valuable insight into how to ensure alternative moderation models are successful.
In this study, I perform a collaborative ethnography with moderators of r/AskHistorians, a community that uses an alternative moderation model, highlighting the importance of accounting for power in moderation. Drawing from Black feminist theory, I call this "intersectional moderation." I focus on three controversies emblematic of r/AskHistorians' alternative model of moderation: a disagreement over a moderation decision; a collaboration to fight racism on Reddit; and a period of intense turmoil and its impact on policy. Through this evidence I show how volunteer moderators navigated multiple layers of power through care work. To ensure the successful implementation of intersectional moderation, I argue that designers should support decision-making processes and policy makers should account for the impact of the sociotechnical systems in which moderators work.


Cited By

  • (2024) Trans-centered moderation: Trans technology creators and centering transness in platform and community governance. Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, 326-336. DOI: 10.1145/3630106.3658909. Online publication date: 3 June 2024.


    Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 7, Issue CSCW2 (CSCW)
October 2023, 4055 pages
EISSN: 2573-0142
DOI: 10.1145/3626953

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Reddit
    2. collaborative ethnography
    3. volunteer moderation

    Qualifiers

    • Research-article
