Research Article
Open Access

"What are you doing, TikTok?" : How Marginalized Social Media Users Perceive, Theorize, and "Prove" Shadowbanning

Published: 26 April 2024

Abstract

Shadowbanning is a unique content moderation strategy that has recently received media attention for the ways it impacts marginalized social media users and communities. Social media companies often deny engaging in this content moderation practice despite users' experiences online. In this paper, we use qualitative surveys and interviews to understand how marginalized social media users make sense of shadowbanning, develop folk theories about shadowbanning, and attempt to prove its occurrence. We find that marginalized social media users collaboratively develop and test algorithmic folk theories to make sense of their unclear experiences with shadowbanning. Participants reported direct consequences of shadowbanning, including frustration, decreased engagement, the inability to post specific content, and potential financial implications. They reported holding negative perceptions of platforms where they experienced shadowbanning, sometimes attributing their shadowbans to platforms' deliberate suppression of marginalized users' content. Some marginalized social media users acted on their theories by adapting their social media behavior to avoid potential shadowbans. We contribute collaborative algorithm investigation: a new concept describing social media users' strategies of collaboratively developing and testing algorithmic folk theories. Finally, we present design and policy recommendations for addressing shadowbanning and its potential harms.

References

[1]
Iretiolu Akinrinade and Joan Mukogosi. 2021. Strategic Knowledge. https://points.datasociety.net/strategic-knowledge-6bbddb3f0259
[2]
Paula Akpan. 2020. What Is Shadow Banning & Why Does It Deserve Our Attention? https://www.bustle.com/life/what-is-shadow-banning-how-does-it-work
[3]
Travis M. Andrews. 2020. Tinder, TikTok and more: Online activists are finding creative new ways to say Black Lives Matter. Washington Post (June 2020). https://www.washingtonpost.com/technology/2020/06/12/tiktok-tinder-twitter-bts-black-lives-matter/
[4]
Carolina Are. 2019a. Instagram Apologises To Pole Dancers About The Shadowban. https://bloggeronpole.com/2019/07/instagram-apologises-to-pole-dancers-about-the-shadowban/
[5]
Caroline Are. 2019b. Instagram Denies Censorship of Pole Dancers and Sex Workers. https://bloggeronpole.com/2019/07/instagram-denies-censorship-of-pole-dancers-and-sex-workers/
[6]
Carolina Are. 2020. How Instagram's algorithm is censoring women and vulnerable users but helping online abusers. Feminist Media Studies, Vol. 20, 5 (July 2020), 741--744. https://doi.org/10.1080/14680777.2020.1783805 Publisher: Routledge _eprint: https://doi.org/10.1080/14680777.2020.1783805.
[7]
Carolina Are. 2021a. Interview with social media platform Lips. http://bloggeronpole.com/2021/10/interview-with-social-media-platform-lips/
[8]
Carolina Are. 2021b. The Shadowban Cycle: an autoethnography of pole dancing, nudity and censorship on Instagram. Feminist Media Studies, Vol. 0, 0 (May 2021), 1--18. https://doi.org/10.1080/14680777.2021.1928259 Publisher: Routledge _eprint: https://doi.org/10.1080/14680777.2021.1928259.
[9]
Carolina Are. 2022. An autoethnography of automated powerlessness: lacking platform affordances in Instagram and TikTok account deletions. Media, Culture & Society (Dec. 2022), 01634437221140531. https://doi.org/10.1177/01634437221140531 Publisher: SAGE Publications Ltd.
[10]
Carolina Are and Susanna Paasonen. 2021. Sex in the shadows of celebrity. Porn Studies, Vol. 0, 0 (Sept. 2021), 1--9. https://doi.org/10.1080/23268743.2021.1974311 Publisher: Routledge _eprint: https://doi.org/10.1080/23268743.2021.1974311.
[11]
Ellissa Bain. 2020. TikTok: How to get ?un-shadowbanned' -- temporary ban can be reversed! https://www.hitc.com/en-gb/2020/07/01/tiktok-how-to-get-un-shadowbanned-temporary-ban-can-be-reversed/ Section: Trending.
[12]
Anna Veronica Banchik. 2021. Disappearing acts: Content moderation and emergent practices to preserve at-risk human rights--related content. New Media & Society, Vol. 23, 6 (June 2021), 1527--1544. https://doi.org/10.1177/1461444820912724 Publisher: SAGE Publications.
[13]
Paul M. Barrett and J. Grant Sims. 2021. False Accusation: The Unfounded Claim that Social Media Companies Censor Conservatives. Technical Report. NYU Stern Center for Business and Human Rights. https://static1.squarespace.com/static/5b6df958f8370af3217d4178/t/6011e68dec2c7013d3caf3cb/1611785871154/NYUFalseAccusationreport_FINAL.pdf
[14]
Sophie Bishop. 2019. Managing visibility on YouTube through algorithmic gossip. New Media & Society, Vol. 21, 11--12 (Nov. 2019), 2589--2606. https://doi.org/10.1177/1461444819854731 Publisher: SAGE Publications.
[15]
Danielle Blunt and Ariel Wolf. 2020. Erased: The Impact of FOSTA-SESTA & the Removal of Backpage. Technical Report. Hacking//Hustling.
[16]
Danielle Blunt, Ariel Wolf, Emily Coombes, and Shanelle Mullin. 2020. Posting Into the Void: Studying the Impact of Shadowbanning on Sex Workers and Activists. https://hackinghustling.org/posting-into-the-void-content-moderation/
[17]
Elena Botella. 2019. TikTok Admits It Suppressed Videos by Disabled, Queer, and Fat Creators. Slate (Dec. 2019). https://slate.com/technology/2019/12/tiktok-disabled-users-videos-suppressed.html
[18]
Annie Brown. 2021. Understanding The Technical And Societal Relationship Between Shadowbanning And Algorithmic Bias. https://www.forbes.com/sites/anniebrown/2021/10/27/understanding-the-technical-and-societal-relationship-between-shadowbanning-and-algorithmic-bias/ Section: AI.
[19]
Taina Bucher. 2017. The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, Vol. 20, 1 (Jan. 2017), 30--44. https://doi.org/10.1080/1369118X.2016.1154086
[20]
Cody Buntain, Richard Bonneau, Jonathan Nagler, and Joshua A. Tucker. 2021. YouTube Recommendations and Effects on Sharing Across Online Social Platforms. Proceedings of the ACM on Human-Computer Interaction, Vol. 5, CSCW1 (April 2021), 1--26. https://doi.org/10.1145/3449085
[21]
Robyn Caplan and Tarleton Gillespie. 2020. Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy. Social Media Society, Vol. 6, 2 (April 2020), 2056305120936636. https://doi.org/10.1177/2056305120936636 Publisher: SAGE Publications Ltd.
[22]
Cardi B. 2021. Twitter release me from twitter shadowban!!! I won't talk about suckin and fuckin nomore. I promise that was 2020 behavior! https://twitter.com/iamcardib/status/1345977698368200704
[23]
Twitter Help Center. [n.d.]. Debunking Twitter myths. https://help.twitter.com/en/using-twitter/debunking-twitter-myths
[24]
Stevie Chancellor, Yannis Kalantidis, Jessica A. Pater, Munmun De Choudhury, and David A. Shamma. 2017. Multimodal Classification of Moderated Online Pro-Eating Disorder Content. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, New York, NY, USA, 3213--3226. https://doi.org/10.1145/3025453.3025985 event-place: Denver, Colorado, USA.
[25]
Stevie Chancellor, Zhiyuan (Jerry) Lin, and Munmun De Choudhury. 2016. "This Post Will Just Get Taken Down": Characterizing Removed Pro-Eating Disorder Social Media Content. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 1157--1162. https://doi.org/10.1145/2858036.2858248 event-place: San Jose, California, USA.
[26]
CNBC Television. 2018. Twitter CEOJack Dorsey Testifies - Sept. 5, 2018. https://www.youtube.com/watch?v=41P9cbaWiBc
[27]
Jennifer Cobbe. 2020. Algorithmic Censorship by Social Platforms: Power and Resistance. Philosophy & Technology (Oct. 2020). https://doi.org/10.1007/s13347-020-00429-0
[28]
Samantha Cole. 2018. Where Did the Concept of 'Shadow Banning' Come From? https://www.vice.com/en/article/a3q744/where-did-shadow-banning-come-from-trump-republicans-shadowbanned
[29]
Charlotte Colombo. 2021. TikTok has apologized for a 'significant error' after a video that suggested racial bias in its algorithm went viral. https://www.insider.com/tiktok-racism-algorithm-apology-creator-marketplace-ziggy-tyler-2021--7
[30]
Jesselyn Cook. 2019a. Instagram's Shadow Ban On Vaguely ?Inappropriate' Content Is Plainly Sexist. https://www.huffpost.com/entry/instagram-shadow-ban-sexist_n_5cc72935e4b0537911491a4f Section: Women.
[31]
Jesselyn Cook. 2019b. Women Are Pretending To Be Men On Instagram To Avoid Sexist Censorship. https://www.huffpost.com/entry/women-are-pretending-to-be-men-on-instagram-to-avoid-sexist-censorship_n_5dd30f2be4b0263fbc99421e Section: Tech.
[32]
Jesselyn Cook. 2020. Instagram's CEO Says Shadow Banning ?Is Not A Thing. ' That's Not True. https://www.huffpost.com/entry/instagram-shadow-banning-is-real_n_5e555175c5b63b9c9ce434b0 Section: Politics.
[33]
Juliet M. Corbin and Anselm L. Strauss. 2008. Basics of qualitative research techniques and procedures for developing grounded theory 3rd ed. ed.). Sage Publications, Inc., Los Angeles, Calif.
[34]
Kelley Cotter. 2021. ?Shadowbanning is not a thing": black box gaslighting and the power to independently know and credibly critique algorithms. Information, Communication & Society, Vol. 0, 0 (Oct. 2021), 1--18. https://doi.org/10.1080/1369118X.2021.1994624 Publisher: Routledge _eprint: https://doi.org/10.1080/1369118X.2021.1994624.
[35]
Antigone Davis and Amit Bhattacharyya. 2021. How Meta Addresses Bullying and Harassment. https://about.fb.com/news/2021/11/how-meta-addresses-bullying-harassment/
[36]
Britt Dawson. 2020. Instagram's problem with sex workers is nothing new. https://www.dazeddigital.com/science-tech/article/51515/1/instagram-problem-with-sex-workers-is-nothing-new-censorship Section: Science & Tech.
[37]
Michael Ann DeVito. 2022. How Transfeminine TikTok Creators Navigate the Algorithmic Trap of Visibility Via Folk Theorization. Proceedings of the ACM on Human-Computer Interaction, Vol. 6, CSCW2 (Nov. 2022), 1--31. https://doi.org/10.1145/3555105
[38]
Michael Anne DeVito, Jeremy Birnholtz, Jeffery T. Hancock, Megan French, and Sunny Liu. 2018a. How People Form Folk Theories of Social Media Feeds and What it Means for How We Study Self-Presentation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, Montreal QC Canada, 1--12. https://doi.org/10.1145/3173574.3173694
[39]
Michael Anne DeVito, Darren Gergle, and Jeremy Birnholtz. 2017. "Algorithms ruin everything": #RIPTwitter, Folk Theories, and Resistance to Algorithmic Change in Social Media. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, Denver Colorado USA, 3163--3174. https://doi.org/10.1145/3025453.3025659
[40]
Michael Anne DeVito, Jeffrey T. Hancock, Megan French, Jeremy Birnholtz, Judd Antin, Karrie Karahalios, Stephanie Tong, and Irina Shklovski. 2018b. The Algorithm and the User: How Can HCI Use Lay Understandings of Algorithmic Systems?. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, Montreal QC Canada, 1--6. https://doi.org/10.1145/3170427.3186320
[41]
Don't Delete Art. 2021. Resource Center. https://dontdelete.art/resource-center/
[42]
Brooke Erin Duffy and Colten Meisner. 2022. Platform governance at the margins: Social media creators' experiences with algorithmic (in)visibility. Media, Culture & Society (July 2022), 01634437221111923. https://doi.org/10.1177/01634437221111923 Publisher: SAGE Publications Ltd.
[43]
Motahhare Eslami, Karrie Karahalios, Christian Sandvig, Kristen Vaccaro, Aimee Rickman, Kevin Hamilton, and Alex Kirlik. 2016. First I "like" it, then I hide it: Folk Theories of Social Feeds. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, San Jose California USA, 2371--2382. https://doi.org/10.1145/2858036.2858494
[44]
Motahhare Eslami, Aimee Rickman, Kristen Vaccaro, Amirhossein Aleyasen, Andy Vuong, Karrie Karahalios, Kevin Hamilton, and Christian Sandvig. 2015. "I always assumed that I wasn't really that close to [her]": Reasoning about Invisible Algorithms in News Feeds. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, Seoul Republic of Korea, 153--162. https://doi.org/10.1145/2702123.2702556
[45]
Nancy Ettlinger. 2018. Algorithmic affordances for productive resistance. Big Data & Society, Vol. 5, 1 (Jan. 2018), 2053951718771399. https://doi.org/10.1177/2053951718771399 Publisher: SAGE Publications Ltd.
[46]
Brian Feldman. 2018. Twitter Is Not ?Shadow Banning' Republicans. https://nymag.com/intelligencer/2018/07/twitter-is-not-shadow-banning-republicans.html
[47]
Jessica L. Feuston, Alex S. Taylor, and Anne Marie Piper. 2020. Conformity of Eating Disorders through Content Moderation. Proceedings of the ACM on Human-Computer Interaction, Vol. 4, CSCW1 (May 2020), 40:1--40:28. https://doi.org/10.1145/3392845
[48]
Juniper Fitzgerald and Jessie Sage. 2019. Shadowbans: Secret Policies Depriving Sex Workers of Income and Community. https://titsandsass.com/shadowbans-secret-policies-depriving-sex-workers-of-income-and-community/
[49]
Instagram for Business. 2017. We understand users have experienced issues with our hastag search that c aused posts to not be surfaced. We are continuously working on improvements to our system with the resources available. https://www.facebook.com/instagramforbusiness/posts/1046447858817451
[50]
Caroline Forsey. 2021. Instagram Shadowban Is Real: How to Test for & Prevent It. https://blog.hubspot.com/marketing/instagram-shadowban
[51]
Chris Fox. 2020. TikTok admits restricting some LGBT hashtags. BBC News (Sept. 2020). https://www.bbc.com/news/technology-54102575
[52]
Vijaya Gadde and Kayvon Beykpour. 2018. Setting the record straight on shadow banning. https://blog.twitter.com/en_us/topics/company/2018/Setting-the-record-straight-on-shadow-banning
[53]
Meira Gebel. 2020. Black Creators Say TikTok Still Secretly Hides Their Content. https://www.digitaltrends.com/social-media/black-creators-claim-tiktok-still-secretly-blocking-content/
[54]
Susan A. Gelman and Cristine H. Legare. 2011. Concepts and Folk Theories. Annual Review of Anthropology, Vol. 40 (2011), 379--398. https://www.jstor.org/stable/41287739 Publisher: Annual Reviews.
[55]
Ysabel Gerrard. 2018. Beyond the hashtag: Circumventing content moderation on social media. New Media & Society, Vol. 20, 12 (Dec. 2018), 4492--4511. https://doi.org/10.1177/1461444818776611
[56]
Ysabel Gerrard. 2020. Social media content moderation: six opportunities for feminist intervention. Feminist Media Studies, Vol. 20, 5 (July 2020), 748--751. https://doi.org/10.1080/14680777.2020.1783807 Publisher: Routledge _eprint: https://doi.org/10.1080/14680777.2020.1783807.
[57]
Ysabel Gerrard and Helen Thornham. 2020. Content moderation: Social media's sexist assemblages. New Media & Society, Vol. 22, 7 (July 2020), 1266--1286. https://doi.org/10.1177/1461444820912540 Publisher: SAGE Publications.
[58]
Tarleton Gillespie. 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press, New Haven.
[59]
Tarleton Gillespie. 2020. Content moderation, AI, and the question of scale:. Big Data & Society (Aug. 2020). https://doi.org/10.1177/2053951720943234 Publisher: SAGE PublicationsSage UK: London, England.
[60]
Tarleton Gillespie. 2022. Do Not Recommend? Reduction as a Form of Content Moderation. Social Media Society, Vol. 8, 3 (July 2022), 20563051221117552. https://doi.org/10.1177/20563051221117552 Publisher: SAGE Publications Ltd.
[61]
Kayla Gogarty and Spencer Silva. 2020. A new study finds that Facebook is not censoring conservatives despite their repeated attacks. https://www.mediamatters.org/facebook/new-study-finds-facebook-not-censoring-conservatives-despite-their-repeated-attacks
[62]
Cristos Goodrow. 2021. On YouTube's recommendation system. https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/
[63]
Robert Gorwa, Reuben Binns, and Christian Katzenbach. 2020. Algorithmic content moderation: Technical and political challenges in the automation of platform governance. Big Data & Society, Vol. 7, 1 (Jan. 2020), 2053951719897945. https://doi.org/10.1177/2053951719897945 Publisher: SAGE Publications Ltd.
[64]
Julia Angwin Grassegger, Hannes. 2017. Facebook's Secret Censorship Rules Protect White Men From Hate Speech But Not Black Children. https://www.propublica.org/article/facebook-hate-speech-censorship-internal-documents-algorithms?token=dQhE1DjH5S-s5oiZGbyX7NoNfXm5VycL
[65]
Keith Grint and Steve Woolgar. 1997. The Machine at Work: Technology, Work and Organization 1st edition ed.). Polity, Cambridge, UK ; Malden, MA : Blackwell Publishers.
[66]
Oliver L. Haimson, Justin Buss, Zu Weinger, Denny L. Starks, Dykee Gorrell, and Briar Sweetbriar Baron. 2020. Trans Time: Safety, Privacy, and Content Warnings on a Transgender-Specific Social Media Site. Proceedings of the ACM on Human-Computer Interaction, Vol. 4, CSCW2 (Oct. 2020), 124:1--124:27. https://doi.org/10.1145/3415195
[67]
Oliver L. Haimson, Avery Dame-Griff, Elias Capello, and Zahari Richter. 2019. Tumblr was a trans technology: the meaning, importance, history, and future of trans technologies. Feminist Media Studies, Vol. 21, 3 (2019), 345--361. https://doi.org/10.1080/14680777.2019.1678505 Publisher: Routledge _eprint: https://doi.org/10.1080/14680777.2019.1678505.
[68]
Oliver L. Haimson, Daniel Delmonaco, Peipei Nie, and Andrea Wegner. 2021a. Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas. Proceedings of the ACM on Human-Computer Interaction, Vol. 5, CSCW2 (Oct. 2021), 466:1--466:35. https://doi.org/10.1145/3479610
[69]
Oliver L. Haimson, Tianxiao Liu, Ben Zefeng Zhang, and Shanley Corvite. 2021b. The Online Authenticity Paradox: What Being "Authentic" on Social Media Means, and Barriers to Achieving It. Proceedings of the ACM on Human-Computer Interaction, Vol. 5, CSCW2 (Oct. 2021), 423:1--423:18. https://doi.org/10.1145/3479567
[70]
Gareth Harris. 2021. Censored? Shadowbanned? Deleted? Here is a guide for artists on social media. https://www.theartnewspaper.com/news/don-t-delete-art-social-media-censorship-guide
[71]
Amelie Heldt. 2020. Borderline speech: caught in a free speech limbo? https://policyreview.info/articles/news/borderline-speech-caught-free-speech-limbo/1510
[72]
Alex Hern. 2019. TikTok's local moderation guidelines ban pro-LGBT content. The Guardian (Sept. 2019). https://www.theguardian.com/technology/2019/sep/26/tiktoks-local-moderation-guidelines-ban-pro-lgbt-content
[73]
Monica Horten. 2021. Algorithms Patrolling Content: Where's the Harm? SSRN Scholarly Paper ID 3792097. Social Science Research Network, Rochester, NY. https://doi.org/10.2139/ssrn.3792097
[74]
Instagram. 2021. Why are certain posts on Instagram not appearing in Explore and hashtag pages? textbar Instagram Help Center. https://help.instagram.com/613868662393739
[75]
Instagram Comms. 2021. We know that some people are experiencing issues uploading and viewing stories. This is a widespread global technical issue not related to any particular topic and we're fixing it right now. We'll provide an update as soon as we can. https://twitter.com/InstagramComms/status/1390376354332487681
[76]
Kokil Jaidka, Subhayan Mukerjee, and Yphtach Lelkes. 2023. Silenced on social media: the gatekeeping functions of shadowbans in the American Twitterverse. Journal of Communication (Jan. 2023), jqac050. https://doi.org/10.1093/joc/jqac050
[77]
Cal Jeffrey. 2019. USPTO grants Facebook patent for automated shadow-banning system. https://www.techspot.com/news/80979-uspto-grants-facebook-patent-automated-shadow-banning-system.html
[78]
Shagun Jhaver, Darren Scott Appling, Eric Gilbert, and Amy Bruckman. 2019a. "Did You Suspect the Post Would be Removed?": Understanding User Reactions to Content Removals on Reddit. Proceedings of the ACM on Human-Computer Interaction, Vol. 3, CSCW (Nov. 2019), 192:1--192:33. https://doi.org/10.1145/3359294
[79]
Shagun Jhaver, Amy Bruckman, and Eric Gilbert. 2019b. Does Transparency in Moderation Really Matter?: User Behavior After Content Removal Explanations on Reddit. Proceedings of the ACM on Human-Computer Interaction, Vol. 3, CSCW (Nov. 2019), 1--27. https://doi.org/10.1145/3359252
[80]
Shagun Jhaver, Sucheta Ghoshal, Amy Bruckman, and Eric Gilbert. 2018. Online Harassment and Content Moderation: The Case of Blocklists. ACM Trans. Comput.-Hum. Interact., Vol. 25, 2 (March 2018), 12:1--12:33. https://doi.org/10.1145/3185593
[81]
Jialun Aaron Jiang, Peipei Nie, Jed R. Brubaker, and Casey Fiesler. 2022. A Trade-off-centered Framework of Content Moderation. (June 2022). https://doi.org/10.1145/3534929 arXiv:2206.03450 [cs].
[82]
Nadia Karizat, Dan Delmonaco, Motahhare Eslami, and Nazanin Andalibi. 2021. Algorithmic Folk Theories and Identity: How TikTok Users Co-Produce Knowledge of Identity and Engage in Algorithmic Resistance. Proceedings of the ACM on Human-Computer Interaction, Vol. 5, CSCW2 (Oct. 2021), 305:1--305:44. https://doi.org/10.1145/3476046
[83]
Willett Kempton. 1986. Two Theories of Home Heat Control*. Cognitive Science, Vol. 10, 1 (1986), 75--90. https://doi.org/10.1207/s15516709cog1001_3 _eprint: https://onlinelibrary.wiley.com/doi/pdf/10.1207/s15516709cog1001_3.
[84]
David Klepper, Barbara Ortutay, and Matt O'Brien. 2022. EXPLAINER: How Elon Musk is changing what you see on Twitter. https://apnews.com/article/elon-musk-twitter-inc-technology-europe-business-1b3d4266c5acdab47fc1c95fe8026590
[85]
Chris Köver. 2019. Discrimination - TikTok curbed reach for people with disabilities. https://netzpolitik.org/2019/discrimination-tiktok-curbed-reach-for-people-with-disabilities/ Library Catalog: netzpolitik.org.
[86]
Erwan Le Merrer, Benoît Morgan, and Gilles Trédan. 2021. Setting the Record Straighter on Shadow Banning. In IEEE INFOCOM 2021 - IEEE Conference on Computer Communications. 1--10. https://doi.org/10.1109/INFOCOM42981.2021.9488792 ISSN: 2641--9874.
[87]
Amanda Lenhart and Kellie Owens. 2021. The Unseen Teen. https://datasociety.net/library/the-unseen-teen/ Publisher: Data & Society Research Institute.
[88]
Jackie Lerm. 2020. I asked @mosseri this question, knowing full well how he was going to respond. There you have it guys. Again. Shadowbanning is not a thing. #SMSpouses https://t.co/LXGzGDjpZH. https://twitter.com/jackielerm/status/1231122961379340289
[89]
Thomas WL MacDonald. 2021. ?How it actually works": Algorithmic lore videos as market devices. New Media & Society (June 2021), 14614448211021404. https://doi.org/10.1177/14614448211021404 Publisher: SAGE Publications.
[90]
Brandeis Marshall. 2021. Algorithmic misogynoir in content moderation practice. Technical Report. Heinrich-Böll-Stiftung. 17 pages.
[91]
Ariadna Matamoros-Fernández. 2017. Platformed racism: the mediation and circulation of an Australian race-based controversy on Twitter, Facebook and YouTube. Information, Communication & Society, Vol. 20, 6 (June 2017), 930--946. https://doi.org/10.1080/1369118X.2017.1293130 Publisher: Routledge _eprint: https://doi.org/10.1080/1369118X.2017.1293130.
[92]
Stacey McLachlan. 2021. Experiment: I Tried to Get Shadowbanned on Instagram. https://blog.hootsuite.com/experiment-i-tried-to-get-shadowbanned-on-instagram/
[93]
Bryan Menegus. 2019. Facebook Patents Shadowbanning. https://gizmodo.com/facebook-patents-shadowbanning-1836411346
[94]
Anna Merlan. 2020. How Shadowbanning Went from a Conspiracy Theory to a Selling Point. https://www.vice.com/en/article/v7gq4x/how-shadowbanning-went-from-a-conspiracy-theory-to-a-selling-point-v27n3
[95]
Danaë Metaxa, Joon Sung Park, James A. Landay, and Jeff Hancock. 2019. Search Media and Elections: A Longitudinal Investigation of Political Search Results. Proceedings of the ACM on Human-Computer Interaction, Vol. 3, CSCW (Nov. 2019), 129:1--129:17. https://doi.org/10.1145/3359231
[96]
Callie Middlebrook. 2020. The Grey Area: Instagram, Shadowbanning, and the Erasure of Marginalized Communities. SSRN Scholarly Paper ID 3539721. Social Science Research Network, Rochester, NY. https://doi.org/10.2139/ssrn.3539721
[97]
Rachel E. Moran, Izzi Grasso, and Kolina Koltai. 2022. Folk Theories of Avoiding Content Moderation: How Vaccine-Opposed Influencers Amplify Vaccine Opposition on Instagram. Social Media Society, Vol. 8, 4 (Oct. 2022), 20563051221144252. https://doi.org/10.1177/20563051221144252 Publisher: SAGE Publications Ltd.
[98]
Adam Mosseri. 2021. Shedding More Light on How Instagram Works. https://about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works
[99]
Adam Mosseri. 2022. Account Status Update. https://www.instagram.com/reel/Cl34K-BAm3P/?utm_source=ig_embed&ig_rid=bdbd2f52--168b-4f5f-809d-d722cc18e1b1
[100]
Elon Musk. 2022. Twitter Account Status/Shadowban Update. https://twitter.com/elonmusk/status/1601042125130371072?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1601042125130371072%7Ctwgr%5Ecd9c40c1d61926f0930c77752976acf1c1bf863c%7Ctwcon%5Es1_&ref_url=https%3A%2F%2Fhypebeast.com%2F2022%2F12%2Felon-musk-twitter-shadowban-account-update
[101]
Sarah Myers West. 2018. Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms. New Media & Society, Vol. 20, 11 (Nov. 2018), 4366--4383. https://doi.org/10.1177/1461444818773059
[102]
Casey Newton. 2019. The real bias on social networks isn't against conservatives. https://www.theverge.com/interface/2019/4/11/18305407/social-network-conservative-bias-twitter-facebook-ted-cruz
[103]
Gabriel Nicholas. 2022. Shedding Light on Shadowbanning. Technical Report. Center for Democracy & Technology. 52 pages.
[104]
Cathy O'Neil. 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy 1st edition ed.). Crown, New York.
[105]
onlinecensorship.org. [n.d.]. onlinecensorship.org - Submit Your Report. https://onlinecensorship.org/takedowns/new
[106]
Vanessa Pappas and Kudzi Chikumbu. 2020a. A message to our Black community. https://newsroom.tiktok.com/en-us/a-message-to-our-black-community?from=from_parent_docs
[107]
Vanessa Pappas and Kudzi Chikumbu. 2020b. Progress Report: How we're supporting Black communities and promoting diversity and inclusion. https://newsroom.tiktok.com/en-us/progress-report-how-were-supporting-black-communities-and-promoting-diversity-and-inclusion
[108]
Emilee Rader and Rebecca Gray. 2015. Understanding User Beliefs About Algorithmic Curation in the Facebook News Feed. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, Seoul Republic of Korea, 173--182. https://doi.org/10.1145/2702123.2702174
[109]
Jess Rauchberg. 2022. #Shadowbanned: Queer, trans, and disabled creator responses to algorithmic oppression on TikTok. In LGBTQ Digital Cultures: A Global Perspective. Routledge. Google-Books-ID: fYFZEAAAQBAJ.
[110]
Andreas Rekdal. 2021. What Is a Shadowban and Why Does It Matter? textbar Built In. https://builtin.com/marketing/shadowban
[111]
Adi Robertson. 2019. TikTok prevented disabled users' videos from showing up in feeds. https://www.theverge.com/2019/12/2/20991843/tiktok-bytedance-platform-disabled-autism-lgbt-fat-user-algorithm-reach-limit
[112]
Jeremiah Rodriguez. 2019. Instagram apologizes to pole dancers after hiding their posts. https://www.ctvnews.ca/sci-tech/instagram-apologizes-to-pole-dancers-after-hiding-their-posts-1.4537820 Section: Sci-Tech.
[113]
Guy Rosen. 2019. Remove, Reduce, Inform: New Steps to Manage Problematic Content. https://about.fb.com/news/2019/04/remove-reduce-inform-new-steps/
[114]
Riley Runnells. 2020. TikTok Denies Shadow Banning LGBTQ Hashtags. https://www.papermag.com/tiktok-lgbtq-shadow-banning-2647646779.html Section: LGBTQ.
[115]
Koustuv Saha, Sindhu Kiranmai Ernala, Sarmistha Dutta, Eva Sharma, and Munmun De Choudhury. 2020. Understanding Moderation in Online Mental Health Communities. In International Conference on Human-Computer Interaction. 20.
[116]
Salty. 2020. Shadowbanning is a Thing - and It's Hurting Trans and Disabled Advocates. https://saltyworld.net/shadowbanning-is-a-thing-and-its-hurting-trans-and-disabled-advocates/ Library Catalog: saltyworld.net Section: Algorithmic Bias.
[117]
Salty. 2021. Exclusive: An Investigation into Algorithmic Bias in Content Policing on Instagram (PDF download) textbar Salty. https://saltyworld.net/algorithmicbiasreport-2/ Section: #MeToo.
[118]
Joseph Seering, Tony Wang, Jina Yoon, and Geoff Kaufman. 2019. Moderator engagement and community development in the age of algorithms. New Media & Society (Jan. 2019), 1461444818821316. https://doi.org/10.1177/1461444818821316
[119]
Hong Shen, Alicia DeVos, Motahhare Eslami, and Kenneth Holstein. 2021. Everyday Algorithm Auditing: Understanding the Power of Everyday Users in Surfacing Harmful Algorithmic Behaviors. Proceedings of the ACM on Human-Computer Interaction, Vol. 5, CSCW2 (Oct. 2021), 433:1--433:29. https://doi.org/10.1145/3479577
[120]
Ellen Simpson, Andrew Hamann, and Bryan Semaan. 2022. How to Tame: LGBTQ Users' Domestication of TikTok. Proceedings of the ACM on Human-Computer Interaction, Vol. 6, GROUP (Jan. 2022), 22:1--22:27. https://doi.org/10.1145/3492841
[121]
Ellen Simpson and Bryan Semaan. 2021. For You, or For"You"?: Everyday LGBTQ Encounters with TikTok. Proceedings of the ACM on Human-Computer Interaction, Vol. 4, CSCW3 (Jan. 2021), 1--34. https://doi.org/10.1145/3432951
[122]
Shakira Smith, Claire Fitzsimmons, and Oliver L. Haimson. 2021. Exclusive Report: Censorship of Marginalized Communities on Instagram, 2021) textbar Salty. https://saltyworld.net/product/exclusive-report-censorship-of-marginalized-communities-on-instagram-2021-pdf-download/
[123]
Sanna Spi?ák, Elina Pirjatanniemi, Tommi Paalanen, Susanna Paasonen, and Maria Vihlman. 2021. Social Networking Sites' Gag Order: Commercial Content Moderation's Adverse Implications for Fundamental Sexual Rights and Wellbeing. Social Media Society, Vol. 7, 2 (April 2021), 20563051211024962. https://doi.org/10.1177/20563051211024962 Publisher: SAGE Publications Ltd.
[124]
Liam Stack. 2018. What Is a ?Shadow Ban,' and Is Twitter Doing It to Republican Accounts? The New York Times (July 2018). https://www.nytimes.com/2018/07/26/us/politics/twitter-shadowbanning.html
[125]
Lucy Suchman. 2011. Subject objects. Feminist Theory, Vol. 12, 2 (Aug. 2011), 119--145. https://doi.org/10.1177/1464700111404205 00144 Publisher: SAGE Publications.
[126]
Lucy Suchman and Lucy A. Suchman. 2007. Human-Machine Reconfigurations: Plans and Situated Actions. Cambridge University Press.
[127]
Nicolas P. Suzor, Sarah Myers West, Andrew Quodling, and Jillian York. 2019. What Do We Mean When We Talk About Transparency? Toward Meaningful Transparency in Commercial Content Moderation. International Journal of Communication, Vol. 13, 0 (March 2019), 18. https://ijoc.org/index.php/ijoc/article/view/9736 Number: 0.
[128]
Jeanna Sybert. 2021. The demise of #NSFW: Contested platform governance and Tumblr's 2018 adult content ban. New Media & Society (Feb. 2021), 1461444821996715. https://doi.org/10.1177/1461444821996715 Publisher: SAGE Publications.
[129]
Sam Tabahriti. 2022. Mark Zuckerberg says there is no 'shadow banning' on Facebook but admits there are 'millions of mistakes'. https://www.businessinsider.com/mark-zuckerberg-no-shadow-ban-facebook-but-mistakes-are-made-2022--8
[130]
Terry Tateossian. 2021. How to Fix Your Instagram Shadowban. https://www.entrepreneur.com/article/377057
[131]
TeamYouTube. 2020a. @Herclueless We don't shadowban channels, but it's possible the video was flagged by our systems as potentially violating guidelines. It may not show up in search, etc. before it's reviewed. Since we have limited workforce due to COVID-19, reviews are taking longer: https://t.co/f25cOgmwRV. https://twitter.com/TeamYouTube/status/1319372516398452737
[132]
TeamYouTube. 2020b. @IslaDrummond Thanks for reaching out -- YouTube doesn't shadowban accounts. If you're referring to live chats not working for owners and moderators, we've seen similar reports and are working on a fix. We'll reach back out once we have more info to share. https://twitter.com/TeamYouTube/status/1340466309398720520
[133]
TeamYouTube. 2020c. @Simptress YouTube doesn't shadowban channels. It's possible the video was flagged by our systems as potentially violative & needs to be reviewed first before it shows up in search, etc. Note that reviews are taking longer since we have limited teams due to COVID-19: https://t.co/f25cOgmwRV. https://twitter.com/TeamYouTube/status/1319378407822589952
[134]
Hibby Thach, Samuel Mayworm, Daniel Delmonaco, and Oliver Haimson. 2022. (In)visible moderation: A digital ethnography of marginalized users and content moderation on Twitch and Reddit. New Media & Society (July 2022), 146144482211098. https://doi.org/10.1177/14614448221109804
[135]
TikTok Newsroom. 2020. How TikTok recommends videos #ForYou. https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you
[136]
Benjamin Toff and Rasmus Kleis Nielsen. 2018. "I Just Google It": Folk Theories of Distributed Discovery. Journal of Communication, Vol. 68, 3 (June 2018), 636--657. https://doi.org/10.1093/joc/jqy009
[137]
Twitter. 2018. People are asking us if we shadow ban. We don't. Read more to get all the facts. https://cards.twitter.com/cards/gsby/60efb. https://twitter.com/Twitter/status/1022658436704731136
[138]
Kristen Vaccaro, Dylan Huang, Motahhare Eslami, Christian Sandvig, Kevin Hamilton, and Karrie Karahalios. 2018. The Illusion of Control: Placebo Effects of Control Settings. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). Association for Computing Machinery, New York, NY, USA, 1--13. https://doi.org/10.1145/3173574.3173590
[139]
Kristen Vaccaro, Christian Sandvig, and Karrie Karahalios. 2020. “At the End of the Day Facebook Does What It Wants”: How Users Experience Contesting Algorithmic Content Moderation. Proc. ACM Hum.-Comput. Interact, Vol. 4, CSCW2 (2020), 22.
[140]
Julian Van Horne. 2020. Shadowbanning is a Thing - and It's Hurting Trans and Disabled Advocates | Salty. https://www.saltysweethearts.com/shadowbanning-is-a-thing-and-its-hurting-trans-and-disabled-advocates/
[141]
Julia Velkova and Anne Kaun. 2019. Algorithmic resistance: media practices and the politics of repair. Information, Communication & Society (Aug. 2019), 1--18. https://doi.org/10.1080/1369118X.2019.1657162
[142]
waxpancake. 2009. What was the first website to hide troll's activity to everyone but the troll himself? https://ask.metafilter.com/117775/What-was-the-first-website-to-hide-trolls-activity-to-everyone-but-the-troll-himself
[143]
Riley Weeden. 2020. From shadowbanned to center stage: a student-led social media platform for those the algorithms leave behind. https://uofsdmedia.com/from-shadowbanned-to-center-stage-a-student-led-social-media-platform-for-those-the-algorithms-leave-behind/
[144]
Lucas Wright. 2022. Automated Platform Governance Through Visibility and Scale: On the Transformational Power of AutoModerator. Social Media + Society, Vol. 8, 1 (Jan. 2022). https://doi.org/10.1177/20563051221077020
[145]
Mark Zuckerberg. 2018. A Blueprint for Content Governance and Enforcement. https://www.facebook.com/notes/751449002072082/

      Information
      Published In

      Proceedings of the ACM on Human-Computer Interaction  Volume 8, Issue CSCW1
      CSCW
      April 2024
      6294 pages
      EISSN:2573-0142
      DOI:10.1145/3661497
      This work is licensed under a Creative Commons Attribution International 4.0 License.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 26 April 2024
      Published in PACMHCI Volume 8, Issue CSCW1

      Author Tags

      1. algorithmic folk theories
      2. collaborative algorithm investigation
      3. content moderation
      4. marginalization
      5. shadowbanning
      6. social media

      Qualifiers

      • Research-article

      Article Metrics

      • Downloads (Last 12 months)696
      • Downloads (Last 6 weeks)181
      Reflects downloads up to 01 Nov 2024

      Cited By

      • (2024) Exploring (Anti-)Radicalism on TikTok: German Islamic Content Creators between Advocacy and Activism. Religions 15(10), 1172. https://doi.org/10.3390/rel15101172. Online publication date: 26-Sep-2024
      • (2024) Information and intersectionality in the digital millennium. Education for Information, 1-13. https://doi.org/10.3233/EFI-240088. Online publication date: 12-Sep-2024
      • (2024) From Microaggressions to Viral Trends: A Qualitative Examination of Black Emerging Adults' Experiences With Online Racial Discrimination. Emerging Adulthood. https://doi.org/10.1177/21676968241298023. Online publication date: 1-Nov-2024
      • (2024) Algorithmic camouflage: Exploring the shadowbans imposed by algorithms to moderate the content of Chinese gay men. Big Data & Society 11(4). https://doi.org/10.1177/20539517241296037. Online publication date: 3-Nov-2024
      • (2024) Theorizing Self Visibility on Social Media: A Visibility Objects Lens. ACM Transactions on Computer-Human Interaction 31(3), 1-28. https://doi.org/10.1145/3660337. Online publication date: 30-Aug-2024
      • (2024) Trans-centered moderation: Trans technology creators and centering transness in platform and community governance. Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, 326-336. https://doi.org/10.1145/3630106.3658909. Online publication date: 3-Jun-2024
      • (2024) Shadowbanning. Business & Information Systems Engineering. https://doi.org/10.1007/s12599-024-00905-3. Online publication date: 28-Oct-2024
