
About
Dr Carolina Are is a digital criminologist researching the intersection of online harms and freedom of speech, with an emphasis on digital rights in the context of social media platform governance, artificial intelligence, digital work and marginalised communities.
Her work has been published in high-impact journals such as New Media & Society, Social Media + Society, Information, Communication & Society and Feminist Media Studies, and has been funded by high-profile research funders and NGOs. It has been cited by and led to engagement with platforms, tech regulators, charities and governments.
As a researcher-activist and content creator, and as a former public relations professional, Carolina is committed to raising awareness of the issues social media users face, from safety to the censorship of marginalised communities, through her research, teaching and public scholarship. Thanks to her combined social media following of over 400,000 and to features in The Guardian, The New York Times, the BBC and Wired, she is a recognised expert in digital culture, content moderation and online safety amongst colleagues and civil society alike.
Prior to joining LSE100, Carolina was an Innovation Fellow at Northumbria University’s Centre for Digital Citizens, an EPSRC-funded project tackling the challenges citizens face in an increasingly digital society. She has been a guest lecturer and public speaker on digital rights, platform governance and online harms at universities and events worldwide, including the University of Amsterdam, Concordia University in Canada, University College Dublin, Durham University, John Cabot University in Rome, the University of Trento, the Feminist Lecture Program, WOW Fest, and more.
Awards
Sexual Freedom Awards 2019 - Activist of the Year
Global Undergraduate Awards 2015 - Overall Winner, Media and Journalism
Research
Expertise Details:
Digital criminology; social media; platform governance; online harms; censorship; digital policy; online nudity; content creators.
Articles:
Are, C., Briggs, P., & Brown, R. (2025). Content creators’ hopes and fears about artificial intelligence. Convergence, 0(0). https://doi.org/10.1177/13548565251372830.
Are, C., & Briggs, P. (2025). Post-social media: de-platformed users’ challenges to belong in ‘corpo-civic’ spaces. Convergence, 0(0). https://doi.org/10.1177/13548565251336051.
Divon, T., Are, C., & Briggs, P. (2025). Platform gaslighting: A user-centric insight into social media corporate communications of content moderation. Platforms & Society, 2. https://doi.org/10.1177/29768624241303109.
Kojah, S. A., Zhang, B. Z., Are, C., Delmonaco, D., & Haimson, O. L. (2025). "Dialing it Back": Shadowbanning, Invisible Digital Labor, and how Marginalized Content Creators Attempt to Mitigate the Impacts of Opaque Platform Governance. Proc. ACM Hum.-Comput. Interact. 9, 1, Article GROUP12 (January 2025), 22 pages. https://doi.org/10.1145/3701191.
Wimmer, M., Skelton, F., Webster, T. C., Ullah, Z., Alexander, K., Spencer, E. J., … Collomosse, J. (2025). FalseWebs Network Policy Paper: Understanding and Addressing Misinformation in Scotland. https://doi.org/10.31234/osf.io/nw73u_v1.
Steeds, M., Clinch, S., Are, C., Brown, G., Dalton, B., Webster, L., Wilson, A., & Woolley, D. (2025). Queer Joy on Social Media: Exploring the Expression and Facilitation of Queer Joy in Online Platforms. Paper presented at the 2025 ACM (Association for Computing Machinery) CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 26/04/25-01/05/25, pp. 1-28. https://doi.org/10.1145/3706598.3713592.
Are, C. (2024a). Researching under the platform gaze: Rethinking the challenges of platform governance research. Platforms & Society, 1. https://doi.org/10.1177/29768624241283912.
Are, C. (2024b). ‘Dysfunctional’ appeals and failures of algorithmic justice in Instagram and TikTok content moderation. Information, Communication & Society, 1–18. https://doi.org/10.1080/1369118X.2024.2396621.
Are, C. (2024d). Flagging as a silencing tool: Exploring the relationship between de-platforming of sex and online abuse on Instagram and TikTok. New Media & Society, 27(6), 3577-3595. https://doi.org/10.1177/14614448241228544.
Are, C., Talbot, C., & Briggs, P. (2024). Social media affordances of LGBTQIA+ expression and community formation. Convergence, 31(4), 1401-1422. https://doi.org/10.1177/13548565241296628.
Stegeman, H. M., Are, C., & Poell, T. (2024). Strategic Invisibility: How Creators Manage the Risks and Constraints of Online Hyper(In)Visibility. Social Media + Society, 10(2). https://doi.org/10.1177/20563051241244674.
Are, C. (2023). The assemblages of flagging and de-platforming against marginalised content creators. Convergence, 30(2), 922-937. https://doi.org/10.1177/13548565231218629.
Are, C., & Briggs, P. (2023). The Emotional and Financial Impact of De-Platforming on Creators at the Margins. Social Media + Society, 9(1). https://doi.org/10.1177/20563051231155103.
Are, C. (2022). An autoethnography of automated powerlessness: lacking platform affordances in Instagram and TikTok account deletions. Media, Culture & Society, 45(4), 822-840.
Are, C., & Paasonen, S. (2021). Sex in the shadows of celebrity. Porn Studies, 8(4), 411-419.
Are, C. (2021). The Shadowban Cycle: an autoethnography of pole dancing, nudity and censorship on Instagram. Feminist Media Studies, 22(8), 2002-2019.
Are, C. (2020). A corpo-civic space: A notion to address social media’s corporate/civic hybridity. First Monday, 25(6). https://doi.org/10.5210/fm.v25i6.10603.
Are, C. (2019). Patterns of media coverage repeated in online abuse on high-profile criminal cases. Journalism, 22(11), 2692–2710. https://doi.org/10.1177/1464884919881274.
Book Chapters:
Are, C. (2025). Pole dancing academic: Decompartmentalizing the personal, sexual, and professional while blending pole dance and research careers. In: A.J. Carr and L. Sally (eds.), Sex on Stage: Performing The Body Politic. London: Bloomsbury. https://www.bloomsbury.com/uk/sex-on-stage-9781350443624/
Are, C., & Gerrard, Y. (2023). Violence and the feminist potential of content moderation. In: K. Boyle and S. Berridge (Eds.), The Routledge Companion to Gender, Media and Violence. London; New York: Routledge.
Are, C. (2019). Fire WERK With Me. In: M. Volpert and H. Kempt (Eds.), RuPaul's Drag Race and Philosophy: Sissy That Thought (Popular Culture and Philosophy).
Academic Commentary and Book Reviews:
Are, C. (2024c). Algorithmic folk theories and peer review: on the importance of valuing participant expertise. Journal of Gender Studies. https://www.tandfonline.com/doi/full/10.1080/09589236.2024.2377630.
Are, C. (2020). How Instagram’s algorithm is censoring women and vulnerable users but helping online abusers. Feminist Media Studies, 20(5), 741-744.
Are, C. (2019). Book review: John Mair, Tor Clark, Neil Fowler, Raymond Snoddy and Richard Tait (eds) Anti-social media? The impact on journalism and society. Journalism, 20(4), 633-634.
Teaching
Courses:
Engagement and impact
Press highlights:
Meta shuts down global accounts linked to abortion advice and queer content - The Guardian
How to check if your Instagram posts are being hidden - BBC
Instagram keeps banning sex-positive and kink accounts - Dazed
Instagram Is Removing Sex-Positive Accounts Without Warning - Wired
Shock horror: AI is over-sexualising and censoring women's bodies - Cosmopolitan UK
Ads for AI sex workers are flooding Instagram and TikTok - NBC News
The Taylor Swift deepfake debacle was frustratingly preventable - TechCrunch https://techcrunch.com/2024/01/30/the-taylor-swift-deepfake-debacle-was-frustratingly-preventable/
“A TikTokified Idea Of Beauty” – The Truth About Looksmaxxing, The Extreme Makeover Trend On The Rise - Service95
Meta allowed pornographic ads that break its content moderation rules - New Scientist
The dark rise of AI influencers: Why beautiful people who don't exist are invading your social media - The Daily Mail
Can AI Solve the Content-Moderation Problem? - The Wall Street Journal
Cringe! How millennials became uncool - The Guardian
Podcast highlights:
Free Speech and Online Harm - BBC Antisocial podcast
The impact of shadow banning - TechDirt
Why does Gen Z think Millennials are CRINGE? - Smart Cookies podcast
Are credit card companies the biggest force in content moderation? - The Observer Sensemaker Podcast
What happens when you lose your social media? - BBC TechLife
What happens in 2026? From LinkedIn’s Death to Quantum Meh and Flying Cars, We have it all - Crashed podcast