At the Participatory Platform Governance Lab, we investigate how users experience, enact, and challenge the governance of social media platforms. Our goal is to understand the possibilities for political participation in sociotechnical systems and incorporate user perspectives into policy recommendations.

Publications

Aspirational Platform Governance: How Creators Legitimise Content Moderation through Accusations of Bias

While content moderation began as a solution to online abuse, it has increasingly been framed as a source of abuse by a diverse coalition of users, civil society organisations, and politicians concerned with platform bias. The resulting crisis of legitimacy has motivated interest in more participatory forms of governance, yet such approaches are difficult to scale on platforms that lack bounded communities and designated tools to support collective governance. Within this context, we use a high-profile debate surrounding bias and racism in content moderation on YouTube to investigate how creators engage in meta-moderation, the participatory evaluation of moderation decisions and policies. We conceptualise the conversation that plays out across a network of videos and comments as aspirational platform governance, or the desire to influence content moderation without established channels or guarantees of success. Through a content analysis of 115 videos and associated online discourse, we identify overlapping and competing understandings of bias, with key fault lines around demographic categories of gender, race, and geography, as well as genres of production and channel size. We analyse how reaction videos navigate structural factors that inhibit discussions of platform practices and assess the functions of aspirational platform governance, including its counter-intuitive role in legitimising content moderation through the airing of complaints.

Copyright Callouts and the Promise of Creator-driven Platform Governance

Responding to frustrations with the enforcement of copyright on YouTube, some creators publish videos that discuss their experiences, challenge claims of infringement, and critique broader structures of content moderation. Platform callouts, or public complaints about the conduct of or on platforms, are one of the primary ways creators challenge the power imbalance between users and corporations. Through an analysis of 135 videos, we provide a rich empirical account of how creators publicly define the problem of copyright enforcement, propose solutions, and attribute responsibility to other creators, the platform, and external actors like media conglomerates. Creators criticise the prevalence of “false” copyright claims that ignore fair use or serve ulterior motives like harassment, censorship, and financial extortion, as well as the challenges of communicating with the platform. Drawing inspiration from organisational theory, we differentiate horizontal and vertical callouts according to the institutional positioning of the speaker and target. Horizontal callouts, or public complaints between peers, offer a mechanism for community self-policing, while vertical callouts, or public complaints directed towards organisations, provide a mechanism for influencing centralised content moderation policies and practices. We conclude with a discussion of the benefits and limitations of callouts as a strategy of creator-driven platform governance.

Projects

The Marketplace of Algorithms: Early Experiments with Middleware Governance on Bluesky

Middleware, which refers to third-party tools for curation and content moderation, is a policy proposal to address the crisis of trust surrounding social media. It also describes the configuration of platform governance on Bluesky, providing a rich opportunity to investigate the strengths and limitations of the solution. Through an analysis of the design of community-driven platform governance features and their uptake by early adopters, we highlight the potential, limitations, and competing platform imaginaries of governance expressed by the platform’s leadership and users. Employing both quantitative and qualitative methods, we analyze Bluesky’s primary tools for community curation (Feeds, Starter Packs) and moderation (Moderation Lists, Labelers). Bluesky offers mixed lessons for the viability of middleware governance. At a technical level, the design is a success, offering a social media experience that replicates the feel of a centralized social media platform while facilitating substantial experimentation with community tools. However, gaps remain between the promise and practice of middleware markets, reflected in competing ideas of what the market is and what ends it serves. If it is meant to function as a conventional market, then substantial developments in monetization are necessary to expand middleware’s offerings. If it is meant to function as a mixed market, supported by very different kinds of capital, then the community infrastructure needed to support these endeavors is notoriously difficult to scale up. The marketplace of algorithms is thus not only a policy solution and vision of platform governance, but also a cultural contestation between individual values and communal values.
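
To give a flavor of the quantitative side of this analysis, the sketch below pulls metadata on popular feed generators from Bluesky's public AppView. The endpoint sits in the AT Protocol's unspecced (explicitly unstable) namespace, so the route and response fields used here are assumptions that may change; this is a minimal illustration, not the project's actual data pipeline.

import requests

# Public AppView endpoint for feed-generator discovery. NOTE: this lexicon
# lives in the "unspecced" namespace, so the route and the response fields
# assumed below may change without notice.
APPVIEW = "https://public.api.bsky.app/xrpc/app.bsky.unspecced.getPopularFeedGenerators"

def fetch_popular_feeds(limit=50):
    """Return (displayName, likeCount, creator handle) for popular Feeds."""
    resp = requests.get(APPVIEW, params={"limit": limit}, timeout=30)
    resp.raise_for_status()
    feeds = resp.json().get("feeds", [])
    return [
        (f.get("displayName"), f.get("likeCount", 0),
         f.get("creator", {}).get("handle"))
        for f in feeds
    ]

if __name__ == "__main__":
    # Print community-built Feeds sorted by like count, most popular first.
    for name, likes, creator in sorted(fetch_popular_feeds(), key=lambda t: -t[1]):
        print(f"{likes:>7}  {name}  (@{creator})")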

Priorities and Exclusions within Trust and Safety Industry Standards

Trust and Safety, the field tasked with setting and enforcing standards of acceptable behavior online, has become increasingly professionalized. Yet we know little about the alignment between the profession’s policy priorities and industry practices. To investigate this, we focused on livestreaming, which has strong incentives for direct, public-facing policies. Using the schema of content abuse from the Trust & Safety Professional Association, we analyzed the community guidelines of twelve diverse platforms. Our findings reveal significant alignment that is especially pronounced for Twitch and YouTube but also extends to Alt Tech platforms. However, industry standards only partially addressed the policies of adult camming platforms and AfreecaTV, a Korean-based livestreaming service, revealing notable absences in how Trust and Safety imagines the boundaries of its industry. We reflect on the priorities and exclusions of emerging industry standards and end with a call for academics and practitioners to broaden the conversation around content moderation.

Community Notes as a Participatory Strategy of Consumer Protection

X (then Twitter) launched Community Notes, a crowd-sourced fact-checking program, in 2021, allowing participants to attach “notes” that contextualize, contest, or clarify posts on the platform. While Community Notes engages in the conventional fact-checking tasks of verifying news and political discourse, it also plays an important role in drawing attention to spam, scams, fraud, and other consumer protection issues on the platform. Through a combination of qualitative and computational text analysis of consumer protection-oriented Community Notes, we identify the types of consumer protection issues that the program flags, the sources of evidence participants use, and the relationship between the presence of notes and other top-down content moderation responses (removal of post, removal of account). We then reflect on the potential and limitations of participatory approaches for addressing consumer harms on social media.
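
As a minimal sketch of how consumer protection-oriented notes might be isolated computationally, the snippet below keyword-filters X's public Community Notes data export. The file name, column names (summary, classification), and keyword list are assumptions for illustration; the study itself combines qualitative coding with computational text analysis.

import pandas as pd

# X publishes Community Notes data as TSV downloads. The file name and the
# column names used here ("summary", "classification") are assumptions based
# on the public export and may differ between releases.
notes = pd.read_csv("notes-00000.tsv", sep="\t")

# Hypothetical keyword seed list for consumer-protection topics; a real
# analysis would refine this iteratively alongside qualitative coding.
KEYWORDS = ["scam", "fraud", "phishing", "counterfeit", "giveaway"]
pattern = "|".join(KEYWORDS)

# Keep notes whose free-text summary mentions any seed keyword.
consumer = notes[notes["summary"].str.contains(pattern, case=False, na=False)]
print(f"{len(consumer)} of {len(notes)} notes match consumer-protection keywords")
print(consumer["classification"].value_counts())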

Transactional Orders: How Platforms Structure Payments Between Creators and Fans

Subscription platforms like Patreon or OnlyFans, fundraising platforms like Kickstarter, or donation tools built into video platforms like Twitch or Stripchat reconfigure the relationship between creators, audiences, and platforms. While research has highlighted the impact of new monetization opportunities for creators and fans, the role of the platform has received comparatively less attention, hindered by a lack of shared terminology and comparative research, as well as by the bracketing off of adult content platforms. We present an integrative framework for conceptualizing monetization on digital platforms, connecting anthropology’s long-standing concept of transactional orders to more recent work on platformization. We developed the transactional orders framework through an in-depth investigation of livestreaming and camming platforms. We surveyed the literature to identify relevant features, policies, and concepts related to monetization and conducted empirical research on three livestreaming and three camming platforms to develop platform-agnostic concepts and definitions. The transactional orders framework consists of payment paths, or mechanisms that facilitate the transmission of value between users, and measures of value, or commensurable representations of worth on the platform. We identified three primary payment paths (donations, subscriptions, and purchases) and three primary measures of value (tokens, social metrics, and rankings), as well as seven attributes to assess each component. We illustrate the value of the framework through a discussion of donation mechanisms across platforms.
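
As a structural summary of the framework, the sketch below encodes its named components (three payment paths, three measures of value) as Python dataclasses. This is an illustrative rendering, not the authors' instrument: the seven assessment attributes are not named in this abstract, so a generic mapping stands in for them, and the example platform is hypothetical.

from dataclasses import dataclass, field
from enum import Enum

class PaymentPath(Enum):
    # The three primary payment paths identified by the framework.
    DONATION = "donation"
    SUBSCRIPTION = "subscription"
    PURCHASE = "purchase"

class MeasureOfValue(Enum):
    # The three primary measures of value identified by the framework.
    TOKEN = "token"
    SOCIAL_METRIC = "social_metric"
    RANKING = "ranking"

@dataclass
class TransactionalOrder:
    """One platform's configuration of payment paths and measures of value.

    The framework also specifies seven attributes for assessing each
    component; they are not listed in the abstract, so a generic mapping
    stands in for them here.
    """
    platform: str
    paths: list[PaymentPath]
    measures: list[MeasureOfValue]
    attributes: dict[str, str] = field(default_factory=dict)

# Hypothetical example: a livestreaming platform combining token-based
# donations with subscriptions and public social metrics.
example = TransactionalOrder(
    platform="generic livestreaming platform",
    paths=[PaymentPath.DONATION, PaymentPath.SUBSCRIPTION],
    measures=[MeasureOfValue.TOKEN, MeasureOfValue.SOCIAL_METRIC],
)
print(example)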

Legal Lore: The Cultural Transformation of Copyright Law and Policy on YouTube

Since YouTube’s launch, copyright has been at the center of conflict between creators, regulators, the platform, and mass media corporations. Yet the interests of creators are poorly represented in copyright policy, which favors corporate stakeholders. Largely excluded from formal mechanisms of policy participation, creators are left to make sense of complex enforcement systems through copyright gossip and callouts. Although informal strategies of gossip and callouts operate quite differently than the formal mechanisms of governance built into the platform’s infrastructure, the two are inextricably linked. To understand how creators relate to these different governing regimes, we develop the concept of legal lore, defined as cultural understandings of permissible behavior and appropriate redress for wrongs. We investigate how creators and fans understand the appropriate use of intellectual property and copyright reporting tools through a unique dataset of 154 copyright controversies documented on Wikitubia, a fan wiki dedicated to the platform. We identified four primary types of controversies. Interpersonal controversies involve disputes between creators and fans. Economic controversies involve disputes over money, merit, and working conditions. Political controversies involve the suppression of speech, especially criticism. Infrastructural controversies involve automation issues, typically in terms of YouTube’s Content ID system not working correctly. While legal lore can be more or less proximate to the law, the cultural understandings of copyright expressed on Wikitubia present a relatively autonomous vision of copyright abuse and the appropriate use of platform reporting tools, with only marginal ties to platform policy or copyright law.

The TikTok Caliphate: How Fundamentalist Islamists Exploit and Bypass TikTok’s Algorithm

Islamic terrorist organizations have increasingly turned to social media platforms to spread their propaganda. Examining their presence on TikTok, this study investigates how supporters of ISIS and Al-Qaeda exploit platform features to manipulate recommendation algorithms and evade automated and manual content moderation. Despite TikTok’s policies and national laws, little is known about the specific strategies these groups employ to increase the visibility of their messages within a purportedly hostile platform environment. Our qualitative analysis reveals five key strategies: Audio Camouflage (manipulating sound to evade detection), Meme Infiltration (embedding extremist content within pop culture references), Blurred Intent (masking visuals through blurring or digital distortion), Emoji Codes (using coded language and symbols to bypass moderation), and Bait-and-Switch (starting with harmless content before revealing extremist messaging). These methods demonstrate how extremists adapt to platform Community Guidelines while exposing the limitations of TikTok's moderation system and national enforcement mechanisms. The study underscores the urgent need for improved governance, culturally informed moderation practices, and collaborative efforts between platforms, governments, and educational systems to effectively combat online radicalization and extremism.

Team Members

Isabell Knief

Isabell Knief is an MA student at the University of Bonn and a visiting research fellow at the Hebrew University of Jerusalem. She examines how digital platforms (re)produce power relations in labor, creating new opportunities and vulnerabilities for workers, as well as posing novel regulatory challenges. Her master's thesis examines informal counter-practices that webcam models use to influence the algorithmic work environment and assert their interests.

CJ Reynolds

CJ Reynolds is a PhD Candidate in the Department of Communication and Journalism at the Hebrew University of Jerusalem. CJ researches the role of institutional mistrust in state and platform contexts, and the development of counterpower tactics to push for transparency and accountability from institutions.

Omer Rothenstein

Omer Rothenstein is an MA student in the Department of Communication and Journalism at the Hebrew University of Jerusalem. Holding a bachelor's degree in Computer Science and in Communication and Journalism, he studies how technology and society converge and coalesce, with a focus on digital culture and internet platforms.

Dana Theiler

Dana Theiler is a dual BA student in Communication and Philosophy at the Hebrew University of Jerusalem. With experience as a marketing manager working across social media platforms (TikTok, Instagram, Facebook, and more), she examines the use of social media for self-promotion, cross-platform promotional strategies, and the power relations between platforms and users.

Noa Niv

Noa Niv is an MA student in the Department of Communication and Journalism at the Hebrew University of Jerusalem. With a bachelor’s degree in Communication and Journalism and East Asian Studies, she explores cross-cultural interactions on social media, focusing on the dynamics between Western and East Asian individuals in the context of popular culture and online fandom.

Yehonatan Kuperberg

Yehonatan Kuperberg (Kuper) is an MA student in the Department of Communication and Journalism at the Hebrew University of Jerusalem. He holds a bachelor's degree in Communication & Journalism and Political Science, along with personal experience in TV and news production. He explores the relationships between traditional producers and the people they cover or their viewers, and how each group perceives television texts and media production considerations.

Gilad Karo

Gilad Karo is an MA student in Communication & Journalism at the Hebrew University of Jerusalem, specializing in Internet and New Media, and holds a dual BA in Communications and International Relations. As a research assistant on multiple projects, Gilad explores the intersection of politics, online radicalization, and international relations in the media. His work examines platform governance, the influence of digital spaces on global political dynamics, and the evolving challenges of regulating online discourse. With experience in both research and policy, Gilad has interned at the Knesset and the INSS, connecting theoretical academic research with real-world policy applications.