What Do We Mean When We Talk About Transparency? Toward Meaningful Transparency in Commercial Content Moderation
This article seeks to give greater specificity to demands for transparency in the commercial content moderation practices of digital platforms. Through a thematic analysis of 380 survey responses from people who have been subject to content moderation decisions, we identify gaps in what users are told about those decisions. We argue that meaningful transparency should be understood as part of a communicative process of accountability (rendering account) to independent stakeholders. We make specific recommendations for platforms to give people clear information about decisions that affect them, including which content was moderated, which rule it breached, and a description of the human and automated processes involved in identifying the content and making the decision. Beyond providing more information to individuals about particular decisions, however, a major challenge remains: improving understanding of content moderation at a systems level. General demands for greater transparency should be reframed to focus on enhanced access to large-scale disaggregated data, which can enable new methods and collaborations among academia, civil society, and journalists to make these systems more understandable and accountable.