#11928. “Dangerous organizations: Facebook’s content moderation decisions and ethnic visibility in Myanmar”
Publication date: June 2027
Proposal available till: 28-05-2025
Total number of authors per manuscript: 4
Price: 0 $
The title of the journal is available only to authors who have already paid.
Journal’s subject area: Sociology and Political Science; Communication
Places in the authors’ list:
Place 1 - free (for sale)
Place 2 - free (for sale)
Place 3 - free (for sale)
Place 4 - free (for sale)
Abstract:
On February 5th, 20XX, Facebook labeled four Ethnic Armed Organizations (EAOs) in Myanmar as “Dangerous Organizations,” thereby formally banning them from the company’s platform. At the time of the announcement, all four groups were in open conflict with the Myanmar military (Tatmadaw), which was itself facing genocide proceedings at the International Court of Justice. This study examines that decision, alongside other content moderation decisions involving ethnic speech within Myanmar, to document Facebook’s evolution from a tool for democratic liberalization into an international political authority. Although the company outwardly projects a stance of neutrality in foreign affairs, this work demarcates how Facebook’s content moderation practices have transformed it into a new governmental apparatus that freely adjudicates political speech claims around the globe with virtual impunity. Building on scholarly discussions of content moderation and digital governance in media studies, I interrogate how Facebook’s positionality affects ethnic visibility in nations beholden to the company for national and worldwide recognition.
Keywords:
content moderation; digital governance; digital media; elections; Facebook; foreign affairs; genocide; Rohingya
Contacts: