Africa-Press Ethiopia
We know social media content can lead to violence, but is there a plan to stop it? Facebook has tried to improve its content moderation after its platform was used to spread incitement to violence in Myanmar. But recent violence in France and Ethiopia, which has drawn attention in the United States Congress, shows there is still work to be done.
Meanwhile, YouTube, WhatsApp, Twitter, and TikTok are facing a growing volume of dangerous posts on their platforms. We need governance solutions for social media, and we need them now. And independent oversight is the only path to real change.
In many parts of the global south, Facebook depends heavily on third-party content flaggers, such as civil society organizations (CSOs) and other local groups. Take Myanmar as an example. There, Facebook relies on CSOs to identify dangerous content, a method that in some countries is more effective than its own detection mechanisms: staffed with locals who speak the local languages and deeply understand the country's social and political context, CSOs are better placed to catch the linguistic and cultural nuances in posts and comments.
Facebook has made strides in how it consults with civil society: for high-profile events in closely watched countries, such as the 2020 Myanmar elections, it holds group consultations involving multiple CSOs.
However, reports suggest Facebook is not always transparent with these CSOs. The company might cite the legal risk of sharing user information, but a structural examination points to a simpler explanation: as a profit-driven company, Facebook has no real transparency obligations to these organizations.
Social media companies should be monitored by a body made up of civil society groups, multilateral organizations, and researchers, rather than one staffed with those very companies’ picks.
Either way, the CSOs that do the heavy lifting on flagging may never learn what happens inside Facebook’s moderation systems. Once they report content, Facebook may take some time to respond, and in some cases CSOs never find out what action was taken on flagged posts, how those decisions were made, or where the resulting data is stored.