Who should control what gets shared over social media?

Recent decisions regarding one of social media's best-known users, former US President Donald Trump, have ignited a broad, global debate on Big Tech's role in controlling content. With the public square having moved online, where we now live, learn, work and protest, there is a lot at stake in content moderation.

IFEX reached out to member organizations from across the globe, including the Africa Freedom of Information Centre, and asked them to share their thoughts on this increasingly global issue.

Gilbert Sendugwa, the Executive Director of the Africa Freedom of Information Centre, had this to share:

Today, online service providers play a tremendous role in facilitating freedom of expression and access to information. In fact, in many African countries, the internet has democratized access to information by enabling communities living in remote places, or previously reliant on tightly controlled channels, to access a far wider range of content. Both the people and the government use various online platforms to communicate.

Online service providers also have a duty to ensure that their platforms are not used to facilitate the violation of human rights, hence the enactment of user policies and regulations. Users of these platforms are expected to adhere to the set standards or risk having their content flagged, being blocked, or being permanently removed from the platform. But a key question continues to arise: who regulates these platforms, and how are they held accountable?

Very often these days, we see major platforms flagging, deleting, or blocking user content for allegedly violating user policies. To judge whether these actions promote or limit freedom of expression and access to information, we need to assess the level of transparency and accountability of these online platforms.

Some online platforms have been accused of applying double standards to their users, thereby limiting freedom of expression and access to information. Most recently, former US President Donald Trump was banned from leading social media platforms for "inciting violence" after his supporters attempted to prevent Congress from certifying Joe Biden's victory. In Uganda, Facebook blocked various accounts of government officials and supporters of the ruling party for allegedly spreading false information ahead of what many term a violent election. Questions have also been raised when online platforms appear to work with authoritarian governments to stifle individuals' freedom of expression and information, for fear of antagonizing their business interests in undemocratic countries.

While these actions may be legitimate, it is crucial to assess the compliance of online platforms with standards of transparency and accountability in content moderation. The question now is: who has the power to check the power of these platforms?

And because online service providers structure the global information and communication space, it is important to discuss transparency requirements for regulating digital platforms, while protecting freedom of expression and not suppressing innovation.

The Santa Clara Principles on content moderation call for companies to disclose how many posts they remove, notify users about content removal, and give users meaningful opportunities to appeal takedowns and have content restored. So far, several online platforms, including Reddit, Apple, GitHub, Twitter, and YouTube, have endorsed and begun implementing the Santa Clara Principles.

With open questions around independence, transparency, accountability, and conflicting interests, a discussion on better oversight, greater accountability, and how to rebuild trust between online platforms, governments, and the public is urgently needed.

IFEX is a global network of organizations that promote and defend the right to freedom of expression and information.

Learn more from other IFEX Members on content moderation.
