Framework Agreement between Big Tech Corporations, Electoral Bodies, and Media Monitoring Corporations
by Sherylle Dass, Nuhaa Hendricks & Alana Robertson
Introduction
The exponential growth of social media platforms has led to a rise in the spread of disinformation among the public, to the point that its prevalence on online platforms has become an epidemic. Disinformation has detrimental effects on social media users, as it can manipulate and shift their world views. Disinformation refers to false or misleading information intentionally spread to cause harm. In the context of elections, it can significantly affect voter participation and influence election outcomes.1 Electoral Commission Chairperson Mosotho Moepya highlighted the importance and urgency of this issue, stating, “disinformation can significantly undermine election fairness and credibility. Reliable information is essential for democracies, allowing citizens to make informed choices about their leaders.” The permeation of online disinformation during an election period jeopardizes the most crucial criterion of the democratic process: that it be free and fair. Almost every electorate in the world has these platforms at its fingertips and is highly vulnerable to both their advantages and disadvantages. Safeguarding the integrity of national elections must become an international priority, as disinformation ultimately holds the potential to endanger democracy itself. The South African Independent Electoral Commission (IEC) partnered with tech giants Google, Meta and TikTok, and with Media Monitoring Africa (MMA), to combat the spread of disinformation online in an agreement named the Framework of Cooperation.
The Framework
The Framework of Cooperation was implemented as a measure to tackle the spread of online disinformation. The agreement was signed by the IEC, the aforementioned tech giants (excluding X, formerly known as Twitter) and Media Monitoring Africa (MMA). MMA is an independent body which acts as a watchdog over the IEC and major social media platforms. Although the IEC had worked with these platforms previously, this was the first time a Framework of Cooperation was signed between them.2 The fact that Meta, Google and TikTok signed the agreement appeared promising; however, it did not guarantee they would take the agreed-upon measures seriously. The Framework acknowledges disinformation as a danger and risk to democracy and elections, with Meta, Google and TikTok agreeing to assist the IEC in its mandate to address intentionally false statements as held in Section 89(2) of the Electoral Act.3 The primary role of the IEC is to ensure that South African elections are free, fair and democratic, and the Framework of Cooperation was spearheaded by the IEC to help uphold this criterion. The IEC and its signatories recognize that, for as long as disinformation, fake news and hate speech are pervasive on platforms such as Facebook, TikTok, Instagram, YouTube and X, voters can easily be misled, deceived, exploited, and manipulated, which can unfairly influence election results. The Framework therefore encourages large tech companies to take action by updating their policies to remove harmful content, issue warnings about misleading information, and de-list such content from their platforms.
The Framework encourages its signatories to do the following:
- Cooperate with one another in good faith during the election period;
- Remove content that is false or untrue, actively manipulative or deceitful, and/or incites hatred or violence;
- Provide advisory warnings explaining that not all content shared on the platform has been fully fact-checked;
- Engage in productive collaboration with other signatories whilst adhering to existing laws and not sharing confidential user data;
- Cooperate with the IEC and MMA through initiatives such as Real411.org and PADRE to curb the spread of disinformation;
- Educate voters online about digital literacy, the harms and dangers of disinformation and how to spot it;
- Support the establishment of a Working Group to promote accurate information, conduct awareness campaigns, and provide training for political parties and candidates on combating disinformation.
Real411.org and PADRE are initiatives under MMA, born out of the unprecedented challenges faced by elections today. These initiatives aim to protect the public’s digital freedom by building digital literacy campaigns that warn against the harms of disinformation, and by providing a space for complaints through which posts and content on social media can be moderated and reviewed and, if necessary, forwarded to the platform concerned to be taken down. The Framework allows for easier collaboration and communication between the bodies and corporations involved.
Meta
For the 2024 elections, Meta claimed to have developed a new approach to preserve fairness and integrity throughout the election process, built on insights gained not only from past South African elections but from elections worldwide. It would introduce industry-leading transparency tools for adverts related to elections and politics, develop comprehensive policies to prevent election interference and voter fraud, and establish the largest third-party fact-checking program among social media platforms to combat misinformation. Meta also launched a South Africa-specific Elections Operations Centre that aimed to monitor and identify potential threats in real time. It claimed to remove the most serious kinds of misinformation posted on Facebook, Instagram, and Threads. For content not violating its policies, it claimed to collaborate with independent fact-checking organizations such as Africa Check and AFP, which review and fact-check content in English, Afrikaans, Zulu, Sotho, and Setswana. Finally, Meta claimed that when content is debunked by these fact-checkers, warning labels are attached to it and its distribution in the feed is reduced so that fewer people are likely to see it. Although Meta outwardly claimed to be committed to tackling election disinformation earlier this year, it did not follow through with these approaches to the full extent.
Google and TikTok
Google and TikTok provided relatively vague comments regarding their actions around the South African election, commenting more publicly about elections in general. The Senior Manager of Government Affairs and Public Policy at Google Southern Africa said, “Google has always been committed to supporting democratic processes, including supporting elections integrity and ensuring trust among voters. We place a big focus on creating products and programs that enable people across the globe to engage with these activities through information that is accurate, protecting elections and campaigns from bad actors, as well as assisting campaigns in managing their digital presence.”4 The Public Policy and Government Relations Director at TikTok stated, “we take the responsibility to protect our community as well as the integrity of our platform particularly around elections with the utmost seriousness… we work hard to keep harmful misinformation and other violations of our policies off our platform.”5 Although these comments are vague, the fact that both platforms are signatories to the Framework is undoubtedly a step in the right direction towards combating the ubiquitous presence of disinformation online.
Shortcomings of the Framework
1. Western-centric focus
The Framework focuses on the use of social media in South Africa; however, an issue which falls outside the IEC’s realm of control is how Western-centric these social media platforms are. The problem is that disinformation, manipulative content and hate speech expressed in South African contexts is not effectively moderated by the AI content moderation tools these platforms employ to detect and remove such speech. In collaboration with Global Witness (GW), the Legal Resources Centre (LRC) investigated this issue. In June 2023, the team created xenophobic adverts and submitted them to Facebook, YouTube and TikTok. These adverts contained offensive, real-world examples of words and phrases used in South Africa, to assess whether they would pass through the platforms’ content moderation systems. Ten adverts were uploaded in English, all of which were translated into Afrikaans, and nine of which were translated into Xhosa and Zulu. All adverts were approved for publication except one, which Facebook rejected in both English and Afrikaans but approved in Xhosa and Zulu. A second investigation in August 2023 also tested X; this time the adverts targeted female journalists using South African language and context, and all of them were approved for publication. Alarmingly, Meta has 15,000 content reviewers who monitor content on Facebook and Instagram in more than 70 languages, yet both its AI systems and its human content moderators proved ineffective at flagging hate speech in South African languages other than English, and failed to pick up on South African slang or cultural nuances.6
In the lead-up to the 29 May 2024 election, the LRC conducted a third social media content moderation investigation using the GW methodology, this time focusing on Google (YouTube) and TikTok. The LRC created 16 disinformation posts as 15-second videos and submitted them to both platforms for approval, scheduling them for future publication but deleting them once approved to prevent public release. After 24 hours, the LRC checked whether the platforms’ content moderation systems had flagged the ads or approved them as suitable for the South African public. On YouTube, 15 out of 16 disinformation ads were approved, with only one rejected, for unspecified reasons. On TikTok, despite its policy against disinformation, only one ad was rejected, revealing a failure in both platforms’ content moderation processes. These findings highlight the need for stricter enforcement of policies to prevent the spread of election-related falsehoods.
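The pattern across all three investigations can be illustrated in miniature. The following Python sketch is purely hypothetical and does not represent any platform’s actual system: it shows how an ad-review filter whose blocklist covers only English terms will approve the same harmful advert once it is translated, which is precisely the gap the LRC and GW tests exposed. The placeholder phrases and the `review_advert` function are invented for illustration only.

```python
# Hypothetical sketch: an ad-review filter with an English-only blocklist,
# illustrating why identical content in Afrikaans, Xhosa or Zulu slips through.

# Benign placeholder tokens stand in for the real-world phrases used in the tests.
ENGLISH_ONLY_BLOCKLIST = {"harmful-phrase"}

def review_advert(text: str) -> str:
    """Reject only if the text contains a blocklisted (English) term."""
    tokens = set(text.lower().split())
    return "rejected" if tokens & ENGLISH_ONLY_BLOCKLIST else "approved"

# The same advert submitted in four languages: only the English version
# matches the blocklist; the translations sail through review.
submissions = {
    "English":   "buy now harmful-phrase targeting migrants",
    "Afrikaans": "koop nou skadelike-frase gemik op migrante",
    "Xhosa":     "thenga ngoku ibinzana-eliyingozi",
    "Zulu":      "thenga manje umushwana-oyingozi",
}

for language, advert in submissions.items():
    print(f"{language:10s} -> {review_advert(advert)}")
# Only the English submission is rejected, mirroring the investigations'
# finding that translated adverts were routinely approved for publication.
```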
2. Censorship versus free speech
It is important to consider the contention this agreement, and organizations such as MMA, face: content moderation versus free speech. Notably, X, a platform where the line between free speech and disinformation is often blurred, has not signed the Framework. This decision was likely driven by its owner, Elon Musk, a staunch advocate of limited censorship, who would argue that censorship is far more dangerous than allowing ads and posts with false or misleading information to be seen by social media users. A clear distinction must be made between free speech, which allows individuals to express opinions and ideas, and hate speech, which incites violence or discrimination against others based on characteristics like race, religion, or gender. While free speech promotes dialogue, hate speech undermines societal harmony by fostering hostility. Unregulated censorship, by contrast, is not only unlawful but also a threat to democracy. Transparency and accountability are therefore key. The process of removing content from platforms must be clearly explained to users and must be monitored by an independent body with regular checks and balances. Clear policies and protocols need to be established for these bodies, ensuring that the process of flagging content for removal includes an appeals system, as sketched below, and these guidelines should be easily accessible to all platform users.
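To make the recommendation concrete, here is a minimal sketch, entirely hypothetical, of what a transparent moderation record with an appeals path could look like: every removal must cite a specific policy clause, every action is written to an audit log visible to the user and to an independent monitor, and an appeal routes the case to independent review. The class and field names are assumptions for illustration, not any platform’s real workflow.

```python
# Hypothetical sketch of a transparent moderation record with an appeals path.
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    PUBLISHED = "published"
    REMOVED = "removed"
    UNDER_APPEAL = "under_appeal"
    REINSTATED = "reinstated"

@dataclass
class ModerationCase:
    post_id: str
    status: Status = Status.PUBLISHED
    # Log visible to the affected user and to an independent monitoring body.
    audit_log: list = field(default_factory=list)

    def remove(self, policy_clause: str) -> None:
        # A removal is only valid with a citable policy clause, so the user
        # is told exactly why the content came down.
        self.status = Status.REMOVED
        self.audit_log.append(f"removed under {policy_clause}")

    def appeal(self) -> None:
        if self.status is Status.REMOVED:
            self.status = Status.UNDER_APPEAL
            self.audit_log.append("appeal lodged; queued for independent review")

    def resolve_appeal(self, upheld: bool) -> None:
        self.status = Status.REMOVED if upheld else Status.REINSTATED
        self.audit_log.append("removal upheld" if upheld else "content reinstated")

# Usage: remove a post with a cited clause, appeal, and reinstate on review.
case = ModerationCase("post-123")
case.remove("disinformation policy, s4.2 (election integrity)")
case.appeal()
case.resolve_appeal(upheld=False)
print(case.status.value, "|", "; ".join(case.audit_log))
```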
3. Legally non-binding
The Framework of Cooperation is not legally binding; instead, it relies on the good faith of its participants to work together to ensure free and fair elections. It was developed in line with the Constitution of South Africa, the Electoral Act 73 of 1998 (Electoral Act) and the Electoral Code of Conduct.7 However, there are no clear consequences if media companies fail to meet their responsibilities to prevent hate speech, incitement to violence, and disinformation during election periods. This lack of accountability limits the effective change the IEC hopes to see: without legal repercussions, Big Tech corporations have minimal motivation to curb disinformation on their platforms.
Conclusion
The Framework of Cooperation between the Big Tech corporations, the IEC and MMA is a stepping stone in the right direction in tackling online disinformation during election periods. However, the Framework has clear shortfalls, and unless they are properly addressed it will be very difficult for the IEC and MMA to limit the spread of disinformation on these platforms in South Africa. The Framework should adopt punitive measures that hold these corporations accountable for disinformation spread on their platforms, which would further incentivise them to comply with the agreement. Although the Big Tech companies were signatories, they ultimately failed to fully honour their commitments to free and fair elections. Education on digital harms and disinformation must become an area of priority: if users can identify disinformation, they will be more skeptical of it and less susceptible to it, mitigating the impact of false content circulating on any platform in South Africa. A key issue identified by the LRC investigations is that TikTok, X, Google, and Meta failed to implement their own content moderation policies equitably within the South African context, despite vague promises to ready themselves for the May 2024 election. Big Tech corporations must invest in adequate and effective content moderation tools suited to the language and cultural context of every country they operate in, including South Africa, and regulators in South Africa should strongly consider regulation that balances the safety of online users and the public against the right to freedom of speech.