Traditional media’s role as the fourth estate is being augmented by new media formats such as social media and websites, to mention a few. Equally significant is the notion that these platforms are complementing traditional media’s normative watchdog role. However, there are growing concerns over how the technology-driven new media are deviating from journalism standards while failing to take responsibility for what people publish online. A key concern is the active use of new media to promote fake news, misinformation and disinformation associated with general elections, even in the most advanced countries such as the United States of America.
According to the World Economic Forum, more than two billion voters across 54 countries will be heading to the polls in general elections in 2024. The countries include the United States, several European Union member states and India, the world’s biggest democracy. Ghana and 15 other African countries are among the 54 holding crucial elections this year.
Weaponising elections
Nearly a decade after social media was first weaponised to influence election outcomes, and with rapid technological advancements since, it is anticipated that new media will have even greater influence in the 54 elections slated for 2024. In all of these elections, including Ghana’s, social media, bloggers and citizen journalists are expected to play a crucial role in campaigning and in calling results that may not be authenticated by electoral bodies. Potentially, this development – if left unchecked – could become the bedrock for election-related violence and conflicts in countries where democratic elections are still evolving.
This makes it imperative for technology platforms and governments to do everything in their power to safeguard elections and uphold democratic values online. Several stakeholders worry that, in a world without standardised global social media regulation, ensuring safe elections and well-regulated online and offline media spaces will require key actions to be taken ahead of any impending polls. Thus, digital democracy and its potential threat to peace and security is not only a cause for worry in developing countries; it is also a nagging headache for advanced countries like the United States of America. Since the controversial 2016 elections, the American election system and democratic process have been under continuous informational assault.
In a recent report, the Center for American Progress anticipates risks to and from the major social media platforms in the impending 2024 elections. The report recommends new strategies to encourage technology platforms to safeguard democratic processes and mitigate threats from fake online media outlets used to disseminate disinformation and misinformation. Similar fake news content could be replicated in Ghana, which is already inundated with fake news and disinformation on social media. The government and state media regulatory bodies have had cause to complain about the rise of fake news and disinformation on the Ghanaian media landscape. It is anticipated that in the heat of the 2024 election, AI and fake news will take over the media landscape and could even involve mainstream media and traditional journalists, who may sell their content to the highest bidder.
That said, it is worth noting that mitigating and addressing these threats to digital democracy is not solely the responsibility of technology platforms and social media companies. States and governments have an equal responsibility to sanitise the social media landscape. Sadly, however, research has shown that both government and opposition political leaders have used, and could use, social media to call elections or dispute election results in ways that promote violence during and after elections.
Power of artificial intelligence
Moreover, the emerging power of artificial intelligence (AI) has become a worry to many people, especially prominent figures in society who tend to become victims of AI-generated content. In recent months, AI has emerged as a new vector for potential harm to democratic and fair elections.
Technology experts say AI has significant potential to exacerbate existing threats such as bots, harassment and disinformation, making it increasingly difficult to accurately detect manipulated media, also known as deepfake content. Perhaps most disturbing is the impact this rapidly advancing technology may have on threats that are yet to come to light, including those related to the 2024 elections. The fear hinges on the potential for AI technology to be used to influence election results. In Ghana, news is circulating of secret plans by some political elements to undermine the electoral process through the combined use of AI and social media platforms to sway uninformed voters. Surely, attempts to hack the databases of electoral systems in many countries cannot be ruled out in elections that many analysts describe as ‘do or die’. In the 2020 general elections in Ghana, reports indicate that one presidential candidate spent a colossal two million dollars on social media campaigning and election-related activities. This makes social media a strategic piece of election machinery that is probably outpacing the normative roles of mainstream media and traditional journalism in political communications.
Policy guidelines
Perhaps the most important set of recommendations to help social media platforms uphold democratic processes involves shoring up policies, processes and protocols during and after elections. These systems could protect users against fake news, promote accurate and relevant election information, and enable critical emergency mitigations.
Some experts have recommended that, as a way of mitigating the potential effects of AI on election outcomes, AI platforms should develop and articulate clear and defensible criteria for deploying election risk protections in emergency situations. These mitigations may cover entire product surfaces such as Instagram Reels, YouTube video recommendations or Facebook’s “Popular Near You” content. Others include product policy exceptions, adjustments to algorithmic ranking and other significant changes.
Furthermore, platforms should consistently apply civic integrity policies to all content formats as a means of combating election disinformation. They should also critically review reports and content originating from accounts representing a candidate, politician, party, elected official or government body. In addition, platforms should block or debunk AI-generated material, such as deepfake content, within a set time frame of its being reported.
Process and protocol
With regard to processes, it is suggested that technology companies employ distinct strategies for the pre-election period, election day, the period after polls close but before winners are declared, and the period after election certification and the transfer of power. Each of these time periods carries unique risks and potential harm to the outcome of elections. Clear protocols also enable election authorities to quickly escalate content that may be illegal or otherwise pose information-integrity risks to users during sensitive election periods.
It is also recommended that platforms prioritise the fact-checking of electoral content, including political advertisements and content originating from public officials, using a mixture of tools such as third-party fact-checking programmes and community notes. They should also bolster accurate-information workstreams across all election-related content, including election information centres with polling place data.
Democratic legitimacy
The threats to democratic legitimacy are twofold: procedural threats and perception-based threats. As explained earlier, there are tangible procedural threats to elections in many countries across the world. These include the refusal of some partisan officials to certify results, voter suppression and the tampering with results by aggrieved election workers.
The perception-based threats to the legitimacy of an election and the sustainability of democracy rest partly on social trust and public perception of election integrity. Whenever public perception of an election diverges from reality, it causes friction that threatens to destabilise democracy. Both an illegitimate election perceived as legitimate and a legitimate election perceived as illegitimate carry catastrophic consequences for democracy.
Therefore, social media and AI companies have a significant role to play in ensuring that democratic legitimacy is not undermined in the court of public opinion. If platforms allow their digital media to stoke confusion and sow doubt about secure election processes, they are consciously undermining democracy and electoral systems across the world. This is because the power of technology is not restricted by borders; it can be applied to good or bad effect anywhere in the world. Beyond that, social media companies need to introduce mechanisms to slow the spread of election disinformation, misinformation and fake news.
Finally, social media companies that fail to disrupt election subversion strategies, and instead provide the tools to carry them out, are complicit in these assaults on democracy everywhere. But ensuring sanity in social media and AI spaces, now and during elections, is not a matter for social media and AI companies alone; states and governments must ensure that their media laws on digital communication are enforced. Where such laws do not exist, the necessary steps must be taken to enact new legislation to guide the operations of new media platforms. While the rapid development of new media appears unstoppable, states and governments should develop guidelines and other regulatory frameworks to harness the potential benefits of these technologies for national development.