Think you don’t need to care about AI?
Even if a CSO has little interest in harnessing AI and is unconcerned about the technology's possible impact on the organisation or its operating environment, it should still be aware of AI's potential to create new challenges for the people and communities it serves. If many existing social and environmental issues increasingly become technology issues as well, and the CSOs whose missions are to address those problems fail to adapt, then civil society will fall short in its duty.
If, however, civil society rises to the challenge by engaging with these issues now, it can play an absolutely vital role: both in leading the debate about how AI can be developed to minimise the risks of damage to our society, and in ensuring that CSOs are well-placed to deal with any future challenges that cannot be avoided. In this section, we outline some of the possible future challenges that AI might bring, and the role civil society could play in minimising them.
Autonomous weaponry
The risks posed by the deliberate, malicious use of AI were starkly highlighted in a recent report, which outlined a range of challenges and scenarios: the development of autonomous weapons, the use of AI for enhanced cyber-warfare, and the micro-targeting of propaganda and misinformation to undermine elections and the wider democratic process.
When it comes to autonomous weapons, the potential advantage to be gained from perfecting them is so great that there is a huge incentive to devote significant resources to research and development. The concern is that this applies to malign actors and rogue states just as much as to recognised military powers. Given the historic role that civil society has played in advocating against militarisation, from campaigning against the use of nuclear weapons to calling for bans on landmines and cluster munitions, it is critical that it stays abreast of developments in this field and is able to raise concerns.
Fake News, targeted propaganda and democracy
Another challenge, which spans both malicious intent and unintended consequences, is the impact of targeted misinformation in the form of 'fake news' and propaganda. Analysis of vast data sets on past behaviour and social interaction has enabled the creation of algorithms that can target tailored information to an incredibly granular degree, down to the level of neighbourhoods, households and even individuals. Some organisations have used this capability simply for commercial gain, but others have employed it with the deliberate aim of influencing elections and subverting democracy. The scandal that erupted in early 2018 over the relationship between Facebook and Cambridge Analytica, which specialised in this kind of targeting, highlights the scale of the problem.
The coarsening of public discourse and the devaluation of notions of truth and fact pose significant challenges for civil society. Many organisations seek to work across societal divisions, but if people are unwilling to engage with others outside their immediate circle, this will prove increasingly difficult. Similarly, CSOs often rely on evidence and expertise to support their advocacy work. If those in power, or those with vested interests in resisting change, are able to dismiss this evidence or counter it with claims of their own, and there is no shared sense of objective fact, then the ability of civil society to campaign for social change will be hugely diminished.
Civil society can also help to challenge the culture of misinformation and fake news. Whether through supporting new models of funding journalism or through acting as focal points for efforts to build community cohesion and overcome differences, CSOs can play a key role in ensuring that issues are brought to light in a way that is fair, evidence-based and constructive.