Browse Reports

The uses and abuses of Deepfake Technology, February 2022

Deepfake technology is, in essence, artificial intelligence capable of creating realistic but false videos, photos and audio of people. While some deepfakes are harmless or even entertaining, the technology can be, and has been, used to commit fraud, sexually harass women, exacerbate tensions and incite violence. With growing dependence on the internet for news and the speed of online communication, deepfakes will pose challenges to national security and public safety, to individuals, especially women, and to the governance of cybersecurity.

Network Contagion Research Institute

This is an independent, data-driven, evidence-based series of reports that the NCRI and select partners release on the spread of hostile ideological content. A main goal of these reports is to address sensitive social issues around the spread of ideology in an objective, data-driven way. NCRI aims to facilitate honest conversations about the spread of political deception, hate and manipulation, especially on social media.

Brookings: Report on how to combat fake news and disinformation

In order to maintain an open, democratic system, government, business and consumers must work together to address the problems of fake news and disinformation. Governments should promote news literacy and strong professional journalism in their societies. The news industry must provide high-quality journalism in order to build public trust and correct fake news and disinformation without legitimizing them. Technology companies should invest in tools that identify fake news, reduce financial incentives for those who profit from disinformation, and improve online accountability.

Misinformation in Canada: Research and Policy Options, May 2021

Misinformation refers to false or misleading information. Disinformation, a subcategory of misinformation, is false information spread with intent to deceive. Both mis- and disinformation are ongoing problems that have been exacerbated by COVID-19. Evidence for Democracy completed a research project to characterize the research landscape in Canada and to provide options for addressing misinformation.

Next-Generation Technology and Electoral Democracy: Understanding the Changing Environment, Centre for International Governance Innovation

Rapid transformation of the digital sphere has created new and ever more insidious threats to democracy and the electoral process — on a global scale. Growing evidence of foreign influence operations combined with mounting worries over corporate surveillance, the power of platform monopolies and the capabilities of the dark web have challenged government and society in unprecedented ways. CIGI convened a transdisciplinary team of experts from fields such as computer science, law, public policy and digital communication to formulate a special report for key government and civil society stakeholders.

Submission to the UN Special Rapporteur on disinformation and freedom of opinion and expression

Disinformation campaigns are a growing threat to global stability and democratic values, but in some countries, laws ostensibly aimed at countering such activities have been used to crack down on journalists and civil society groups. The increase in dissemination of disinformation by state and non-state actors in pursuit of financial, ideological and political goals is concerning. Manipulation of the information environment through the propagation of disinformation risks constraining the space available to democratic stakeholders, and particularly to marginalised groups, for authentic political expression.

Carnegie Endowment for International Peace: European Democracy and Counter-Disinformation: Toward a New Paradigm?

European governments are moving into a new phase in their efforts to counter disinformation. A recent project with The Hague Program for Cyber Norms looked at how the governments of several European countries (France, Germany, Hungary, Serbia, Sweden and the United Kingdom) have adjusted their counter-disinformation strategies during the pandemic. The project identified two major trends. First, governments are realizing that the distinction between domestic and foreign disinformation has become increasingly obsolete. Second, alongside their attempts to regulate online platforms, governments are starting to think more about the democratic character of their counter-disinformation measures.

UNICEF: Digital misinformation/disinformation and children, Rapid analysis | How can we best protect children from the harms that stem from mis/disinformation?

The report goes beyond simply trying to understand the phenomenon of false and misleading information; it explains how policymakers, civil society, tech companies, and parents and caregivers can act to support children as they grow up and push back the rising tide of misinformation and disinformation.