The authentication of information lies at the core of our legal system, our democracy, and many other aspects of our society. Is a photograph real, or has it been doctored? What about a video? Can we believe what we see? When the apparent authenticity of a piece of information can too easily be cast into doubt—and there is no accepted means to verify its provenance and reliability—we face a society-wide crisis. Coupled with the erosion of trusted sources, and fueled by current developments in machine learning, the proliferation of automated methods for fabricating information (so-called “deep fakes”) represents a new stage in the “arms race for truth.” Indeed, one of the most significant unintended consequences of AI advances may be their use as a powerful weapon in this struggle. Escaping the arms race dynamic will require the development and deployment of technical, social, and legal countermeasures. What should those countermeasures be? And who should deploy them? Our workshop aims to explore such questions.
To untangle the mathematical, computer science, sociological, legal, and policy issues and begin to craft practical interventions, we will bring together people with diverse expertise to address questions such as:
—How to create fake images, video, and audio, and how to detect such fakery;
—How cryptographic methods might proactively ensure the authenticity of information and make tampering or fabrication easier to detect;
—How instances of fakery spread as social phenomena and undermine consensus understandings;
—How the proliferation of fabricated information affects broader civic discourse, and to what extent individual education and/or market, legal, or technical interventions can counter harmful effects;
—How ex ante statutes or regulations and/or ex post penalties might alter malicious actors’ incentives;
—How national and global freedom of expression considerations, dignitary rights, privacy law (both statutory and common law), and bodies of existing legal code such as the rules of evidence might enhance our understanding of the issues and/or inform potential interventions.
This workshop will include a poster session; a request for posters will be sent to registered participants in advance of the workshop.
*Image from FACE SWAP THE MOVIE by David Gidali and Einat Tubi, licensed under CC BY-NC-SA
(University of California, Los Angeles (UCLA))
Mark Green (University of California, Los Angeles (UCLA))
Alicia Solow-Niederman (Independent Researcher)