The Material and the Virtual in Photographic Histories
Kris Belden-Adams, ThisTheoryDoesNotExist: Examining and Historicizing Artificial-Intelligence-Generated, Hyper-Realistic, DeepFake Photographs as ‘Data Portraits’
Thispersondoesnotexist.com offers a refreshable, seductively realistic series of digital portraits of exactly that: amalgamated images of fictional people. Built from an unknown number of Flickr photographs, these photographically hyper-realistic images enjoy the appearance of veristic “truth,” yet are framed by their status as synthetic products generated by Artificial Intelligence, or A.I. Like other tools that generate images using A.I. algorithms, thispersondoesnotexist is known as a “DeepFake” generator. Thispersondoesnotexist, its spinoffs (thisAirBnBdoesnotexist, thesecatsdonotexist, thiswaifudoesnotexist.net, and thisstartupdoesnotexist), FakeApp, DeepFaceLab, DeepNude, and a proliferation of others draw on archives of source material to create images and videos so seemingly realistic that they can hardly, if at all, be distinguished from photographs of real people. This technology, still in its adolescence, has fed fears of a digitally kindled, “post-truth” era of “fake news,” “alternative facts,” and widespread information illiteracy. This paper examines the recent phenomenon of A.I.-generated DeepFakes and looks past the anxieties they raise to address them as extensions of photo/video montage practices that predate the digital era (even if the use of A.I. [human-generated algorithms] to make them is new). The emergence of digital media simply calls us to the task of articulating the complicated nature of “data portraits,” including ones that may be produced independently by computers following human-provided directives.