For the last several months, I’ve been participating in a conversation about AI-generated images. This working group, made up of photojournalists, editors, intellectuals, attorneys and technical experts, has come together to look for solutions to an existential threat to journalism – that the public may come to no longer trust anything they see.
Moving the needle on this is brutally difficult, but our first call to action is relatively simple – if you are a journalism organization or an individual visual journalist, make it clear where you stand. What are your ethical guidelines? And can the general public find them easily?
Most news organizations have established codes of ethics. Some, like the National Press Photographers Association’s code, are broad enough to forbid the creation or use of generative imagery. But this is a time to be explicit. In a recent piece in Vanity Fair, Fred Ritchin (a founder of the Writing with Light group) describes how images are already being questioned.
Adobe, which has been working on a technical solution via its Content Authenticity Initiative, has apparently had AI-generated images inserted into its stock “photo” library that surface in searches for Israel and Gaza.
So, my statement: Photograph means to write with light. To be a photograph, light must have come from a source, bounced off a moment in time and been recorded. Any alteration, addition or subtraction from that recording of that moment in time, from that particular angle, is no longer a photograph. Call it an illustration or a work of art, but if it is not written with light, it is not a photograph.