Photography Is No Longer Evidence of Anything


For weeks now, the world has been awash in conspiracy theories spurred by weird artifacts in a photographic image of the missing Princess of Wales that she eventually admitted had been edited. Some of them got pretty crazy, ranging from a cover-up of Kate’s alleged death to a theory that the Royal Family were reptilian aliens. But none was as bizarre as the idea that in 2024 anyone might believe that a digital image is evidence of anything.

Not only are digital images infinitely malleable, but the tools to manipulate them are as common as dirt. For anyone paying attention, this has been clear for decades. The issue was definitively laid out almost 40 years ago, in a piece cowritten by Kevin Kelly, a founding WIRED editor; Stewart Brand; and Jay Kinney in the July 1985 edition of The Whole Earth Review, a publication run out of Brand’s organization in Sausalito, California. Kelly had gotten the idea for the story a year or so earlier when he came across an internal newsletter for publisher Time Life, where his father worked. It described a million-dollar machine called Scitex, which scanned photographic film into high-resolution digital images that could then be altered on a computer. High-end magazines were among the first customers: Kelly learned that National Geographic had used the tool to literally move one of the Pyramids of Giza so it could fit into a cover shot. “I thought, ‘Man, this is gonna change everything,’” says Kelly.

The article was titled “Digital Retouching: The End of Photography as Evidence of Anything.” It opened with an imaginary courtroom scene where a lawyer argued that compromising photos should be excluded from a case, saying that due to its unreliability, “photography has no place in this or any other courtroom. For that matter, neither does film, videotape, or audiotape.”

Did the article draw wide attention to the fact that photography might be stripped of its role as documentary proof, or the prospect of an era where no one can tell what’s real or fake? “No!” says Kelly. No one noticed. Even Kelly thought it would be many years before the tools to convincingly alter photos would become routinely available. Three years later, two brothers from Michigan invented what would become Photoshop, released as an Adobe product in 1990. The application put digital photo manipulation on desktop PCs, cutting the cost dramatically. By then even The New York Times was reporting on “the ethical issues involved in altering photographs and other materials using digital editing.”

Adobe, in the eye of this storm for decades, has given a lot of thought to those issues. Ely Greenfield, CTO of Adobe’s digital media business, rightfully points out that long before Photoshop, film photographers and cinematographers used tricks to alter their images. But even though digital tools make the practice cheap and commonplace, Greenfield says, “treating photos and videos as documentary sources of truth is still a valuable thing. What is the purpose of an image? Is it there to look pretty? Is it there to tell a story? We all like looking at pretty images. But we think there’s still value in the storytelling.”

To ascertain whether photographic storytelling is accurate or faked, Adobe and others have devised a tool set that strives for a degree of verifiability. Metadata in the Middleton photo, for instance, helped people ascertain that its anomalies were the result of a Photoshop edit, which the Princess owned up to. A consortium of over 2,500 creators, technologists, and publishers called the Content Authenticity Initiative, started by Adobe in 2019, is working to devise tools and standards so people can verify whether an image, video, or recording has been altered. It’s based on combining metadata with exotic watermarking and cryptographic techniques. Greenfield concedes, though, that those protections can be circumvented. “We have technologies that can detect edited photos or AI-generated photos, but it’s still a losing battle,” he says. “As long as there is a motivated enough actor who’s determined to overcome those technologies, they will.”
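The core idea behind such provenance schemes can be sketched in a few lines. This is not the Content Authenticity Initiative's actual protocol (C2PA uses certificate-based signatures embedded in a manifest); it is a minimal illustration, using a hypothetical shared signing key, of why any edit to an image's bytes breaks a cryptographic attestation made over them:

```python
import hashlib
import hmac

# Hypothetical secret held by a camera or editing tool vendor --
# purely illustrative; real schemes use public-key certificates.
SIGNING_KEY = b"vendor-secret-key"

def sign_image(image_bytes: bytes) -> str:
    """Attest to an image by computing an HMAC over its SHA-256 digest."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(SIGNING_KEY, digest, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, signature: str) -> bool:
    """Recompute the attestation and compare in constant time."""
    return hmac.compare_digest(sign_image(image_bytes), signature)

original = b"\x89PNG...raw image bytes..."
sig = sign_image(original)

assert verify_image(original, sig)                 # untouched image verifies
assert not verify_image(original + b"edit", sig)   # any byte-level edit fails
```

The asymmetry Greenfield describes is visible even here: verification is cheap and reliable, but it only proves the bytes are unchanged since signing. A motivated actor can strip the signature entirely, or sign a manipulated image with their own key, which is why detection remains a losing battle rather than a solved problem.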
