Adobe Beta-Testing New Tool to Detect Manipulated Images
October 23, 2020
Adobe released a beta version of a Photoshop tool that will make it easier to determine whether an image is real or has been manipulated. The so-called attribution tool, which will first be tested with a select group of people, enables photo editors to attach more detailed, secure metadata to images. In addition to identifying who created the image, the metadata will record how it was altered and whether AI tools were used to do so. Adobe said it will also be clear if the metadata has been tampered with. This could be a step toward combating deepfakes.
CNN reports that Adobe general counsel Dana Rao stated, “if you have something that you want people to believe is true, then this is a tool to help you get people to believe in it.” The new tool will be part of Adobe’s “Content Authenticity Initiative to fight against dis- and misinformation,” which the company launched a year ago with Twitter and The New York Times.
The tool will, at first, be used for “publishing still images to Behance, an Adobe-owned social network for sharing creative work … [but] the company hopes this kind of authenticating information will be added to different types of content and be shared widely on social media platforms and through media companies.”
According to Adobe, more than 90 percent of creative professionals use Photoshop and 23 million people use Behance. UC Berkeley professor Hany Farid, who specializes in digital forensics and was an unpaid adviser to Adobe for the authenticity initiative, “thinks the company’s approach makes sense because it makes the content producer responsible for making the image trustworthy, rather than leaving it to each individual viewer to sort out.”
SUNY Buffalo professor Siwei Lyu, who studied under Farid, believes the new tool will be “more effective than the limited metadata that may be connected to images today,” adding, “This should have happened a long time ago.”
Rao added that, “the public is going to have to understand they should expect to see this metadata if they want to trust these things.” The tool is optional, however, limiting its reach, and Farid pointed out that, “Adobe’s content-attribution tool can’t confirm the veracity of an image before it is edited in Photoshop.”
To address that gap, Farid is a paid adviser to “photo- and video-verification startup Truepic, which is part of the Content Authenticity Initiative.” Truepic teamed up with Qualcomm “to create a way to securely snap pictures via a smartphone’s built-in camera app.” “It’s taken us years to get to this problem and it will take us years to get out of it,” said Farid.
ZDNet reports that, according to a blog post from Adobe vice president Will Allen, who oversees the Content Authenticity Initiative, “the tool is built using an early version of the open standard that will provide a secure layer of tamper-evident attribution data to photos, including the author’s name, location and edit history.” The Content Credentials panel, “a palette that pops up in the Adobe UI just like brushes and other palettes,” allows the user to “turn on or off the metadata on the image.”
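For readers curious what “tamper-evident attribution data” means in practice, the sketch below is a loose illustration only, not Adobe’s implementation or the actual open-standard format the initiative is developing: an attribution manifest listing the author and edit history is bound to the image by a cryptographic hash and then digitally signed, so any later change to either the pixels or the metadata breaks verification. The function names, manifest fields, and the use of Ed25519 signatures via the third-party Python “cryptography” package are all assumptions made for the example.

```python
# Illustrative sketch only -- not Adobe's tool or the Content Authenticity
# Initiative's format. It shows the general idea of tamper-evident attribution:
# bind a metadata manifest (author, edit history) to the image bytes with a
# hash, then sign the manifest so any later change is detectable.
# Requires the third-party "cryptography" package (pip install cryptography).
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def make_manifest(image_bytes: bytes, author: str, edits: list[str]) -> dict:
    """Build an attribution manifest tied to the image via its SHA-256 hash."""
    return {
        "author": author,
        "edits": edits,  # e.g. ["crop", "neural filter: smart portrait"]
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
    }


def sign_manifest(manifest: dict, private_key: Ed25519PrivateKey) -> bytes:
    """Sign the canonical JSON form of the manifest."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    return private_key.sign(payload)


def verify(image_bytes: bytes, manifest: dict,
           signature: bytes, public_key: Ed25519PublicKey) -> bool:
    """Return True only if neither the image nor the manifest was altered."""
    if hashlib.sha256(image_bytes).hexdigest() != manifest["image_sha256"]:
        return False  # image no longer matches the manifest
    payload = json.dumps(manifest, sort_keys=True).encode()
    try:
        public_key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False  # manifest itself was tampered with


if __name__ == "__main__":
    key = Ed25519PrivateKey.generate()
    image = b"...raw image bytes..."
    manifest = make_manifest(image, "Jane Photographer", ["crop", "exposure +0.3"])
    sig = sign_manifest(manifest, key)

    print(verify(image, manifest, sig, key.public_key()))            # True
    print(verify(b"edited image", manifest, sig, key.public_key()))  # False
```

In this toy version, verification fails if the image bytes are changed after signing or if the manifest is edited; the real standard adds richer edit histories and a chain of trust for the signing identity.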