Adobe is releasing an open source developer toolkit that aims to prevent the spread of visual misinformation by attaching additional metadata that Adobe calls Content Credentials. The system is also designed to help content creators indelibly tag their work with authorship information. Announced in 2019, the Content Authenticity Initiative (CAI) project has released a whitepaper introducing the system, which is integrated into Adobe software. The CAI has teamed with hardware manufacturers and newsrooms to broaden adoption of its vision; The Associated Press, The New York Times and The Wall Street Journal have signed on.
“Detection [for misinformation] is going to be an arms race, and, you know, frankly, the good guys will lose,” CAI senior director Andy Parsons told TechCrunch. “We endeavored instead to double down on content authenticity, which is this idea of proving what’s real, how something was made, in cases where it makes sense, who made it,” Parsons explained.
This year sees the culmination of Adobe's collaboration with partners on the open-source implementation of the underlying C2PA standard on which the new protocol is built. Those partners include Intel, Microsoft, BBC, Twitter, Qualcomm and Sony.
The public release of this new technical spec for digital provenance “provides apps and platforms with a blueprint defining a core trust model and its application to various types of content (e.g., images, videos, audio, or documents),” Parsons wrote on the CAI blog.
Adobe first released the Content Credentials feature in late 2021, making it available to Creative Cloud subscribers across key products (notably as an opt-in for Photoshop). “These tools help establish authorship for creators, foster transparency within digital media and bolster trust in content (images, video, other digital formats) by adding robust, tamper-evident provenance data about how a piece of content was produced, edited, and published,” Parsons wrote on the blog.
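To make the idea of tamper-evident provenance concrete, the sketch below models the kind of record Content Credentials attach to an asset: a set of assertions about how the content was produced and edited, bound to the asset by a hash and sealed with a signature. The type and field names are illustrative assumptions, not the actual C2PA manifest schema.

```typescript
// Illustrative data model only; names and fields are assumptions,
// not the normative C2PA schema.

interface Assertion {
  label: string;                  // kind of statement, e.g. an edit action or authorship claim
  data: Record<string, unknown>;  // details of what happened: capture, edits, publication
}

interface ProvenanceClaim {
  generator: string;        // tool that produced the claim, e.g. an editing app
  assertions: Assertion[];  // statements about authorship and edit history
  assetHash: string;        // hash binding the claim to the exact asset bytes
}

interface ContentCredential {
  claim: ProvenanceClaim;
  signature: string;          // signature over the claim; any undocumented change breaks it
  certificateChain: string[]; // identifies who signed (creator, device or application)
}
```

Because the signature covers both the assertions and the asset hash, any later edit that is not recorded through the same tooling becomes detectable, which is what "tamper-evident" means here.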
“With the new tools, a social media platform could plug in the Adobe-provided JavaScript and quickly have all of its images and videos displaying the content credentials, which appear as a mouse-over icon in the upper-right corner,” TechCrunch reports.
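As a rough illustration of how that plug-in step might look, the sketch below reads Content Credentials from an image and surfaces them in a hover badge. It assumes the CAI's open-source JavaScript SDK exposes a loader and a read call roughly like the published `c2pa` npm package (`createC2pa`, `read`); exact option names and return shapes may differ, and the asset paths are placeholders.

```typescript
// Sketch of wiring the CAI JavaScript SDK into a page; API details
// (option names, return shape, manifest fields) are assumptions based
// on the open-source `c2pa` package and may differ in practice.
import { createC2pa } from 'c2pa';

async function showCredentials(imgUrl: string, badge: HTMLElement) {
  // The SDK does its parsing in WebAssembly; these paths are placeholders.
  const c2pa = await createC2pa({
    wasmSrc: '/assets/toolkit_bg.wasm',
    workerSrc: '/assets/c2pa.worker.js',
  });

  const { manifestStore } = await c2pa.read(imgUrl);
  const active = manifestStore?.activeManifest;

  if (active) {
    // Render a simple mouse-over badge near the corner of the image.
    badge.title = `Content Credentials: ${active.claimGenerator ?? 'unknown tool'}`;
    badge.hidden = false;
  } else {
    badge.hidden = true; // no provenance data embedded in this asset
  }
}
```

In a real deployment the toolkit's own display components would handle the badge that TechCrunch describes; the manual badge handling above is only a stand-in to show where the data comes from.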
A digital chain of custody will also help with validation of content for NFT owners and creators who have had their work misappropriated, an increasing problem that also affects graphic artists. The C2PA specification provided by Adobe and its partners also describes how that provenance information is presented and stored, and how verification is accomplished.
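How verification is accomplished can be pictured with a simplified check: verify the signature on the claim against the signer's public key, then recompute the hash of the asset bytes and compare it with the hash the claim recorded. This is a conceptual sketch of that idea, not the normative C2PA validation procedure; the types, helper names and use of Node's crypto module are assumptions.

```typescript
// Conceptual verification sketch (not the C2PA validation algorithm):
// a credential holds up only if the signature on the claim checks out
// and the asset still hashes to the value the signer committed to.
import { createHash, createVerify } from 'node:crypto';

interface Claim {
  assetHash: string;    // hex SHA-256 the signer recorded for the asset bytes
  assertions: string[]; // simplified stand-in for edit/authorship assertions
}

interface Credential {
  claim: Claim;
  signature: Buffer;          // signature over the serialized claim
  signerPublicKeyPem: string; // public key from the embedded certificate chain
}

function verifyCredential(assetBytes: Buffer, cred: Credential): boolean {
  // 1. Integrity of the claim: the signature must cover the whole claim,
  //    including the recorded asset hash (serialization simplified here).
  const claimBytes = Buffer.from(JSON.stringify(cred.claim));
  const verifier = createVerify('sha256');
  verifier.update(claimBytes);
  if (!verifier.verify(cred.signerPublicKeyPem, cred.signature)) return false;

  // 2. Hard binding: the asset must still hash to the value in the claim.
  const actualHash = createHash('sha256').update(assetBytes).digest('hex');
  return actualHash === cred.claim.assetHash;
}
```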
Parsons told TechCrunch the CAI has also received interest from companies that produce synthetic images and videos along the lines of what's possible with the DALL-E artificial intelligence program developed by OpenAI.