YouTube has added new rules requiring those uploading realistic-looking videos that are “made with altered or synthetic media, including generative AI” to label them using a new tool in Creator Studio. The new labeling “is meant to strengthen transparency with viewers and build trust between creators and their audience,” YouTube says, listing as examples of content requiring disclosure the “likeness of a realistic person,” including voice as well as image; “altering footage of real events or places”; and “generating realistic scenes” of fictional major events, “like a tornado moving toward a real town.”
“YouTube defines realistic content as anything that a viewer could ‘easily mistake’ for an actual person, event or place,” writes Engadget, noting that “if a creator uses a synthetic version of a real person’s voice to narrate a video or replaces someone’s face with another person’s, they’ll need to include a label.”
A YouTube support page explains that “for most videos, this added transparency will appear in the expanded description.”
A disclosure is also required when modifying footage of a real event or place, like “making it appear as if a real building caught fire, or altering a real cityscape to make it appear different than in reality,” YouTube details in a blog post.
Google’s YouTube Help Center offers specific examples of what requires labeling (“digitally altering a famous car chase scene to include a celebrity who wasn’t in the original movie”) and what does not (“generating or extending a backdrop to simulate a moving car”).
The new disclosure rule relies on the honor system, requiring creators to self-label using a checkbox that appears in the upload dashboard. But YouTube says it may add a label itself, “even when a creator hasn’t disclosed it, especially if the altered or synthetic content has the potential to confuse or mislead people.”
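For creators who publish programmatically rather than through the Creator Studio checkbox, the same disclosure can presumably be set at upload time. Below is a minimal Python sketch assuming the YouTube Data API v3 surfaces the checkbox as a `status.containsSyntheticMedia` field; that field name, and the OAuth setup, are assumptions here rather than details confirmed in YouTube’s announcement.

```python
# Minimal sketch: upload a video with the altered/synthetic-media
# disclosure set programmatically. Assumes the YouTube Data API v3
# exposes the Creator Studio checkbox as status.containsSyntheticMedia
# (an assumption; verify against current API docs) and that OAuth
# credentials with the youtube.upload scope are already available.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

def upload_with_disclosure(creds, path: str, title: str, description: str):
    youtube = build("youtube", "v3", credentials=creds)
    request = youtube.videos().insert(
        part="snippet,status",
        body={
            "snippet": {"title": title, "description": description},
            "status": {
                "privacyStatus": "private",
                # Self-disclosure flag for realistic altered or
                # synthetic content, per YouTube's labeling policy.
                "containsSyntheticMedia": True,
            },
        },
        media_body=MediaFileUpload(path, resumable=True),
    )
    return request.execute()
```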
In November, YouTube laid out its GenAI content policy, “essentially creating two tiers of rules: strict rules that protect music labels and artists and looser guidelines for everyone else,” writes The Verge, noting that deepfake videos “can be taken down by an artist’s label if they don’t like it.”
Private citizens will have a harder time getting misleading material pulled. “You’d have to fill out a privacy request form that the company would review,” The Verge notes, adding that “YouTube didn’t offer much about this process in today’s update,” though the company says it is continuing to work on updating its privacy policies.