YouTube is introducing a new tool in Creator Studio that requires creators to disclose when realistic content (content viewers could mistake for real) has been made with altered or synthetic media, including generative AI. Disclosure labels will appear in video descriptions or on the video player itself. Exceptions include clearly unrealistic content and cases where generative AI is used only for productivity purposes. The aim is to increase transparency and trust between creators and viewers. The labels will roll out gradually, and YouTube plans enforcement measures for creators who fail to disclose. YouTube is also collaborating with industry initiatives on content authenticity and is developing a privacy process for requesting the removal of AI-generated or altered content.
Discussion (3)
I wonder how accurate this will be. If something is AI-generated and the creator doesn't disclose it (which would be odd behavior), would YouTube actually be able to catch it?
Probably really hard to police at scale; my guess is they'd fall back on the community to report it.
A decent response to EU regulations as well as to the need to fight misinformation aimed at more naive audiences.