A bipartisan group of senators has introduced a new bill designed to enhance the detection and authentication of artificial intelligence-generated content, safeguarding journalists and artists from unauthorized use of their work by AI models.
Overview of the COPIED Act
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) directs the National Institute of Standards and Technology (NIST) to develop standards and guidelines for proving the origin of content and detecting synthetic media, including through methods like watermarking. The act also requires AI tools that create or handle journalistic and creative content to attach origin information to that content and forbids removing it. Importantly, the bill stipulates that such content cannot be used to train AI models.
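To make the idea of machine-readable origin information concrete, below is a minimal sketch in Python of how a provenance record might be bound to a piece of content so that removing or tampering with it becomes detectable. It is purely illustrative: the record fields, the attach_provenance and verify_provenance helpers, and the use of a SHA-256 digest are assumptions for this example, not the standards the bill directs NIST to develop.

```python
import hashlib
import json
from datetime import datetime, timezone


def content_fingerprint(content: bytes) -> str:
    """Hash the raw content so a provenance record can be tied to it."""
    return hashlib.sha256(content).hexdigest()


def attach_provenance(content: bytes, creator: str, tool: str) -> dict:
    """Build a hypothetical provenance record bound to the content's hash.

    The field names are illustrative only, not part of any published standard.
    """
    return {
        "content_sha256": content_fingerprint(content),
        "creator": creator,
        "generating_tool": tool,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }


def verify_provenance(content: bytes, record: dict) -> bool:
    """Check that a provenance record still matches the content it describes."""
    return record.get("content_sha256") == content_fingerprint(content)


if __name__ == "__main__":
    article = b"Example news copy produced with an AI drafting assistant."
    record = attach_provenance(article, creator="Example Newsroom", tool="draft-model-v1")
    print(json.dumps(record, indent=2))
    print("record matches content:", verify_provenance(article, record))
    # Any edit to the content breaks the binding, which is what makes
    # stripping or altering provenance data detectable after the fact.
    print("after tampering:", verify_provenance(article + b"!", record))
```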
Legal Protections and Enforcement
Under the COPIED Act, content owners, including broadcasters, artists, and newspapers, can take legal action against entities that use their materials without permission or tamper with authentication markers. The bill also empowers state attorneys general and the Federal Trade Commission to enforce its provisions, which specifically prohibit removing, disabling, or tampering with content provenance information, with exceptions for certain security research purposes.
Broad Support and Legislative Context
This initiative is part of a broader legislative effort to regulate AI, spearheaded by Senate Majority Leader Chuck Schumer (D-NY), who has been advocating for a comprehensive AI roadmap. However, the details of new laws will be hashed out in individual committees. The COPIED Act is backed by influential leaders, including Senate Commerce Committee Chair Maria Cantwell (D-WA), Senate AI Working Group member Martin Heinrich (D-NM), and Commerce Committee member Marsha Blackburn (R-TN).
Industry Support
The introduction of the COPIED Act has garnered praise from various publishing and artists’ organizations, including SAG-AFTRA, the Recording Industry Association of America, the News/Media Alliance, and the Artist Rights Alliance.
“The capacity of AI to produce stunningly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members,” stated Duncan Crabtree-Ireland, SAG-AFTRA national executive director and chief negotiator. “We need a fully transparent and accountable supply chain for generative Artificial Intelligence and the content it creates to protect everyone’s basic right to control the use of their face, voice, and persona.”
With these measures, the COPIED Act aims to ensure the integrity and authenticity of digital content, providing essential protections in an increasingly AI-driven world.