CHAPTER 7. Defending Democracy from Deepfake Deception Act of 2024
20510.
This chapter shall be known and may be cited as the Defending Democracy from Deepfake Deception Act of 2024.
20511.
The Legislature finds and declares all of the following:
(a) California is entering its first-ever generative artificial intelligence (AI) election, in which disinformation powered by generative AI will pollute our information ecosystems like never before. Voters will not know what images, audio, or video they can trust.
(b) In a few clicks, using current technology, bad actors now have the power to create a false image of a candidate accepting a bribe or a fake video of an elections official “caught on tape” saying that voting machines are not secure, or to generate the Governor’s voice telling millions of Californians their voting site has changed.
(c) In the lead-up to the 2024 presidential elections, candidates and parties are already creating and distributing deepfake images and audio and video content. These fake images or files can spread to millions of Californians in seconds and skew election results or undermine trust in the ballot counting process.
(d) The labeling information required by this bill is narrowly tailored to provide consumers with factual information about the inauthenticity of particular images, audio, video, or text content in order to prevent consumer deception.
(e) In order to ensure California elections are free and fair, California must, for a limited time before and after elections, prevent the use of deepfakes and disinformation meant to prevent voters from voting and to deceive voters based on fraudulent content. Accordingly, the provisions of this chapter are narrowly tailored to support California’s compelling interest in protecting its free and fair elections.
20512.
For purposes of this chapter, the following terms have the following meanings:
(a) “Advertisement” means any general or public communication that a large online platform knows is authorized or paid for with the purpose of supporting or opposing a candidate for elective office.
(b) “Broadcasting station” means a radio or television broadcasting station, including any of the following:
(1) Cable operator, programmer, or producer.
(2) Streaming service operator, programmer, or producer.
(3) Direct-to-home satellite television operator, programmer, or producer.
(c) “Candidate” means any person running for a voter-nominated office as defined in Section 359.5, any person running for the office of President or Vice President of the United States, and any person running for the office of Superintendent of Public Instruction.
(d) “Deepfake” means audio or visual media that is digitally created or modified such that it would falsely appear to a reasonable person to be an authentic record of the actual speech or conduct of the individual depicted in the media.
(e) “Election communication” means a general or public communication that is not an “advertisement” and that concerns any of the following:
(1) A candidate for elective office.
(2) Voting or refraining from voting in an election in California.
(3) The canvass of the vote for an election in California, as defined in subdivision (f).
(4) Voting machines, ballots, voting sites, or other property or equipment related to an election in California.
(5) Proceedings or processes of the electoral college in California.
(f) “Election in California” means any election where a candidate, as defined in this section, is on the ballot, and any election where a statewide initiative or statewide referendum measure is on the ballot.
(g) “Elections official” means either of the following persons acting in their official capacity:
(1) An elections official as defined in Section 320.
(2) The Secretary of State.
(h) “Large online platform” means a public-facing internet website, web application, or digital application, including a social media platform as defined in Section 22675 of the Business and Professions Code, video sharing platform, advertising network, or search engine that had at least 1,000,000 California users during the preceding 12 months.
(i) (1) “Materially deceptive content” means audio or visual media that is digitally created or modified, and that includes, but is not limited to, deepfakes and the output of chatbots, such that it would falsely appear to a reasonable person to be an authentic record of the content depicted in the media.
(2) “Materially deceptive content” does not include any audio or visual media that contains only minor modifications that do not significantly change the perceived contents or meaning of the content. Minor changes include changes to the brightness or contrast of images, removal of background noise in audio, and other minor changes that do not impact the content of the image or audio or visual media.
20513.
(a) Any large online platform shall develop and implement procedures for the use of state-of-the-art techniques to identify and remove materially deceptive content if all of the following conditions are met:
(1) The content is reported pursuant to subdivision (a) of Section 20515.
(2) The materially deceptive content is any of the following:
(A) A candidate for elective office portrayed as doing or saying something that the candidate did not do or say and that is reasonably likely to harm the reputation or electoral prospects of a candidate.
(B) An elections official portrayed as doing or saying something in connection with the performance of their elections-related duties that the elections official did not do or say and that is reasonably likely to falsely undermine confidence in the outcome of one or more election contests.
(C) An elected official portrayed as doing or saying something that influences an election in California that the elected official did not do or say and that is reasonably likely to falsely undermine confidence in the outcome of one or more election contests.
(3) The content is posted during the applicable time period or periods set forth in subdivision (e).
(4) The large online platform knows or acts with reckless disregard for the fact that the content meets the requirements of this section.
(b) If a post is determined to meet the requirements for removal pursuant to subdivision (a), any large online platform shall remove the post upon that determination, but no later than 72 hours after a report is made pursuant to subdivision (a) of Section 20515 in order to be in compliance with this chapter.
(c) Any large online platform shall identify, using state-of-the-art techniques, and remove, upon discovering or being alerted to the posting or reposting of, any identical or substantially similar materially deceptive content that the platform had previously removed pursuant to this chapter, provided that this removal occurs during the applicable time period or periods set forth in subdivision (e).
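As an illustration only (this text is not part of the statute), one narrow piece of the re-removal duty in subdivision (c), catching byte-for-byte identical reposts of content already removed under this chapter, can be sketched with a simple digest index. The statute does not prescribe this or any other technique, the names below are hypothetical, and detecting “substantially similar” content would require more than exact hashing.

    import hashlib

    removed_digests: set[str] = set()

    def record_removed(content: bytes) -> None:
        # Remember a digest of content removed under this chapter.
        removed_digests.add(hashlib.sha256(content).hexdigest())

    def is_identical_repost(content: bytes) -> bool:
        # True if this upload is byte-for-byte identical to content the
        # platform previously removed; "substantially similar" matches
        # (re-encodes, crops, and the like) are outside this sketch.
        return hashlib.sha256(content).hexdigest() in removed_digests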
(d) (1) Notwithstanding subparagraph (A) of paragraph (2) of subdivision (a), this section does not apply to a candidate for elective office who, during the time period set forth in subdivision (e), portrays themself as doing or saying something that the candidate did not do or say, if the digital content includes a disclosure stating the following: “This _____ has been manipulated.” The blank in this disclosure shall be filled in with whichever of the following terms most accurately describes the media:
(A) Image.
(B) Audio.
(C) Video.
(2) (A) For visual media, the text of the disclosure shall appear in a size that is easily readable by the average viewer and no smaller than the largest font size of other text appearing in the visual media. If the visual media does not include any other text, the disclosure shall appear in a size that is easily readable by the average viewer. For visual media that is video, the disclosure shall appear for the duration of the video.
(B) If the media consists of audio only, the disclosure shall be read in a clearly spoken manner and in a pitch that can be easily heard by the average listener, at the beginning of the audio, at the end of the audio, and, if the audio is greater than two minutes in length, interspersed within the audio at intervals of not greater than two minutes each.
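As an illustration only (this text is not part of the statute), the following sketch shows how a compliance tool might assemble the subdivision (d) disclosure and schedule the points at which an audio-only disclosure would be read. The function names are hypothetical, and measuring the two-minute intervals in seconds from the start of the audio is an assumption; only the disclosure wording and the beginning, end, and two-minute interval rule come from this subdivision.

    MEDIA_TERMS = ("Image", "Audio", "Video")

    def disclosure_text(media_term: str) -> str:
        # Subdivision (d)(1): fill the blank with the term that most
        # accurately describes the media (lowercasing it for the sentence
        # is a presentational choice, not a statutory requirement).
        if media_term not in MEDIA_TERMS:
            raise ValueError("term must be Image, Audio, or Video")
        return f"This {media_term.lower()} has been manipulated."

    def audio_disclosure_points(duration_seconds: float) -> list[float]:
        # Subdivision (d)(2)(B): read the disclosure at the beginning and
        # end of the audio and, if the audio is longer than two minutes,
        # at intervals of not greater than two minutes in between.
        two_minutes = 120.0
        points = [0.0]
        t = two_minutes
        while t < duration_seconds:
            points.append(t)
            t += two_minutes
        points.append(duration_seconds)
        return points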
(e) (1) Except as provided in paragraph (2), any large online platform shall remove the content to the extent required by subdivisions (a) to (c), inclusive, and any candidate for elective office shall include the disclosure required by subdivision (d), during a period beginning 120 days before an election in California and through the day of the election.
(2) If the content described in subdivision (a) depicts or pertains to elections officials, any large online platform shall remove the content to the extent required by subdivisions (a) to (c), inclusive, during a period beginning 120 days before an election in California and ending on the 60th day after the election.
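As an illustration only (this text is not part of the statute), the date arithmetic in subdivision (e) can be sketched as follows. Counting the 120-day and 60-day periods in calendar days, treating the window as inclusive of the election day, and the function names themselves are assumptions rather than statutory language.

    from datetime import date, timedelta

    def removal_window(election_day: date,
                       pertains_to_elections_officials: bool) -> tuple[date, date]:
        # Paragraph (1): 120 days before the election through the day of
        # the election.
        start = election_day - timedelta(days=120)
        end = election_day
        # Paragraph (2): for content depicting or pertaining to elections
        # officials, the window instead ends on the 60th day after the election.
        if pertains_to_elections_officials:
            end = election_day + timedelta(days=60)
        return start, end

    def removal_duty_applies(posted_on: date, election_day: date,
                             pertains_to_elections_officials: bool) -> bool:
        start, end = removal_window(election_day, pertains_to_elections_officials)
        return start <= posted_on <= end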
20514.
(a) Any large online platform shall develop and implement procedures for the use of state-of-the-art techniques to identify materially deceptive content and for labeling such content as provided in subdivision (c) if all of the following conditions are met:
(1) The content is reported pursuant to subdivision (a) of Section 20515.
(2) The materially deceptive content is either of the following:
(A) Included within subdivision (a) of Section 20513, but is posted outside the applicable time period described in subdivision (e) of Section 20513.
(B) Appears within an advertisement or election communication and is not subject to Section 20513.
(3) The large online platform knows or acts with reckless disregard for the fact that the materially deceptive content meets the requirements of this section.
(b) If a post is determined to meet the requirements for labeling pursuant to subdivision (a), any large online platform shall label the post upon that determination, but no later than 72 hours after a report is made pursuant to subdivision (a) of Section 20515 in order to be in compliance with this chapter.
(c) The label required by subdivision (a) shall state: “This _____ has been manipulated and is not authentic.” The blank in this disclosure shall be filled in with whichever of the following terms most accurately describes the media:
(1) Image.
(2) Audio.
(3) Video.
(d) The label required by subdivision (a) shall permit users to click or tap on it for additional explanation about the materially deceptive content in an easy-to-understand format.
(e) The labeling requirement set forth in subdivision (a) applies during any of the following time periods, to the extent applicable:
(1) The period beginning six months before an election in California and through the day of the election.
(2) The period beginning six months before an election in California and ending on the 60th day after the election, if the content depicts or pertains to elections officials, the electoral college process, a voting machine, ballot, voting site, or other equipment related to an election, or the canvass of the vote.
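As an illustration only (this text is not part of the statute), the label text in subdivision (c) and the time periods in subdivision (e) can be sketched as follows. Approximating “six months” as 182 calendar days and the function names are assumptions, not statutory rules.

    from datetime import date, timedelta

    SIX_MONTHS_AS_DAYS = 182  # assumption: "six months" approximated in days

    def label_text(media_term: str) -> str:
        # Subdivision (c): fill the blank with Image, Audio, or Video.
        return f"This {media_term.lower()} has been manipulated and is not authentic."

    def labeling_duty_applies(posted_on: date, election_day: date,
                              pertains_to_election_administration: bool) -> bool:
        # Paragraph (1): six months before the election through election day.
        start = election_day - timedelta(days=SIX_MONTHS_AS_DAYS)
        end = election_day
        # Paragraph (2): extended to the 60th day after the election for
        # content about elections officials, the electoral college process,
        # voting machines, ballots, voting sites, related equipment, or the
        # canvass of the vote.
        if pertains_to_election_administration:
            end = election_day + timedelta(days=60)
        return start <= posted_on <= end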
20515.
(a) A large online platform shall provide an easily accessible way for California residents to report to that platform content that should be removed pursuant to Section 20513 or labeled pursuant to Section 20514. The large online platform shall respond to the person who made the report within 36 hours of the report, describing any action taken or not taken by the large online platform with respect to the content.
(b) A candidate for elective office, elected official, or elections official who has made a report to a large online platform under subdivision (a) and who either has not received a response within 36 hours or disagrees with the response, action taken, or failure by the large online platform to take action within 72 hours, may seek injunctive or other equitable relief against the large online platform to compel the removal of specific content as required by Section 20513, labeling of specific content as required by Section 20514, or compliance with the reporting process required by subdivision (a). The plaintiff shall bear the burden of establishing the violation through clear and convincing evidence. An action under this subdivision shall be entitled to precedence in accordance with Section 35 of the Code of Civil Procedure.
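As an illustration only (this text is not part of the statute), the deadlines created by this section and by Sections 20513 and 20514 amount to simple clock arithmetic from the time a report is received; the function names below are hypothetical, and the statute does not prescribe any particular implementation.

    from datetime import datetime, timedelta

    def response_deadline(report_time: datetime) -> datetime:
        # Subdivision (a): respond to the reporter within 36 hours,
        # describing any action taken or not taken.
        return report_time + timedelta(hours=36)

    def action_deadline(report_time: datetime) -> datetime:
        # Sections 20513(b) and 20514(b): remove or label the reported
        # content no later than 72 hours after the report.
        return report_time + timedelta(hours=72)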
20516.
The Attorney General or any district attorney or city attorney may seek injunctive or other equitable relief against any large online platform to compel the removal of specific content as required by Section 20513, labeling of specific content as required by Section 20514, or compliance with the reporting process required by subdivision (a) of Section 20515. The plaintiff shall bear the burden of establishing the violation through clear and convincing evidence. An action under this section shall be entitled to precedence in accordance with Section 35 of the Code of Civil Procedure.
20517.
This chapter applies to materially deceptive content, regardless of the language used in the content. If the language used is not English, the disclosure required by subdivision (d) of Section 20513 and the label required by Section 20514 must appear in the language used as well as in English.
20518.
(a) This chapter does not preclude a large online platform from blocking, removing, or labeling any materially deceptive content outside of the time periods specified in Sections 20513 and 20514.
(b) This chapter does not preclude any online platform not subject to this chapter from blocking, removing, or labeling any materially deceptive content.
20519.
This chapter does not apply to any of the following:
(a) A regularly published online newspaper, magazine, or other periodical of general circulation that routinely carries news and commentary of general interest, and that publishes any materially deceptive content that an online platform is required to block or label based on this chapter, if the publication contains a clear disclosure that the materially deceptive content does not accurately represent any actual event, occurrence, appearance, speech, or expressive conduct.
(b) (1) A broadcasting station that broadcasts any materially deceptive content prohibited by this chapter as part of a bona fide newscast, news interview, news documentary, commentary of general interest, or on-the-spot coverage of bona fide news events, if the broadcast clearly acknowledges through content or a disclosure, in a manner that can be easily heard or read by the average listener or viewer, that the materially deceptive content does not accurately represent any actual event, occurrence, appearance, speech, or expressive conduct.
(2) A broadcasting station when it is paid to broadcast materially deceptive content and either of the following circumstances exists:
(A) The broadcasting station can show that it has prohibition and disclaimer requirements that are consistent with the requirements in this chapter and that it has provided those prohibition and disclaimer requirements to each person or entity that purchased the advertisement.
(B) Federal law requires the broadcasting station to air advertisements from legally qualified candidates or prohibits the broadcasting station from censoring or altering the message.
(c) Materially deceptive content that constitutes satire or parody.
20520.
The provisions of this chapter are severable. If any provision of this chapter or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.