AB-1545 Children: internet safety: platform operators.(2021-2022)

Date Published: 04/12/2021 09:00 PM

Amended in Assembly April 12, 2021

CALIFORNIA LEGISLATURE— 2021–2022 REGULAR SESSION

Assembly Bill
No. 1545


Introduced by Assembly Member Wicks

February 19, 2021


An act to add Title 1.81.7 (commencing with Section 1798.300) to Part 4 of Division 3 of the Civil Code, relating to business.


LEGISLATIVE COUNSEL'S DIGEST


AB 1545, as amended, Wicks. Children: internet safety: platform operators.
Existing law, the California Consumer Privacy Act of 2018 (CCPA), grants consumers certain rights in regard to businesses, as defined, that collect personal information about them, including the right to know what information is collected and the right to opt out of the sale of that information. Existing law, the California Privacy Rights Act of 2020, approved by the voters as Proposition 24 at the November 3, 2020, statewide general election, among other changes in the CCPA, establishes the California Privacy Protection Agency and vests it with full administrative power, authority, and jurisdiction to implement and enforce the CCPA. Existing law prohibits specified unfair, dishonest, deceptive, destructive, fraudulent, and discriminatory practices by which fair and honest competition is destroyed or prevented.
Existing law prohibits a business from selling personal information of consumers that the business knows are less than 16 years of age unless that sale is affirmatively authorized, as specified. Existing law, commencing January 1, 2023, also prohibits a business from sharing personal information of consumers that the business knows are less than 16 years of age unless that sharing is affirmatively authorized, as specified.
This bill would enact the Kids Internet Design and Safety Act for purposes of keeping children safe and protecting their interests on the internet. The bill would, among other things, prohibit an operator of a platform, as defined, from incorporating certain features with respect to content viewable by a covered user, as defined, without first obtaining consent from the parent or guardian of the covered user, including an auto-play setting that, without input from the covered user, commences additional video content directly following the video content initially selected by the covered user.
This bill would prohibit an operator of a platform from promoting, amplifying, or otherwise encouraging the consumption of content or advertising that involves, among other things, sexual material. The bill would provide that a violation of these provisions constitutes unfair competition.
This bill would require an operator of a platform to, among other things, allow a parent or guardian to create an account for that person’s child who is under 13 years of age. The bill would, beginning January 1, 2026, require the Attorney General to conduct an annual audit process for purposes of evaluating the level of compliance with these provisions. The bill would require the annual audit to be conducted for 10 of the platforms that have the highest total number of covered users in the previous calendar year, as specified.
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: YES   Local Program: NO  

The people of the State of California do enact as follows:


SECTION 1.

 The Legislature finds and declares all of the following:
(a) Children increasingly consume digital entertainment on the internet and are uniquely susceptible to manipulation online given their lack of important neurological and psychological mechanisms that develop later in adulthood.
(b) Artificial intelligence, machine learning, and other complex systems are used to make continuous decisions about how online content for children can be personalized to increase engagement.
(c) Online companies gather, analyze, and use data for behavioral marketing directed at children.
(d) Companies employ sophisticated strategies, including neuromarketing, to affect consumer behavior and manipulate decisionmaking.
(e) Branded content in various forms of multimedia, including native advertising and influencer marketing, exposes children to marketing that is inherently manipulative or purposely disguised as entertainment or other information.

SEC. 2.

 Title 1.81.7 (commencing with Section 1798.300) is added to Part 4 of Division 3 of the Civil Code, to read:

TITLE 1.81.7. Kids Internet Design and Safety Act

1798.300.
 (a) This title shall be known, and may be cited, as the Kids Internet Design and Safety Act or the KIDS Act.
(b) The purpose of this title is to keep children safe and protect their interests on the internet.

1798.302.
 For purposes of this title:

(a) “Content” means streaming media in which the data from a video file is continuously delivered via the internet to a remote user allowing a video to be viewed online without being downloaded on a host computer or device.
(b) “Covered user” means a natural person under 13 years of age who is a California resident and who is logged into an account that meets both of the following criteria:
(1) The account was created by the person’s parent or guardian.
(2) The account explicitly identifies the primary user as a person under 13 years of age based on information provided by the parent or guardian who created the account.

(c) “Directed to children” means any of the following:
(1) Content that a reasonable person would believe was intended to appeal primarily to children under the age of 13.
(2) A channel consisting mainly of content that a reasonable person would believe was intended to appeal primarily to children under the age of 13.
(3) Content viewed by a covered user.
(f) “Operator” means any person that operates a platform, including any person offering products or services for sale through that platform, involving commerce within the state.

(h) “Platform” means an internet website, online service, online application, or mobile application that is operated for commercial purposes and that provides content that is directed to children.

1798.303.
 (a) (1) An operator of a platform shall not incorporate any of the following features with respect to content viewable by a covered user without first obtaining consent from the parent or guardian of the covered user:

(A) An auto-play setting that, without input from the covered user, commences additional video content directly following the video content initially selected by the covered user.

(B) Push alerts that are not for safety or security purposes.

(C) A display of the quantity of positive engagement or feedback that a covered user has received from other users.

(D) Any design feature or setting that allows a covered user to make purchases, submit content, or communicate with other individuals on the platform.

(2) An operator of a platform shall not display to a covered user advertising related to alcohol, tobacco, or products containing nicotine.

(b) An operator of a platform shall not, through content directed to children, promote, amplify, or otherwise encourage the consumption of content or advertising that involves any of the following:
(1) Sexual material.
(2) Physical or emotional violence, including bullying.
(3) Adult activities.
(c) An operator of a platform with content directed to children shall do all of the following:
(1) Allow a parent or guardian to create an account for that person’s child who is under 13 years of age.
(2) Provide a parent or guardian with parental controls that enable the parent or guardian to filter and block content viewable by the covered user for whom the parent or guardian created an account.
(3) Incorporate visual indicators that distinguish commercial content from noncommercial content.
(4) Publish and maintain a publicly accessible digital record of the content viewable or playable by a covered user.
(d) An operator of a platform shall implement a mechanism for users to report to the platform suspected violations of this section.

1798.304.
 The Attorney General shall do all of the following:
(a) (1) On or before June 1, 2027, and annually thereafter, conduct an audit of platforms to determine compliance with this title.
(2) The Attorney General shall audit 10 of the platforms that have the highest total number of covered users in the previous calendar year.
(3) The Attorney General may contract with a private entity to conduct, or assist with conducting, the audit required by this subdivision.
(b) Adopt regulations as necessary to implement this title.
(c) This section shall become operative on January 1, 2026.

1798.305.
 A violation of Section 1798.303 shall constitute unfair competition pursuant to Chapter 5 (commencing with Section 17200) of Part 2 of Division 7 of the Business and Professions Code.