1798.303.
(a) (1) An operator of a platform shall not incorporate any of the following features with respect to content viewable by a covered user without first obtaining consent from the parent or guardian of the covered user:
(A) An auto-play setting that, without input from the covered user, commences additional video content directly following the video content initially selected by the covered user.
(B) Push alerts that are not for safety or security purposes.
(C) A display of the quantity of positive engagement or feedback that a covered user has received from other users.
(D) Any design feature or setting that allows a covered user to make purchases, submit content, or communicate with other individuals on the platform.
(2) An operator of a platform shall not display to a covered user advertising related to alcohol, tobacco, or products containing nicotine.
(b) An operator of a platform shall not, through content directed to children, promote, amplify, or otherwise encourage the consumption of content or advertising that involves any of the following:
(1) Sexual material.
(2) Physical or emotional violence, including bullying.
(3) Adult activities.
(c) An operator of a platform with content directed to children shall do all of the following:
(1) Allow a parent or guardian to create an account for that person’s child who is under 13 years of age.
(2) Provide a parent or guardian with parental controls that enable the parent or guardian to filter and block content viewable by the covered user for whom the parent or guardian created an account.
(3) Incorporate visual indicators that distinguish commercial content from noncommercial content.
(4) Publish and maintain a publicly accessible digital record of the content viewable or playable by a covered user.
(d) An operator of a platform shall implement a mechanism for users to report to the platform suspected violations of this section.