22756.1.
(a) (1) Subject to paragraph (2), a deployer shall perform an impact assessment on any automated decision system before the system is first deployed and annually thereafter.
(2) (A) With respect to an automated decision system that a deployer first used prior to January 1, 2025, the deployer shall perform an impact assessment on that automated decision system before January 1, 2026, and annually thereafter.
(B) This subdivision does not require a deployer to perform an impact assessment on an automated decision system before using it if all of the following are true:
(i) The deployer uses the automated decision system only for its intended use as determined by the developer of the automated decision system.
(ii) The deployer does not make any substantial modifications to the automated decision system.
(iii) The developer of the automated decision system has performed any impact assessment on the automated decision system required by subdivision (c).
(iv) The developer of the automated decision system has provided documentation to the deployer pursuant to Section 22756.3.
(b) A deployer shall ensure that an impact assessment prepared pursuant to subdivision (a) includes all of the following:
(1) A statement of the purpose of the automated decision system and its intended benefits, uses, and deployment contexts.
(2) A description of all of the following:
(A) The personal characteristics or attributes that the automated decision system will measure or assess.
(B) The method by which the automated decision system measures or assesses those attributes or characteristics.
(C) How those attributes or characteristics are relevant to the consequential decisions for which the automated decision system will be used.
(D) The automated decision system’s outputs.
(E) How outputs are used to make, or be a substantial factor in making, a consequential decision.
(3) A summary of the categories of information collected from natural persons and processed by the automated decision system when it is used to make, or be a substantial factor in making, a consequential decision, including, but not limited to, all of the following:
(A) Each category of personal information identified by reference to the applicable subparagraph enumerated under paragraph (1) of subdivision (v) of Section 1798.140 of the Civil Code.
(B) Each category of sensitive personal information identified by reference to the applicable paragraph and subparagraph enumerated under subdivision (ae) of Section 1798.140 of the Civil Code.
(C) Each category of information related to a natural person’s receipt of sensitive services, as defined in Section 56.05 of the Civil Code, identified by reference to the specific category of sensitive service enumerated in the definition.
(4) A statement of the extent to which the deployer’s use of the automated decision system is consistent with or varies from the statement required of the developer by Section 22756.3.
(5) An analysis of the risk of algorithmic discrimination, including adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, or any other classification protected by state or federal law, resulting from the deployer’s use of the automated decision system.
(6) A description of the safeguards implemented, or that will be implemented, by the deployer to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision system. The description shall address all of the following:
(A) Whether the automated decision system could be modified to mitigate the risk of algorithmic discrimination.
(B) Whether effective accommodations can be provided for any limitations on accessibility.
(C) Whether less discriminatory procedures or methods could be employed to mitigate the risk of algorithmic discrimination.
(7) A description of how the automated decision system will be used by a natural person, or be monitored when it is used autonomously, to make, or be a substantial factor in making, a consequential decision.
(8) A description of how the automated decision system has been or will be evaluated for validity, reliability, and relevance.
(c) (1) Subject to paragraph (2), a developer shall perform an impact assessment on an automated decision system that it designs, codes, or produces before making the system available to potential deployers and annually thereafter.
(2) With respect to an automated decision system that a developer first made available to potential deployers before January 1, 2025, the developer shall perform an impact assessment on the automated decision system before January 1, 2026, and annually thereafter.
(d) A developer shall ensure that an impact assessment prepared pursuant to subdivision (c) includes all of the following:
(1) A statement of the purpose of the automated decision system and its intended benefits, uses, and deployment contexts.
(2) A description of the automated decision system’s outputs and how they are used to make, or be a substantial factor in making, a consequential decision.
(3) A summary of the categories of information collected from natural persons and processed by the automated decision system when it is used to make, or be a substantial factor in making, a consequential decision, including, but not limited to, all of the following:
(A) Each category of personal information identified by reference to the applicable subparagraph enumerated under paragraph (1) of subdivision (v) of Section 1798.140 of the Civil Code.
(B) Each category of sensitive personal information identified by reference to the applicable paragraph and subparagraph enumerated under subdivision (ae) of Section 1798.140 of the Civil Code.
(C) Each category of information related to a natural person’s receipt of sensitive services, as defined in Section 56.05 of the Civil Code, identified by reference to the specific category of sensitive service enumerated in the definition.
(4) An analysis of the risk of algorithmic discrimination, including adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, or any other classification protected by state or federal law, resulting from the deployer’s use of the automated decision system.
(5) A description of the measures taken by the developer to mitigate the risk of algorithmic discrimination arising from the use of the automated decision system.
(6) A description of how the automated decision system can be used by a natural person, or be monitored when it is used autonomously, to make, or be a substantial factor in making, a consequential decision.
(7) A description of how the automated decision system has been evaluated for validity, reliability, and relevance.
(e) A deployer or developer shall perform, as soon as feasible, an impact assessment with respect to any substantial modification to an automated decision system.
(f) This section does not apply to a deployer with fewer than 55 employees unless the deployer used an automated decision system that impacted more than 999 people during the previous calendar year.