
AB-2269 Personal rights: automated decision systems.(2019-2020)

SECTION 1.

 The Legislature finds and declares all of the following:
(a) State law protects the rights of all persons in a variety of contexts without discrimination on account of certain protected characteristics, such as race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, or marital status, among other characteristics, as described in Section 12940 of the Government Code.
(b) The rise of big data has raised concerns about the use of algorithmic or automated decision systems to make hiring and other workplace decisions, eligibility decisions, insurance eligibility decisions, lending decisions, and marketing decisions quickly, automatically, and fairly.
(c) If the underlying data used by an algorithm or automated decision system is biased, incomplete, or discriminatory, the decisions made using such systems have the potential to result in massive inequality.
(d) The state has a legitimate and substantial interest in ensuring that the use of automated decision systems does not result in discrimination.
(e) Therefore, the Legislature finds that it is necessary to require a review of the use of algorithmic decision systems, also known as automated decision systems (ADS), in order to detect and prevent discrimination.

SEC. 2.

 Title 1.81.8 (commencing with Section 1798.400) is added to Part 4 of Division 3 of the Civil Code, to read:

TITLE 1.81.8. Automated Decision Systems Accountability Act of 2020

1798.400. This act shall be known and may be cited as the Automated Decision Systems Accountability Act of 2020.
 For the purposes of this title, the following definitions apply:
(a) “Automated decision system” or “ADS” means a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes a decision, or facilitates human decision making, that impacts persons.
(b) “Automated decision system impact assessment” or “ADS impact assessment” means a study evaluating an ADS and the ADS’s development process, including, but not limited to, the design and training data of the ADS, for impacts on accuracy, fairness, bias, discrimination, privacy, and security that includes, at a minimum, all of the following:
(1) A detailed description of the ADS, its design, training provided on its use, its data, and its purpose.
(2) An assessment of the relative benefits and costs of the ADS in light of its purpose, taking into account relevant factors, including, but not limited to, all of the following:
(A) Data minimization practices.
(B) The duration for which personal information and the results of the ADS are stored.
(C) What information about the ADS is available to consumers.
(D) The extent to which consumers have access to the results of the ADS and may correct or object to its results.
(E) The recipients of the results of the ADS.
(3) An assessment of the risks posed by the ADS to the privacy or security of personal information of consumers and the risks that the ADS may result in or contribute to inaccurate, unfair, biased, or discriminatory decisions impacting consumers.
(4) The measures the business will employ to minimize those risks, including technological and physical safeguards.
(c) “Business” means a digital or software company that creates or distributes an ADS.
(d) “Department” means the Department of Business Oversight.
(e) “Person” means an individual, firm, association, organization, partnership, limited liability company, business trust, corporation, or public entity of any kind.
 (a) A business in California that provides a person with a program or device that uses an ADS shall do all of the following:
(1) Take affirmative steps to ensure that there are processes in place to continually test for biases during the development and usage of the ADS.
(2) Conduct an ADS impact assessment on its program or device that uses an ADS to do all of the following:
(A) Determine whether the ADS under review has a disproportionate adverse impact on a protected class, as described in Section 12940 of the Government Code. A business may contract with a third party to independently create the ADS impact assessment for the purpose of providing an additional level of credibility.
(B) Examine if the ADS in question serves reasonable objectives and furthers a legitimate interest.
(C) Compare the ADS to alternatives or reasonable modifications that may be taken to limit adverse consequences on protected classes.
(3) On or before March 1, 2022, and annually thereafter, submit a report to the department, in a format developed by the department pursuant to subdivision (b), summarizing the results of its ADS impact assessment for each program or device that uses an ADS. If a business makes any significant modification to an ADS, the business shall reconduct an ADS impact assessment and resubmit the results of that assessment to the department no later than 60 days after the modification.
(b) On or before January 1, 2022, the department shall develop a procedure, including a form, if necessary, for businesses to use in making the reports required pursuant to this section. The department also shall make general information on the reporting process accessible on its internet website on or before January 1, 2022.
(c) If a business fails to comply with this section, the department shall send a written notice to the business of its failure to comply. The business shall have 60 days from the date of the written notice in which to comply, by completing the report and submitting it to the department. Failure by a business to submit the required report shall result in a civil penalty.
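The determination required by paragraph (2)(A) above, whether an ADS has a disproportionate adverse impact on a protected class, is left to the business to operationalize; the bill does not prescribe a statistical test. One common heuristic from employment-selection analysis is the four-fifths (80 percent) rule: a group whose favorable-outcome rate falls below 80 percent of the most-favored group's rate is flagged for potential adverse impact. A minimal sketch, assuming hypothetical group labels and the four-fifths threshold as the chosen test:

```python
# Illustrative sketch only: the bill does not mandate any particular test.
# This applies the four-fifths (80%) rule to per-group favorable-outcome
# rates; group names and the example data below are hypothetical.
from collections import Counter

def selection_rates(decisions):
    """decisions: iterable of (group, favorable: bool) pairs.
    Returns each group's favorable-outcome rate."""
    totals, favorable = Counter(), Counter()
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            favorable[group] += 1
    return {g: favorable[g] / totals[g] for g in totals}

def four_fifths_flags(decisions, threshold=0.8):
    """Flag groups whose rate is below `threshold` times the best rate."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Hypothetical outcomes: group_a favored 60% of the time, group_b 30%.
decisions = (
    [("group_a", True)] * 60 + [("group_a", False)] * 40
    + [("group_b", True)] * 30 + [("group_b", False)] * 70
)
print(four_fifths_flags(decisions))
# group_b's rate (0.30) is half of group_a's (0.60), so group_b is flagged.
```

Passing such a check does not establish that an ADS is nondiscriminatory; it is one screening heuristic among the comparisons to alternatives and modifications that paragraphs (2)(B) and (2)(C) also require.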
 On or before March 1, 2022, the department shall establish an Automated Decision Systems Advisory Task Force for the purpose of reviewing and providing advice on the use of automated decision systems in business, government, and other settings. The task force shall consist of all of the following:
(a) Two representatives from advocacy organizations that represent consumers or protected classes of communities, as described in Section 12940 of the Government Code.
(b) Two members from state or local government agencies.
(c) Two representatives from digital or software companies that use or create automated decision systems.
(d) Two representatives from universities or research institutions with expertise in automated decision systems.