(VII) The degree to which educators have been involved in the decision to use artificial intelligence.
(ii) Anticipated and potential developments in artificial intelligence technology in education.
(B) Conduct at least six public meetings to incorporate feedback from pupils, families, and relevant stakeholders into the assessment required by subparagraph (A).
(2) (A) Identify benefits and risks associated with the use of artificial intelligence in education settings, including all of the following:
(i) The ethical, legal, and data privacy implications of artificial intelligence use in education.
(ii) Uses of artificial intelligence that support teaching and learning, including how pupils may benefit from these technologies while avoiding harm.
(iii) Uses of artificial intelligence that support the work of educators, including how educators may benefit from these technologies while avoiding harms such as deskilling.
(iv) Strategies to ensure equitable pupil access to the benefits of artificial intelligence technology.
(v) Strategies to provide effective professional development to educators with respect to artificial intelligence technology.
(vi) The role that pupil and educator consent should play in the use of artificial intelligence technologies.
(vii) Strategies to ensure that pupil and educator feedback is continuously collected and considered as artificial intelligence technologies become more widely adopted.
(viii) The impact of artificial intelligence technologies on employment and labor dynamics within the education sector, including the relationship between job enhancement and replacement.
(ix) Strategies to ensure that the adoption of artificial intelligence does not exacerbate existing inequities throughout the education system.
(B) In performing the work required by this subdivision, the working group shall solicit input from educators and pupils on their experience using the technologies identified in subparagraph (A).
(3) On or before January 1, 2026, develop guidance for local educational agencies and charter schools on the safe use of artificial intelligence in education that addresses all of the following:
(A) Academic integrity and plagiarism.
(B) Acceptable and unacceptable uses of artificial intelligence for pupils and educators.
(C) Pupil and teacher data privacy and data security.
(D) Parent and guardian access to information that pupils enter into artificial intelligence systems.
(E) Procurement of software that ensures the safety and privacy of pupils and educators, and the protection of their data.
(F) Adoption of artificial intelligence technologies that augment educators’ ability to teach pupils.
(G) Strategies to ensure that the adoption of artificial intelligence technology does not exacerbate existing inequities throughout the education system.
(H) Strategies to ensure that educators receive adequate training, fair compensation, and opportunities to offer feedback and guidance both individually and as a collective.
(4) On or before July 1, 2026, develop a model policy for local educational agencies and charter schools regarding the safe and effective use of artificial intelligence in ways that benefit, and do not harm, pupils and educators. This policy shall include all of the following topics:
(A) Academic integrity and plagiarism.
(B) Acceptable and unacceptable uses of artificial intelligence for pupils and educators.
(C) Pupil and teacher data privacy and data security.
(D) Parent and guardian access to pupil information.
(E) Procurement of software that ensures the safety and privacy of pupils and educators and their data.
(F) Effective use of artificial intelligence to support, and avoid risk to, teaching and learning.
(G) Effective practices to support, and avoid risk to, educators.
(H) Strategies to ensure equitable access to the benefits of artificial intelligence technology.
(I) Professional development strategies for educators on the use of artificial intelligence.
(5) Identify other ways in which the state can support educators in developing and sharing effective practices that minimize risk and maximize benefits to pupils and educators, including, but not limited to, establishing communities of practice on the use of artificial intelligence in education.
(6) On or before September 1, 2026, submit a report to the appropriate policy and fiscal committees of the Legislature, the Legislative Analyst’s Office, the state board, and the Department of Finance, in compliance with Section 9795 of the Government Code, on the process and products of the working group in meeting the requirements of this section, and any related findings or recommendations.
(e) The department shall post on its internet website the guidance developed pursuant to paragraph (3) of subdivision (d) and the model policy for local educational agencies and charter schools developed pursuant to paragraph (4) of subdivision (d).
(f) Implementation of this act is contingent upon an appropriation by the Legislature for these purposes in the annual Budget Act or another statute.