On August 1, 2025, Governor JB Pritzker signed the Wellness and Oversight for Psychological Resources Act (the Act) into law. The Act, which passed both chambers of the Illinois General Assembly unanimously and took effect immediately, curtails the use of AI by licensed behavioral health professionals in the provision of therapy services, prohibiting its use for therapeutic communication and therapeutic decision-making while allowing its use for administrative and supplementary support services.
Overview
Facilitated through the joint efforts of the Illinois Department of Financial and Professional Regulation (IDFPR), the Illinois General Assembly, and the Illinois Chapter of the National Association of Social Workers, the Act aims to address growing concerns about the use of AI products, including chatbots, in the provision of mental health services.
The Act prohibits “licensed professionals,” defined as individuals holding an Illinois license to provide therapy or psychotherapy services, but notably excluding physicians, from using AI products to (i) make independent therapeutic decisions, (ii) directly interact with clients in any form of therapeutic communication, (iii) generate treatment plans or recommendations without the review and approval of the licensed provider, or (iv) detect a client’s emotions or mental states.
Behavioral health providers may still use AI systems for administrative support functions, such as appointment scheduling, billing and claims processing, and drafting general communications that do not include therapeutic advice. The Act also permits the use of AI for supplementary support, meaning tasks that assist a licensed professional in the delivery of therapy services (e.g., preparing therapy notes). Such uses are permitted as long as the provider maintains full responsibility for the interactions, outputs, and data use associated with the AI system. In addition, supplementary support may be used in therapy or psychotherapy only when the client's therapeutic session is recorded or transcribed and the client's consent satisfies the elements required by the Act. Providers should keep these requirements in mind when using AI systems developed by third parties, when implementing an AI tool in their practice, and when training staff on the use of such a tool.
The Act does not apply to religious counseling, peer support, self-help materials, and educational resources available to the public that do not claim to offer therapy services.
Enforcement mechanisms
The IDFPR will have the authority to investigate any actual, alleged, or suspected violation of the Act. Confirmed violations will result in civil penalties of up to $10,000 per violation, which the IDFPR will assess based on the degree of harm and the circumstances of the violation.
Unintended consequences?
Although not limited to AI-generated materials, the Act contains a confidentiality provision requiring that records kept by a behavioral health professional, as well as communications between an individual seeking therapy or psychotherapy services and the licensed professional, be kept confidential and not disclosed except as required under the Illinois Mental Health and Developmental Disabilities Confidentiality Act (Confidentiality Act). Because "record" and "communication" are not defined in the Act, these terms may be interpreted more broadly than the definition of "record" in the Confidentiality Act, which provides exceptions for a therapist's personal notes maintained outside of the medical record, de-identified information, and references to the receipt of mental health or developmental disabilities services noted in a patient's history or care summary. In addition, many of the disclosures addressed in the Confidentiality Act are permissive rather than required; providers regulated under the Confidentiality Act are permitted, but not required, to make such disclosures. By requiring that records be kept confidential and disclosed only when required under the Confidentiality Act, the new Act may significantly limit a behavioral health provider's ability to disclose this data for purposes otherwise permitted by the Confidentiality Act.
Takeaways
Behavioral health providers will need to be mindful when implementing AI strategies and products in their practices. While AI systems may be leveraged for operational efficiency in the form of administrative support or supplementary support functions, appropriate safeguards will need to be implemented to ensure that AI systems do not perform any prohibited functions, including therapeutic communications and decision-making. Providers and their teams will also need to ensure they maintain full responsibility for the use of AI products in connection with such permitted functions, which includes carefully reviewing and understanding what functions AI products perform and how they were developed. Behavioral health providers will also need to ensure that safeguards are in place to protect the confidentiality of records and communications in accordance with both the Confidentiality Act and the Act, including the potential new limitations on permissive disclosures discussed above.
For more information on the content of this alert, please contact your Nixon Peabody attorney or the authors of this alert.