RI S2195
Bill
AI Summary
- Requires AI companion operators to implement protocols addressing user expressions of suicidal ideation, self-harm, potential physical harm to others, and potential financial harm to others, including referrals to crisis services such as suicide hotlines
- Mandates that operators provide notifications at the start of each interaction and every 3 hours thereafter stating, in bold, capitalized 16-point type, that the AI companion is a computer program incapable of human emotion
- Defines "AI companion" as a system using AI and emotional recognition algorithms to simulate social interaction, excluding business customer service chatbots
- Creates a private right of action in superior court for individuals physically or financially harmed by violations, and empowers the Attorney General to investigate and seek injunctions against noncompliant providers
- Takes effect January 1, 2027
Legislative Description
Creates additional safety features for AI companion technology, including protocols for addressing suicidal ideation and potential physical or financial harm to others expressed by a user. It also requires notification that the AI companion does not have human emotions.
Commercial Law
Last Action
Introduced, referred to Senate Artificial Intelligence & Emerging Technol
1/23/2026