Oregon’s AI Chatbot Bill: A Preview of the Next Regulatory Wave in AI Governance
Artificial Intelligence


Oregon’s chatbot bill, SB 1546, now awaits the signature of Gov. Tina Kotek, D-Ore., and, if signed, would become one of the most consequential pieces of AI governance legislation to date. Seeking to avoid making the “same mistakes with AI that [were] made with social media,” lawmakers included several significant provisions in the bill. Their effect is compounded by SB 1546’s private right of action for statutory damages. Whether seen as long-overdue protection for users or as a gateway to unpredictable litigation, the bill is drawing close attention from business leaders. 

AI chatbots, and indeed any AI systems capable of influencing human behavior, were always likely to push legislators and businesses into this kind of provocative governance debate. Oregon’s bill may therefore be a preview of where AI regulation is headed and of what consumer-facing organizations must prepare for. 

Governing Machine Indulgence 

After clearing the Senate in a decisive 28–2 vote, the bill now sits on Gov. Kotek’s desk, where she has five days to decide its fate. If enacted, the resulting law will include the following provisions: 

  • Mandatory disclosure: Chatbots must clearly inform users that they are speaking with an artificial system rather than a human. The lawmakers see the absence of such disclosure as manipulation and view transparency as the first line of defense. 
  • Baseline safety measures for all users: The bill requires operators to introduce safety mechanisms such as periodic break reminders and safeguards against design features that intentionally promote compulsive or addictive engagement. This provision echoes the lessons regulators believe were ignored during the rise of social media. 
  • Heightened protections when minors are suspected: If a company has “reason to believe” a user is a minor, additional safeguards must be activated. The broad wording of this provision has made it a topic of debate among experts and critics. The bill places the burden on operators to exercise caution whenever signals suggest a younger audience may be involved. 
  • Restrictions targeting addictive AI behaviors: The bill directly challenges algorithmic designs that encourage excessive dependency on chatbot interactions, reflecting a growing regulatory belief that AI companions can influence behavior in ways similar to, or potentially stronger than, traditional social platforms. 
  • A private right of action for affected individuals: Perhaps the most consequential provision: individuals who suffer an “ascertainable loss” or other tangible harm may bring lawsuits against companies that violate these safety requirements. In effect, enforcement is no longer limited to regulators; the public itself becomes a potential line of accountability. 
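To make the first two provisions concrete, here is a minimal sketch of how mandatory disclosure and periodic break reminders might surface in a chatbot wrapper. The class name, message text, and reminder threshold are all hypothetical assumptions for illustration; the bill itself prescribes no specific wording or interval.

```python
from dataclasses import dataclass

# Hypothetical strings and threshold; the bill does not specify these.
DISCLOSURE = "You are chatting with an AI system, not a human."
BREAK_REMINDER = "Reminder: consider taking a break from this conversation."
TURNS_BETWEEN_REMINDERS = 10

@dataclass
class ChatSession:
    """Wraps chatbot replies with up-front disclosure and break reminders."""
    turns: int = 0
    disclosed: bool = False

    def respond(self, reply_text: str) -> list[str]:
        messages = []
        if not self.disclosed:
            # Disclosure is delivered before the very first reply.
            messages.append(DISCLOSURE)
            self.disclosed = True
        self.turns += 1
        messages.append(reply_text)
        if self.turns % TURNS_BETWEEN_REMINDERS == 0:
            # Periodic nudge against compulsive engagement.
            messages.append(BREAK_REMINDER)
        return messages

session = ChatSession()
first = session.respond("Hello! How can I help?")
# first[0] is the disclosure; first[1] is the actual reply.
```

A production design would also need to log when each disclosure and reminder was shown, since that record is what demonstrates compliance if a claim is ever filed.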

Anticipating the Unpredictable 

All controversies aside, the Oregon bill introduces concepts that are difficult to translate into product design. It is this complexity that plaintiffs are likely to target, as was seen with CCPA litigation. Here is how businesses can prepare: 

  • Maintain an AI inventory: Organizations cannot govern what they cannot see. Companies should maintain a current inventory of AI systems used across products and services, including embedded models, third-party tools, and experimental deployments. This inventory becomes the starting point for risk assessments, governance controls, and regulatory response. 
  • Strengthen notice, disclosure, and recordkeeping: Beyond simply informing users they are interacting with AI, companies should ensure disclosures are consistent across interfaces and that interactions and governance decisions are recorded. Clear notice paired with reliable recordkeeping helps demonstrate compliance and provides defensibility if litigation arises. 
  • Prioritize transparency by design: Ensure chatbot interfaces clearly disclose that users are interacting with AI. This may seem simple, but regulators increasingly view disclosure as the foundation of responsible AI interaction, and it is often the first thing plaintiffs examine in litigation. 
  • Audit engagement mechanics: Review whether product features encourage excessive or compulsive interaction. Governance teams should work with product designers to identify patterns that could be interpreted as addictive engagement and document the safeguards introduced to mitigate them. 
  • Introduce guardrails for younger users: Companies should establish reasonable methods to identify when users may be minors and activate stricter safety measures in those scenarios. Even imperfect signals, such as age declarations or contextual indicators, can demonstrate that the company made a good-faith effort. 
  • Document governance decisions: In a world with private lawsuits, documentation becomes as important as the safeguard itself. Organizations should maintain records of risk assessments, safety reviews, and design choices that show how potential harms were considered and mitigated. 
  • Bring legal, product, and AI teams into the same room: Laws like this blur the line between engineering and compliance. Effective organizations will treat AI governance as a cross-functional discipline rather than a late-stage legal review. 
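The inventory and documentation steps above can be sketched in code. The record fields, system names, and 180-day review window below are illustrative assumptions, not requirements drawn from SB 1546 or any framework:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AISystemRecord:
    """One entry in an AI inventory; all fields are illustrative."""
    name: str
    vendor: str             # "internal" for in-house models
    user_facing: bool       # does it converse with consumers?
    minor_safeguards: bool  # are stricter protections available?
    last_risk_review: str   # ISO date of the most recent risk assessment

inventory = [
    AISystemRecord("support-chatbot", "internal", True, True, "2025-08-01"),
    AISystemRecord("fraud-scorer", "AcmeML", False, False, "2025-11-15"),
]

def stale_reviews(records, as_of, max_age_days=180):
    """Flag user-facing systems whose risk review is older than the window."""
    return [
        r.name for r in records
        if r.user_facing
        and (as_of - date.fromisoformat(r.last_risk_review)).days > max_age_days
    ]

stale_reviews(inventory, date(2026, 3, 12))  # → ['support-chatbot']
```

Even a lightweight register like this gives governance teams a starting point for the risk assessments and defensibility records the checklist calls for.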

Liability of Conversations 

Oregon’s SB 1546 is making headlines not only for the provisions it introduces, but also for the precedent it signals. By targeting behavioral influence, addictive engagement, and transparency in machine conversations, the bill pushes AI governance into new territory. For consumer-facing businesses, this marks a shift toward tighter expectations around product design, governance, and legal accountability. Organizations deploying AI systems will need to adapt quickly to navigate what is becoming an increasingly complex and fast-evolving landscape of AI regulation. 


Author

Dan Clarke
President, Truyo
March 12, 2026

Let Truyo Be Your Guide Towards Safer AI Adoption

Connect with us today