Congress Moves on AI Chatbot Safety: What It Means for You

March 13, 2026 · Martin Bowling

Federal AI chatbot regulation just cleared its biggest hurdle

Congress is no longer watching from the sidelines. The House Energy and Commerce Committee advanced the Kids Internet and Digital Safety (KIDS) Act in a 28-24 vote, sending the most comprehensive federal AI chatbot safety package to the full House floor. Combined with the Senate’s unanimous passage of updated children’s privacy protections, federal regulation of AI chatbots is closer to reality than it has ever been.

If you run a small business with an AI chatbot, intake widget, or automated customer assistant on your website, this matters. We already covered the 78 state-level chatbot bills moving through legislatures. Now Congress is layering federal rules on top.

What the KIDS Act requires

Chairman Brett Guthrie (R-Ky.) called the KIDS Act “the most serious, comprehensive piece of legislation to address online safety to date.” It rolls together elements from multiple earlier proposals into a single package. Here are the requirements that matter for businesses:

Mandatory AI disclosure

Any chatbot that interacts with users under 17 must clearly state it is an AI system, not a human. This disclosure is required at the start of the first interaction and any time a user asks whether they’re talking to a real person.
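That two-part trigger — disclose at the start of the first interaction, and again whenever the user asks — can be sketched as a simple check. This is a minimal illustration, not legal compliance advice; the pattern list is a placeholder, and a production system would need far broader coverage of ways users ask "are you a real person?".

```python
import re

DISCLOSURE = "Hi, I'm an AI assistant, not a human. How can I help?"

# Hypothetical patterns for "am I talking to a real person?" questions.
# A real system would need much broader coverage than this sketch.
HUMAN_QUESTION = re.compile(
    r"are you (a |an )?(human|real|person|bot|ai)\b", re.IGNORECASE
)

def needs_disclosure(message: str, is_first_message: bool) -> bool:
    """Disclose at the start of the first interaction and whenever
    the user asks whether they are talking to a real person."""
    return is_first_message or bool(HUMAN_QUESTION.search(message))
```

The key design point is that disclosure is not a one-time event: the check runs on every inbound message, not just the first.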

Crisis intervention protocols

If a minor mentions suicide or self-harm, the chatbot must surface resources for the 988 Suicide and Crisis Lifeline. This is not optional — it is a specific legal obligation.

Break prompts after three hours

Chatbot providers must encourage minors to take a break after three hours of continuous interaction. California already requires something similar under SB 243, which took effect January 1, 2026.

No impersonating licensed professionals

AI chatbots cannot claim to be doctors, therapists, or other licensed professionals unless that claim is true. This targets companion chatbots that roleplay as mental health providers, but the language is broad enough to apply to any business chatbot.

Age verification at the app store level

The accompanying App Store Accountability Act would push age verification responsibilities to app stores rather than individual businesses. For web-based chatbots — the kind most small businesses use — this provision does not directly apply, but it signals where federal enforcement is heading.

Who this actually affects

The KIDS Act primarily targets companion chatbot platforms — the apps designed for ongoing social interaction with AI personas. A teenager’s suicide following conversations with a companion chatbot catalyzed much of this legislation, and the rules are written with that use case in mind.

The Act exempts providers whose chat functions are “incidental to the primary purpose of their service.” That is good news for most small businesses. If you run a plumbing company and your website has an AI intake widget that schedules appointments, you are probably not the target.

But here is the catch: the law does not define “incidental” with precision, and enforcement is handled by the FTC and state attorneys general. The safest approach is to comply with the disclosure requirements regardless of whether you think the law directly applies to you. Telling users they are talking to AI is good practice anyway.

The bigger regulatory picture

Federal legislation is only part of the story. As of early 2026, 78 chatbot-specific bills have been introduced across 27 states, and 58 related lawsuits are already in progress. A few examples:

  • Oregon SB 1546 requires chatbot disclosure and includes a private right of action with $1,000 in statutory damages per violation — meaning individual users can sue.
  • Florida SB 482 prohibits AI platforms from letting minors create accounts without parental consent.
  • Tennessee SB 1580 passed the state Senate unanimously, banning AI systems from representing themselves as mental health professionals.

President Trump’s December 2025 executive order on AI policy explicitly exempts child safety protections from federal preemption. That means even if a federal law passes, state laws can still add stricter requirements. Small businesses operating in multiple states need to track the strictest applicable rules.

What you should do now

You do not need to panic. Most small business chatbots are simple tools — scheduling, intake, FAQ answers — that fall well outside the companion chatbot use cases these laws target. But preparation is cheap, and compliance surprises are not.

Three steps to take this month

  1. Add a clear AI disclosure. Make sure your chatbot tells users it is an AI system at the start of every conversation. This is already required in California and will likely become a federal standard. A simple opening message works: “Hi, I’m an AI assistant for [your business]. How can I help?”

  2. Review your chatbot’s scope. If your AI assistant gives health advice, legal guidance, or anything that could be confused with professional counsel, rewrite those responses to include clear disclaimers. The KIDS Act bans AI from impersonating licensed professionals, and state laws are following suit.

  3. Watch your state. Check whether your state has introduced chatbot legislation. The Transparency Coalition publishes weekly updates tracking every AI bill in the country. If your state passes something like Oregon’s private right of action, the cost of non-compliance jumps significantly.

The bottom line

AI chatbot regulation is no longer theoretical. The KIDS Act is headed for a full House vote. Dozens of state bills are advancing. And the enforcement framework — FTC oversight plus state attorney general action — means violations carry real consequences.

The good news is that compliance for most small businesses is straightforward: disclose that your chatbot is AI, do not let it impersonate professionals, and build in basic safety guardrails. If your AI-powered intake tool already follows those principles, you are ahead of most businesses. If it does not, now is the time to fix it before the rules take effect.
