CA AB1137
Bill
Status: 2/2/2026
Primary Sponsor: Maggy Krell
AI Summary
- Expands California's child sexual abuse material (CSAM) reporting requirements for social media platforms by removing the requirement that the reporting user be depicted in the material, while still requiring that the depicted individual be shown as a minor
- Mandates that social media platforms review all CSAM reports through hash matching technology, with human review required when no hash match exists and the material is not otherwise blocked
- Imposes civil penalties of up to $250,000 per day on social media companies whose CSAM reporting mechanisms are unavailable or nonfunctional, enforceable by the Attorney General, district attorneys, city attorneys, or county counsel
- Allows depicted individuals to sue noncompliant social media companies for actual damages plus statutory damages of up to $250,000 per violation, with reduced caps for platforms that block the material before litigation or participate in the National Center for Missing and Exploited Children's Take It Down service
- Requires social media platforms to submit to twice-yearly independent third-party audits and publicly release the audit reports (with trade secret redactions) to qualify for an exemption from liability for facilitating commercial sexual exploitation of minors
Legislative Description: Reporting mechanism: child sexual abuse material.
Last Action (2/2/2026): From committee: Filed with the Chief Clerk pursuant to Joint Rule 56.