Dayton Aldrich | Credit: Courtesy

Santa Barbara authorities have successfully prosecuted a 35-year-old city resident under a new California law that criminalizes the creation, possession, and distribution of AI-generated child sexual abuse materials (CSAM). 

The case against Dayton Aldrich, who worked for the City of Santa Barbara as a community development specialist when he was arrested in August 2025, is among the first on the Central Coast to utilize the new legislation.

Assembly Bill 1831 was drafted in response to the emergence of deepfake technology and “nudify” apps capable of creating images indistinguishable from real photographs. It closed a legal loophole by bringing AI-generated materials under the existing state Penal Code provisions covering CSAM, and it is meant to combat the explosion of such illicit content across the dark web.

In 2023, the National Center for Missing and Exploited Children (NCMEC) received 4,700 reports from electronic service providers of AI-generated CSAM. In 2024, the number of tips jumped to 67,000. And in 2025, that figure skyrocketed to 1.5 million. Forty-five states now have laws on the books similar to California’s.

According to Aldrich’s arrest affidavit, Santa Barbara police received information from NCMEC that “a Santa Barbara based account was engaged in a sexually explicit text conversation with a North Carolina user regarding sexual molestation of high school students.” Investigators quickly traced the conversation to Aldrich and the IP address at his Bath Street apartment.

A review of Aldrich’s chat history on Kik, a mobile messaging app where he went by the username “onlyplayingaroundvvv,” “clearly displayed the suspect’s unnatural interest in minors,” the affidavit states. Lead investigator Eric Davis noted that Aldrich had previously worked for the Santa Barbara District Attorney’s Office as a victim program assistant and that he “immediately recognized him in the selfies from Kik.”

A full review of the content that Aldrich had created and shared uncovered “multiple images of CSAM,” Davis wrote. “For two of the victims, I was able to determine their real identities and confirm they were minors at the time of the photographs that Aldrich used to make AI-generated pornography of them.” One of the victims is a former child actress, and the other is a prolific member of the “TeenTok” community, a subculture of content created by and for teenagers on TikTok.

At the time of his arrest, Aldrich was the board president of the Winchester Canyon Gun Club and was known to keep multiple firearms at his apartment. Police surprised him in the middle of the night and took him into custody without incident. They seized more than 20 registered guns, high-capacity magazines, body armor, and approximately 10,000-15,000 rounds of ammunition.

Though Aldrich initially faced a number of felony charges, he ultimately pleaded guilty to a single count of possessing CSAM. With no prior criminal history, he was sentenced to a year in county jail, two years of probation, and a mandatory lifetime sex offender registration. Shortly after his arrest, Aldrich was fired from his city job and voted off the board of the gun club.

Assembly Bill 1831 was authored by Assemblymember Marc Berman from Northern California and co-sponsored by Ventura County District Attorney Erik Nasarenko. During trips to Sacramento to advocate for the legislation, they were accompanied by 16-year-old Ventura County resident Kaylin Hayman, a former Disney actress whose face was used by a Pennsylvania man to create explicit images and videos. “Advocating for this bill has been extremely empowering, and I am grateful to the DA’s office as well as my parents for supporting me through this process,” Hayman said at the signing. “This law will be revolutionary, and justice will be served to future victims.”
