Harassing women on the Internet is nothing new, but the rise of powerful AI tools means the barriers to entry have pretty much vanished. A teenager with a laptop and a prompt can now create a convincing nude of a classmate before lunch—and share it with the entire school before the final bell. That dizzying shift, from niche corners of Reddit to every timeline on X or Instagram, is why journalist Kat Tenbarge joined host Cory Corrine on The Intersect.
Corrine frames the problem in stark terms, warning that “to be a woman and exist online right now means living with the constant possibility that your image can be taken, twisted, and used against you.” Tenbarge, who has covered tech-enabled abuse for years, agrees, noting, “what we’re seeing with nonconsensual AI content isn’t trolling. It’s actually widespread targeted harassment.” The pair isn’t just saying this for clicks; they’re sounding an alarm about a threat that moved from shadowy forums to mainstream feeds in record time.
Tenbarge also spotlights sparks of resistance—fans flooding report buttons, school districts taking a public stand—that prove collective outrage can still outpace code. So, how does AI supercharge online abuse? And how is real-world pushback finally gaining some traction?
Tenbarge traces today’s crisis back to 2018, when people “started seeing [deepfakes] on platforms like Reddit.” Back then, creators needed serious coding chops. Now, she explains, “as generative AI evolved and became more sophisticated, it became easier for people to make this type of content.”
The proof? Apps that promise to “undress anyone” with a single upload, plus the Brooke AB incident that went viral after trolls asked X’s chatbot Grok to splash “glue” across the gamer’s selfie, turning it into a sexually explicit meme in seconds.
When victims stumble across doctored nudes of themselves, Tenbarge says, “your brain doesn’t know that it’s not real” and processes the image like an actual assault. The trauma doesn’t stop at shock—perpetrators often “send this material to their boss, using it to humiliate women at scale.” From reputation damage to lost gigs, the ripple effect can trash a career long before HR verifies the image is fake.
Corrine spells out the core problem: “the scale of synthetic, non-consensual sexual content is moving faster than platforms or laws can respond.”
Even landmark legislation such as the federal Take It Down Act offers little comfort. Tenbarge notes that civil claims for "reputational damages" or "infliction of emotional distress" require victims to first unmask anonymous offenders, which is "much easier said than done" online. And suing the company behind the AI model itself is a legal twilight zone.
Meanwhile, X’s quick fix after the Brooke AB fiasco was to block the word “glue,” a tactic Corrine calls “very hand-to-hand combat” that hardly addresses the bigger issue.
Still, hope bubbles up whenever women organize. Tenbarge recalls how "fans of Taylor Swift were rallying together and reporting the images en masse" when fake AI porn of the singer exploded on X, forcing the platform to temporarily disable searches for Swift's name. She sees similar momentum in smaller communities. "Women are saying this is unacceptable," she says. "We're gonna redefine the social norms around this behavior."
That culture-first approach might be the fastest fix. "If, as a culture, we agree that this is unacceptable and that there will be consequences," Tenbarge says, "I think that gives me a sense of optimism that we can actually do something about this."
Corrine closes the episode on a rallying note: the most effective resistance she has seen has come not from top-down orders but from women themselves. Technology will continue to advance, and legislation will continue to play catch-up. But every mass report, every public statement, every school board that says "not on our watch" chips away at the idea that AI-powered abuse is just the cost of being online.
In other words, the robots may be tireless, but so are women on the Internet, and they fight back with receipts, solidarity, and a hard stop on shame.