
A little over a year ago, MG was leading the relatively normal life of a twentysomething in Scottsdale, Arizona. She worked as a personal assistant and supplemented her income by waiting tables on the weekends. Like most women her age, she had an Instagram account, where she’d occasionally post Stories and photos of herself getting matcha, hanging out by the pool with friends, or going to Pilates.
“I never really cared to pop off and become popular on social media,” says MG (who is identified only as MG in the lawsuit to protect her identity). “I just used it the way most people did when it first came out, to share their lives with the people closest to them.” She has a little more than 9,000 followers—a robust following, but nowhere close to a massive platform.
Last summer, she received a DM from one of her followers. Did she know, the person asked her, that photos and videos of a woman who looked exactly like MG were circulating on Instagram? MG clicked the link and saw multiple Reels of what appeared to be her face superimposed onto a body that looked exactly like her own. The woman in the images was scantily clad, with tattoos in the same places as MG.
MG was horrified. “If you didn’t know me well, you could very well think they were images of me,” she says. “It was kind of like this reality check that I don’t have any control over my own image.”
She was even more appalled to discover, as she outlined in a recently filed complaint, that the doctored nude and scantily clad photos of her circulating on the internet were also being used to advertise AI ModelForge, a platform that teaches men how to generate their own AI influencers. In a series of online classes and tutorials, the platform’s creators allegedly taught subscribers to use software called CreatorCore to train AI models on photos of unsuspecting young women, then post the resulting content to Instagram and TikTok.
“They provided a whole playbook, including instructions on how to pick the right person so that it’s not someone who can defend themselves, so they all had instructions on what type of women to use and where to get their pictures,” she claims. “It was disgusting on every single level.”
MG is one of three plaintiffs in a lawsuit filed in January in Arizona against three Phoenix men: Jackson Webb, Lucas Webb, and Beau Schultz, as well as 50 other John Does. The lawsuit alleges that the Webbs and Schultz scoured the internet for photos of unsuspecting young women, then used AI to generate photos and videos of fictional models who look exactly like them, selling such content on the subscription platform Fanvue.
The suit further alleges that for $24.95 a month on the platform Whop, the men sold online courses teaching other men, including the John Does named in the suit, how to make their own AI-generated influencers based on real women’s photos. The men allegedly created “Blueprints” for how to scrape images from women’s social media accounts and feed them into the generative AI model on CreatorCore, as well as a separate app that would remove the women’s clothes and generate sexually explicit images and videos. Such content, the suit claims, drew millions of views, reportedly bringing in more than $50,000 in income in a single month. (The Webbs and Schultz did not respond to requests for comment.)
This moneymaking scheme, the complaint alleges, preyed on a “harem of indistinguishable AI copies of unsuspecting women and girls” while instructing “predators seeking to prey on” women on social media. According to the suit, in 2025 the CreatorCore platform had more than 8,000 subscribers generating their own AI influencers, resulting in more than 500,000 images and videos.