
This Scammer Used an AI-Generated MAGA Girl to Grift ‘Super Dumb’ Men

CitrixNews Staff

Like many medical school students, Sam was broke.

The 22-year-old aspiring orthopedic surgeon from northern India got some money from his parents, but he says he spent most of it subsidizing his licensing exams, and he’s still saving up to hopefully emigrate to the US after graduation. So he started searching for ways to make additional money online.

Sam, who requested a pseudonym to avoid jeopardizing his medical career and immigration status, tried a few things, with varying degrees of legitimacy and success. He made YouTube shorts and sold study notes to other med students. It wasn’t until he started scrolling through his Instagram feed that he landed on an idea: Why not make an AI-generated girl using Google Gemini’s Nano Banana Pro and sell bikini photos of her online?

But when Sam started posting generic photos of a beautiful, scantily clad woman on Instagram, he was dismayed to find that none of the content was hitting. He turned to Gemini for advice. “If you create a generic ‘hot girl,’ you’re competing with a million other models,” it said, according to a transcript Sam provided to WIRED.

Sam says he presented Gemini with a few possible options to help his model stand out, and the chatbot selected one in particular: the “MAGA/conservative niche,” referring to it as a “cheat code.” Plus, it said, “the conservative audience (especially older men in the US) often has higher disposable income and is more loyal.” (A representative for Gemini said, “Gemini is designed not to give a particular opinion unless you tell it to. Instead, it is designed to offer neutral responses that don't favor any political ideology or viewpoint.”)

So last January, Sam created Emily Hart, a registered nurse and Jennifer Lawrence look-alike. On an Instagram account for Emily, @emily_hart.nurse, Sam posted photos of her ice fishing, drinking Coors Light, and shooting off a few rounds at the rifle range, with emoji-laden captions like “If you want a reason to unfollow: Christ is king, abortion is murder, and all illegals must be deported,” and “POV: You were assigned intelligent at birth, but you identify as liberal <clown emoji>.”

Though Sam has never lived in the United States, he became an assiduous student of MAGA ideology. “Every day I’d write something pro-Christian, pro-Second Amendment, pro-life, anti-abortion, anti-woke, and anti-immigration,” he tells me.

The grift seemed almost too obvious, but to Sam’s astonishment, he says the account “blew up.”

“Every Reel I posted was getting 3 million views, 5 million views, 10 million views. The algorithm loved it,” he claims. Within a month, Emily Hart had more than 10,000 Instagram followers, many of whom also subscribed to her softcore AI-generated content on the OnlyFans competitor Fanvue. And between Fanvue subscriptions and selling MAGA-themed T-shirts (one sample slogan reads “PTSD: Pretty Tired of Stupid Democrats”), Sam estimates he was making a few thousand dollars a month.

“I was spending maybe 30 to 50 minutes of my day, and I was making good money for a medical student,” he says. “In India, even in professional jobs, you can't make this amount of money. I haven’t seen any easier way to make money online.”

Emily Hart is one of a slew of AI-generated hot girl MAGA influencers inundating social media, thanks to technologically savvy young men like Sam capitalizing both on pro-Trump sentiment and Americans’ relative lack of digital literacy.

The influencers are created from a specific template: they tend to be white and blonde, with jobs as emergency responders. (A lot of them are cops, firefighters, or EMTs.) They also incorporate right-wing views into all of their content, railing about immigration or the Epstein files or pronouns while posing in American flag bikinis or MAGA hats—often both.

Valerie Wirtschafter, a fellow at the Brookings Institution studying emerging tech and democracy, says while the trend of fake profiles isn’t new, “AI has made them more believable, and there has perhaps been an amplification of it.”

Though many social media platforms, including Instagram, require creators to disclose if their content is AI-generated, such guidelines are enforced only in a slapdash fashion. (Emily’s posts were not labeled as AI-generated, and Sam says he was unable to monetize her account on Instagram itself.)

Female MAGA influencers tend to do well on such platforms for a few reasons. They’re a relative rarity in the MAGA movement: Unlike their Gen Z male counterparts, 18- to 29-year-old women overwhelmingly skew liberal. Young MAGA women are therefore “more attention-grabbing,” Wirtschafter says, citing the uproar over the likely AI-generated “Swifties for Trump” photo Trump posted on Truth Social during the 2024 campaign as one example.

The same logic, however, apparently does not apply to left-wing influencer accounts, as Sam learned when he created a short-lived liberal counterpart for Emily on Instagram: “Democrats know that it’s AI slop, so they don’t engage as much.” (Sam’s explanation for why MAGA influencer accounts work is blunt: “The MAGA crowd is made up of dumb people—like, super dumb people. And they fall for it.”)

The algorithm also favors controversial views, making politically polarizing content more successful. This was Sam’s experience in running Emily’s account, which he characterized as “rage bait.” Even though liberals would flock to the page to leave irate comments, they were still clicking. “It’s a win-win situation, because you’re getting engagement anyway, and your content will go viral,” he says.

Lately, he says he’s noticed that “pro-Nazi, pro-Hitler content” has been getting especially high engagement on platforms like Reels, speculating that an AI hot girl Nazi influencer “would blow up. It would just break all the records.” (When asked about this claim, a Meta spokesperson said, “We prohibit content that glorifies, supports, or represents Nazism, and we remove it when we find it.”)

In recent months, the phenomenon has attracted more notice, especially after a Washington Post article charted the rise of Jessica Foster, a leggy blonde Army service member who went viral for posting a selfie with President Donald Trump and Vladimir Putin. Though her Instagram account was clearly fake, it garnered more than a million followers in just over four months, which “Jessica Foster” appeared to capitalize on by promoting feet pics. (The account has since been taken down; an account for Foster has since been added to Fanvue.)

Another popular account, @mayflowermommy13, featured brief videos of a brunette woman in a car or in her kitchen, gazing coquettishly at the camera with captions like “If this <American flag emoji> is your Pride Flag, I want to be friends #letsMAGA.” Her followers ate it up: “Not a democrat lib in the world looks like this folks!!! Young fellas pay attention,” reads one top comment. (The account appears to have been removed after WIRED reached out to Meta for comment.)

Because OnlyFans has policies requiring AI disclosure and requiring creators to authenticate their identities before joining, those trying to profit off hot girl MAGA accounts gravitate toward OnlyFans competitors, where such policies are less rigorously enforced. Fanvue, one of the most popular options, has differentiated itself by allowing AI-generated content.

Though he did not actively promote Emily’s Fanvue account for fear of alienating her conservative MAGA fan base, Sam says he used Grok AI to generate nude photos of her and uploaded them to the platform, where Emily’s fans sent him payments for exclusive content and exchanged messages with her. “I was basically doing nothing,” he says. “And it was just flooded with money.” He says he made a few thousand dollars off the account in a few days, though he did not enjoy the interactive aspects. “Once a guy sent me a video with Emily’s nude on a tablet on a pillow, and he was basically recording himself fucking the pillow,” he says. “It was incredibly weird, but he sent me a $50 tip, so I was like, OK, do what you want.”

Few of the fans cared whether Emily was real, Sam says. This is very much in line with the psychology of the average hot girl MAGA fan, according to Wirtschafter. Whether it's plausible that a sexy blonde nurse would love Christ, ICE, and flashing her boobs for strangers is secondary to the fact that many, many people want to believe it is. “Even among some digital natives, there’s a perspective of, ‘Well, I don’t actually care if this is true. I like the sentiment of it,’” she says.

Even though platforms like Meta ostensibly require AI content to be labeled, it can often escape detection, and accounts like Emily’s continue to proliferate.

To Meta’s credit, however, @emily_hart.nurse’s life on Instagram was relatively brief. In February, Emily’s account was banned after Instagram flagged it for “fraudulent” activity, though her Facebook account is still active.

Sam says even if her account had not been banned, he probably would have stopped posting anyway. He doesn’t have any regrets about creating Emily—“I don’t feel like I was scamming people,” he says. After all, he was getting paid, and people were happy with the content he was making. But he’s moved on from the AI hot girl influencer niche. He says he needs to shift to focusing on his studies.

Originally reported by WIRED