A counterfeit Instagram profile portraying a blonde Army soldier named Jessica Foster gathered over 1 million followers before investigators revealed it was an AI-generated persona. The account is part of a rising wave of artificial intelligence-generated military personas designed to attract followers and monetize an online presence.
The watchdog organization Military Phony, which monitors false military claims, describes this phenomenon as “digital stolen valor.” These fabricated military identities seek to earn trust and financial gain by impersonating service members.
While the federal Stolen Valor Act prohibits falsely claiming certain military honors for profit, online impersonation, even when it does not meet the legal criteria, can cause comparable harm.
Experts note that advances in AI technology are accelerating the creation of fake identities, making them more difficult to detect and increasingly effective at exploiting public trust. One such account, under the name Emily Hart, combined conservative political commentary with lifestyle posts before steering followers toward paid adult subscription services.
According to Wired magazine, the Emily Hart profile was developed by a 22-year-old medical student using the pseudonym “Sam,” who utilized AI-generated imagery to supplement income while studying. Sam employed Google’s Gemini AI to craft the persona, which was guided toward a conservative audience based on data suggesting older American men tend to financially support online creators more loyally.
The Pentagon declined to comment specifically on the trend and referred inquiries to the FBI. “As impersonating a member of the armed forces is a violation of federal law, we refer you to the FBI,” a Defense Department official said.
Legal experts clarify that the distinction between protected speech and illegal conduct often hinges on financial motives. Eugene Volokh, a law professor emeritus at UCLA, explained that falsely claiming military service purely for fame or influence generally falls under constitutional protection.
However, this protection does not extend to instances where false claims are made to obtain money or other valuables. In such cases, individuals may face civil lawsuits, enforcement actions, or criminal charges.
This legal standard applies equally to AI-generated personas.
Meta, the parent company of Instagram, mandates that users disclose AI-generated content but has not publicly detailed how this policy is enforced. Many fraudulent accounts remain active until they amass substantial followings and generate significant revenue.
Military Phony also noted that AI-generated images frequently contain inaccuracies in uniform details and rank insignia, elements that trained observers use to identify counterfeit profiles.