NAIROBI, Kenya — Imagine waking up to find yourself peddling lemon balm tea as a weight-loss miracle, starring in a fake ad as a gynecologist, or worse—championing a foreign political regime you’ve never heard of. Welcome to the very real world of digital doubles gone rogue.
This is exactly what happened to Simon Lee, a South Korean actor, after he licensed his face to an AI marketing firm.
What he got in return wasn’t fame or a brand deal—it was an endless parade of cringe-worthy deepfake clips on TikTok and Instagram featuring his AI-generated twin.
“If it was a nice advertisement, it would’ve been fine,” he told AFP, “but obviously it is such a scam.”
Here’s how easy it is for your digital identity to spiral out of control—and why AI avatar deals might be the new Wild West of influencer marketing.
The appeal is obvious. AI avatars are faster, cheaper, and less risky than traditional shoots. All it takes is half a day with a green screen, a teleprompter, and a few exaggerated facial expressions.
That’s enough for companies to build a digital clone that speaks any language, with any tone, on any topic—from skincare advice to doomsday predictions.
“The performance of a real human—voice, facial movement, body language—is still superior,” said Alexandru Voica of UK-based AI firm Synthesia, “but this is close enough for most marketing.”
For actors like Adam Coy, the opportunity to license his face and voice for $1,000 seemed like easy money.
But that decision came back to haunt him when his avatar showed up in bizarre clips, including one where it claimed to be from the future announcing upcoming disasters.
Coy admitted he never imagined his image would be used this way, but his contract didn’t prohibit it. The only restrictions ruled out pornography and ads for alcohol or tobacco.
Many of these contracts dangle a few thousand euros, depending on your fame level and the duration of the licensing agreement. But what they often don’t include is clarity. Or ethics.
“Clients didn’t fully understand what they were agreeing to at the time,” said business lawyer Alyssa Malchiodi. “These contracts often include worldwide, perpetual, and irrevocable rights.”
In plain terms: you give them your face once, and they get to use it forever. Across any media.
For anything they want—so long as it doesn’t violate a narrow list of restrictions. That means your AI twin could end up promoting scams, conspiracy theories, or even foreign propaganda.
Just ask British model Connor Yeates, who licensed his image to Synthesia for €4,600. He later found out his avatar had been used to promote Burkina Faso’s President Ibrahim Traore, a military leader who seized power in a 2022 coup.
“Three years ago, a few videos slipped through,” Voica admitted, noting Synthesia has since tightened its moderation policies.
Unfortunately, newer platforms are popping up without those filters. One AFP journalist was even able to generate an avatar making outrageous claims with minimal effort.
The bigger issue? Courts and lawmakers are scrambling to keep up with this AI-fueled identity crisis.
As technology accelerates, legal frameworks are lagging behind. That leaves creators—especially those without legal teams or high-paying gigs—vulnerable to exploitation.
“These are not invented faces,” Malchiodi warned. “They’re real people, unknowingly endorsing whatever message someone’s willing to pay for.”
So the next time an AI company offers you money to “star” in ads without leaving your couch, ask yourself: is it worth becoming the face of the future—without control over what that future says?