
Deepfake Romance Scams: The New Face of Cyber Fraud

ARPITA (BISWAS) MAJUMDER | DATE: AUGUST 01, 2025


Introduction: Love in the Time of AI

 

Imagine falling in love via video call—with someone who doesn’t exist. In the era of AI-generated media, romance scams have become frighteningly realistic, using deepfake videos and cloned voices to manipulate emotions and extract money. These are not mere catfishers; they are fraudsters armed with AI-driven emotional engineering.



A Dangerous Evolution in Scamming

 

What was once limited to text-based “catfishing” is now morphing into hyper-realistic emotional manipulation. Deepfake romance scams harness AI-generated video, voice, and chat tools to convincingly impersonate someone—and prey on victims’ hearts and wallets.

 

What Are Deepfake Romance Scams?

 

Deepfake romance scams are a modern evolution of traditional romance fraud: criminals deploy AI-generated videos, voices, profile images, and even chatbot conversations to impersonate romantic partners. They build trust, feign affection, and eventually request money under emotional pretexts, such as emergencies, investments, or travel. The emotionally intelligent fabric of these scams makes them harder to spot and vastly more believable.

 

How They Operate: Step by Step

 

Target Selection and Profiling: Scammers often pre-screen targets—widowed or lonely individuals—via social media. In Hong Kong, one ring reportedly collected $46 million using deepfake romance scams.

 

Initial Contact: A deepfake persona—perhaps a model, actor, or digital nomad—initiates chat on dating sites or through direct messages. They may use stolen photos or AI-generated faces.

 

Relationship Building: Over days or weeks, the scammer sends affectionate messages, poems, and video calls that feel real, sometimes using real-time face-swapping AI. Over time, emotional trust deepens into attachment.

 

Financial Requests: Once trust is gained, the requests begin: travel funding, medical bills, fake emergencies, or investment opportunities. Reported losses range from £17,000 to roughly $850,000 per victim.

 

Maintenance and Escalation: Scams can stretch for months or even years. Some perpetrators fabricate dramatic excuses, such as needing surgery, being kidnapped, or facing legal trouble, to extract more funds.

Victims often experience significant debt and emotional trauma that may persist even after the scam ends.



Why Deepfakes Make Romance Fraud More Potent

 

Ultra-realistic visuals and voices: With just a few seconds of video or a single image, scammers can create believable video calls or voice messages. In one widely reported case, a deepfaked video call impersonating a company’s CFO persuaded a finance worker in Hong Kong to transfer $25 million.

 

Emotionally adaptive conversations: AI chatbots built on large language models (LLMs) learn details about their targets, such as names, likes, and interests, and adapt their messaging dynamically, reinforcing a false bond.

 

Scalable personalization: Vulnerable age groups and emotional states can be targeted at scale, leveraging psychological triggers that encourage compliance.


Real Cases: Scams That Made Headlines

 

French woman & “Brad Pitt”: Lost roughly $850,000 over 18 months to AI-generated videos and texts from a fake Brad Pitt.

 

Nikki MacLeod (Scotland): A 77-year-old retired lecturer sent £17,000 to a fake offshore worker named “Alla Morgan,” who convinced her with AI-generated videos.

 

Lisa Nock (UK): Fooled by a fake Dr. Chris Brown over 2½ years, she handed over her monthly disposable income (around £40 a month) and was at one point asked for £40 million.

 

Paul Davis (UK): Received AI-generated videos of “Jennifer Aniston” professing love, and sent around £200 in Apple gift cards.

 

In each case, emotional manipulation combined with AI-generated media made detection extremely difficult.

 

Why These Scams Are Growing

 

AI tools are now accessible: Anyone can generate realistic deepfakes using open-source tools or web-based services.



Public trust: Impersonating a public figure or injecting emotional intimacy significantly lowers victims’ skepticism.

 

Low legal risk: Scammers often operate abroad, hide behind anonymized comms, and exploit legal gray areas. Victims may be too embarrassed to report.

 

Why These Scams Are So Effective


Psychological manipulation: Deepfake technology enables empathetic, tailored messaging, making victims feel uniquely understood.

 

Lower barrier to entry: Accessible AI tools allow scammers to create believable deepfakes with minimal technical skill.

 

Limited detection tools: While detection research like DeepRhythm or WaveVerify exists, most victims lack access and awareness.

 

Rapid scale: Hybrid operations using AI and human oversight allow scammers to run thousands of concurrent tailored cases.

 

Impact: Emotional, Financial, Psychological

 

Victims face:

  1. Severe financial losses, sometimes life-altering, running into the hundreds of thousands.

  2. Emotional devastation: grief, shame, depression, loss of trust.

  3. Lingering trauma, with many struggling to rebuild confidence.

 

Detection & Defense Measures


Human Awareness Still Crucial: Because automated detection tools lag behind generation tools, human intuition and skepticism remain the first line of defense, especially during video calls. Unnatural eye movements, lip-sync issues, or frozen frames can be red flags.

 

Technological Aids: Tools like Vastav AI use metadata, image forensics, and confidence scoring to flag deepfake content. Some GAN-based systems report detection accuracy above 95%.
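
Vastav AI’s internals are not public, but one metadata signal that forensic tools of this kind rely on is easy to illustrate. The Python sketch below is a minimal example under stated assumptions, not any vendor’s actual method: the file name is hypothetical, and it uses the open-source Pillow library to check whether an image carries ordinary camera EXIF data. AI-generated or scrubbed images often carry none, which is a weak hint rather than proof.

```python
from PIL import Image          # pip install Pillow
from PIL.ExifTags import TAGS

def exif_summary(path: str) -> dict:
    """Map raw EXIF tag IDs to readable names for one image file."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

tags = exif_summary("profile_photo.jpg")  # hypothetical file name
if not tags:
    # Absence of EXIF is common for AI-generated images, but also for
    # photos re-saved by social media platforms, so treat it as one signal.
    print("No EXIF metadata found.")
else:
    print("Camera:", tags.get("Model", "unknown"), "| Taken:", tags.get("DateTime", "unknown"))
```

Note that a missing camera model proves nothing on its own; forensic tools combine many such signals before producing a confidence score.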

 

Platform Policy & Regulation:

Several jurisdictions have enacted or proposed deepfake legislation, and agencies are stepping up warnings:

  1. The U.S. TAKE IT DOWN Act mandates the removal of non-consensual deepfake imagery within 48 hours.

  2. The FBI and financial institutions issue regular alerts on romance fraud, including tactics involving AI and celebrity impersonation.

 

How to Protect Yourself

 

ree

Verify their identity:

  1. Reverse-image search profile photos (a local perceptual-hash check, sketched after this list, can confirm a match once you find a second copy).

  2. Request a live video call with movement tests: ask them to turn their head, wave a hand, and change expression. Real-time deepfakes often struggle with dynamic motion.
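
When a reverse-image search turns up a second copy of a suspect photo, the comparison can also be done locally. The sketch below is a minimal illustration assuming the open-source imagehash library and hypothetical file names: perceptual hashes survive resizing and re-compression, so a small Hamming distance suggests both files derive from the same original image.

```python
from PIL import Image   # pip install Pillow imagehash
import imagehash

def same_source(photo_a: str, photo_b: str, threshold: int = 8) -> bool:
    """Compare perceptual hashes; a distance <= threshold (out of 64 bits)
    suggests both files derive from the same original photo."""
    hash_a = imagehash.phash(Image.open(photo_a))
    hash_b = imagehash.phash(Image.open(photo_b))
    return (hash_a - hash_b) <= threshold  # subtraction gives Hamming distance

# Hypothetical paths: the suspect's profile picture vs. a photo found online.
if same_source("their_profile.jpg", "found_online.jpg"):
    print("Likely the same underlying photo - possibly stolen from another account.")
```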

 

Trust your instincts:

  1. Be suspicious of declarations of love within days, or of financial appeals framed as urgent emotional crises.

  2. Check for a consistent social media history; newly created or inconsistent profiles are red flags.

 

Be cautious of financial requests:

Never send gift cards or money to strangers online. Scammers prefer payments that are nearly impossible to reverse.

 

Use detection tools where possible:

Tools like Vastav AI analyze metadata, generate heatmaps, and assign confidence scores to flag deepfakes.
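
As a rough illustration of what such a heatmap captures (this is a generic forensic technique, not Vastav AI’s actual pipeline, and the file names are hypothetical), error level analysis re-saves a JPEG at a known quality and amplifies the difference. Regions that were pasted in or synthesized often compress differently and stand out.

```python
import io
from PIL import Image, ImageChops  # pip install Pillow

def ela_heatmap(path: str, quality: int = 90) -> Image.Image:
    """Error level analysis: diff an image against a re-compressed copy of
    itself; brighter regions compress differently and merit a closer look."""
    original = Image.open(path).convert("RGB")
    buffer = io.BytesIO()
    original.save(buffer, "JPEG", quality=quality)
    recompressed = Image.open(buffer)
    diff = ImageChops.difference(original, recompressed)
    return diff.point(lambda px: min(255, px * 15))  # amplify faint differences

ela_heatmap("suspect_image.jpg").save("ela_heatmap.png")  # hypothetical file names
```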

 

Report suspicious behavior:

Contact law enforcement and report online scams to authorities such as the FBI's IC3.



Broader Implications: Beyond Individual Victims

 

Financial fallout: Scams like this fuel global fraud, contributing to losses projected into the tens of billions annually.

 

Regulatory urgency: The EU's AI Act (2024) takes a strong risk-based approach, while proposals such as India's Digital India Act aim to address AI misuse.

 

Corporate risk management: Organizations must train staff to spot deepfake calls in business contexts—executive impersonation is now a major threat.

 

The Road Ahead: Trends & Challenges

 

AI sophistication will only increase: AI-generated voice, video, photos, and chatbots will become more lifelike and harder to detect.


Detection tech must scale: Passive screening, real-time verification, and forensic watermarking will be essential.

 

Education is critical: Widespread awareness campaigns can arm older and vulnerable adults against emotional manipulation.

 

Meanwhile, cross-sector collaboration between law enforcement, tech platforms, and financial regulators is vital to prepare for scams that increasingly blend emotional trust and synthetic intimacy.



Conclusion: Love Isn’t Real Unless Verified


Deepfake romance scams are the confluence of cutting-edge AI and emotional manipulation. They blur the line between virtual affection and cybercrime—leading to real financial and emotional damage. As these scams become more personalized, widespread, and technologically convincing, vigilance is your best defense.

Protect yourself by verifying identities, resisting rapid emotional escalation, and asking questions. No matter how real it feels—it’s always worth confirming it's real.

 

Citations/References

  1. How romance scammers use deepfakes to deceive victims | McAfee AI Hub. (n.d.). McAfee. https://www.mcafee.com/ai/news/how-romance-scammers-are-using-deepfakes-to-swindle-victims/

  2. Oak, R., & Shafiq, Z. (2025, March 25). “Hello, is this Anna?”: Unpacking the Lifecycle of Pig-Butchering Scams. arXiv.org. https://arxiv.org/abs/2503.20821

  3. How to spot deepfake scams. (2024, October 30). National Council on Aging. https://www.ncoa.org/article/understanding-deepfakes-what-older-adults-need-to-know/

  4. Hannah, M. (2025, January 14). I was conned out of 17k by ‘deepfake’ girlfriend – I was completely convinced they were real… The Scottish Sun. https://www.thescottishsun.co.uk/news/14165928/ai-deepfake-romance-scam-scotland-gmb/

  5. Bîzgă, A. (n.d.). Jennifer Aniston Deepfake romance scam: Victim fooled by AI impersonation. Hot For Security. https://www.bitdefender.com/en-us/blog/hotforsecurity/jennifer-aniston-deepfake-romance-scam-victim-fooled-by-ai-impersonation

  6. March 2025 | This month in Generative AI: AI-Powered Romance Scams. (n.d.). https://contentauthenticity.org/blog/march-2025-this-month-in-generative-ai-ai-powered-romance-scams

  7. Roscoe, J. (2025, June 4). Deepfake scams are distorting reality itself. WIRED. https://www.wired.com/story/youre-not-ready-for-ai-powered-scams/

  8. Wikipedia contributors. (2025, July 15). TAKE IT DOWN Act. Wikipedia. https://en.wikipedia.org/wiki/TAKE_IT_DOWN_Act

  9. Romance scams. (2024, August 19). Federal Bureau of Investigation. https://www.fbi.gov/how-we-can-help-you/scams-and-safety/common-frauds-and-scams/romance-scams

  10. Salomon, S. (2025, August 1). What is a Deepfake and How Do They Impact Fraud? Feedzai. https://www.feedzai.com/blog/deepfake-fraud/

  11. New ReversePhone study reveals Surge in AI deepfake voice scams: The chilling reality of 2025’s most dangerous phone threat. (2025, July 2). Morningstar, Inc. https://www.morningstar.com/news/accesswire/1040998msn/new-reversephone-study-reveals-surge-in-ai-deepfake-voice-scams-the-chilling-reality-of-2025s-most-dangerous-phone-threat

  12. Synovus. (n.d.). How deepfake scams use familiar faces to scam victims. Fraud Prevention and Security Hub. https://www.synovus.com/personal/resource-center/fraud-prevention-and-security-hub/fraud-hub-education-and-prevention/latest-fraud-trends/how-deepfake-scams-use-familiar-faces-to-scam-victims/

  13. AI-Powered Romance Scams: How to spot and Avoid them. (n.d.). Security Corner. https://www.ussfcu.org/media-center/security-corner/blog-detail-security-corner.html?cId=96630&title=ai-powered-romance-scams-how-to-spot-and-avoid-them

  14. Report, K. K. C. (2025, February 11). How to not fall in love with AI-powered romance scammers. Fox News. https://www.foxnews.com/tech/how-not-fall-love-ai-powered-romance-scammers

  15. Rao, A. (2024, June 12). Deepfake romance scam raked in $46K from victim—here’s how it worked. AOL. https://www.aol.com/deepfake-romance-scam-raked-46-061210700.html

  16. Dhaliwal, J. (2025, February 12). AI chatbots are becoming romance scammers—and 1 in 3 people admit they could fall for one. McAfee Blog. https://www.mcafee.com/blogs/privacy-identity-protection/ai-chatbots-are-becoming-romance-scammers-and-1-in-3-people-admit-they-could-fall-for-one/

  17. Singh, E. (2025, July 2). I was scammed out of hundreds by ‘Jennifer Aniston’ who told me she loved me & needed cash for her ‘Apple s…’ The Sun. https://www.thesun.co.uk/news/35655414/jennifer-aniston-love-scam-ai/

  18. Costello, M. (2025, January 24). The rise of AI-Powered Deepfake Scams: Protect Yourself in 2025 - RCB Bank. RCB Bank. https://rcbbank.bank/learn-the-rise-of-ai-powered-deepfake-scams-protect-yourself-in-2025/

  19. Moseley, S. (n.d.). Automating Deception: AI’s evolving role in romance Fraud. Centre for Emerging Technology and Security. https://cetas.turing.ac.uk/publications/automating-deception-ais-evolving-role-romance-fraud


Image Citations

  1. Bîzgă, A. (n.d.). Jennifer Aniston Deepfake romance scam: Victim fooled by AI impersonation. Hot For Security. https://www.bitdefender.com/en-gb/blog/hotforsecurity/jennifer-aniston-deepfake-romance-scam-victim-fooled-by-ai-impersonation

  2. Explainers, F. (2024, October 15). How deepfake romance scammers stole $46 million from men in India, China, Singapore. Firstpost. https://www.firstpost.com/explainers/how-deepfake-romance-scammers-stole-46-million-from-men-in-india-china-singapore-13825760.html

  3. Admin. (2024, May 2). Realtime deepfake dating scams. Hands On IT Services. https://hoit.uk/it_security/realtime-deepfake-dating-scams/

  4. Salomon, S. (2025, August 1). What is a Deepfake and How Do They Impact Fraud? Feedzai. https://www.feedzai.com/blog/deepfake-fraud/

  5. Reuters. (2025, May 15). Deep love or deepfake? Dating in the time of AI. The Economic Times. https://economictimes.indiatimes.com/news/international/global-trends/deep-love-or-deepfake-dating-in-the-time-of-ai/articleshow/121188004.cms

  6. Love is in the (AI)R. (2025, February 13). Illinois State University News. https://news.illinoisstate.edu/2025/02/love-is-in-the-air/

 

About the Author

Arpita (Biswas) Majumder is a key member of the CEO's Office at QBA USA, the parent company of AmeriSOURCE, where she also contributes to the digital marketing team. With a master’s degree in environmental science, she brings valuable insights into a wide range of cutting-edge technological areas and enjoys writing blog posts and whitepapers. Recognized for her tireless commitment, Arpita consistently delivers exceptional support to the CEO and to team members.

 
 
 


