Emotional scams designed to lure investments – When AI becomes a psychological manipulation machine.
by Editor CLD
23/01/2026
Cybercriminals have upgraded their scams from "manual" to "industrial" scale thanks to the support of artificial intelligence (AI). Instead of chatting with only 5-10 people at a time, a scammer can now simultaneously "flirt" with thousands of victims through AI chatbot networks on platforms like Telegram, Tinder, or Facebook Dating.
The "Pig Butchering" scam, a form of emotional manipulation designed to lure investment, has become more dangerous than ever thanks to AI automation:
Information gathering: AI scans public data on LinkedIn and Facebook to create a profile of your interests and occupation.
Personalized content: Instead of sending one generic email to 1000 people, AI composes 1000 different emails. For example: “Hi A, I see you recently attended conference X in Hanoi…”.
Absolute patience: The AI chatbot is willing to chat, wish you goodnight, and maintain conversations with you 24/7 to build trust.
Closing the deal with a real person: Only when you have completely "taken the bait" and intend to deposit money does the AI hand the conversation over to a human scammer for the final step: sending you a malicious link or inviting you to join a fraudulent investment platform.
How to identify an AI "virtual lover"
To avoid falling into the psychological trap of AI, you need to pay attention to the following signs:
Responses that are too perfect: messages are always grammatically correct, the writing style is sometimes a bit "textbook," and replies arrive instantly, day or night.
Contextual amnesia: the AI sometimes repeats stories in a pattern or forgets important details that were discussed earlier.
Constantly refusing to meet: The person always finds excuses to avoid video calls or meeting in person.
The three "Don'ts" to protect yourself
In the age of AI, online trust needs to be subjected to rigorous verification:
DO NOT blindly trust strangers: Always be suspicious of accounts that know too much about your interests and private life, which can be gleaned from publicly available data on social media.
DO NOT invest through online offers: Apply the "Zero Trust" rule – don't trust anyone online who brings up finances unless you've met them in person.
DO NOT share sensitive information: Limit the personal details you share publicly on LinkedIn or Facebook to avoid being targeted by AI profiling attacks.
Beware of the "perfect" attention on social media. A personalized scenario orchestrated by AI could cause you to lose all your assets in an instant.