Decoding the tricks used to extort money through deepfakes.

by Editor CLD

Besides using AI to scam victims through fake video calls, cybercriminals are also using AI to turn ordinary photos into blackmail tools.

Hackers harvest everyday photos (at the beach, out and about, even in modest clothing) that victims, usually women or teenagers, post publicly on Facebook or Instagram, then use AI tools to "undress" the subject or superimpose the victim's face onto a pornographic model's body.

The hacker then sends the fake image to the victim with a threat: pay up, or the fabricated photos and videos will be sent to family members and colleagues or posted on social media.

Technology used

  • “Undress” AI apps: Web tools and Telegram bots use generative AI models (such as Stable Diffusion inpainting) to predict and redraw nude body parts based on the body structure in the original image.
  • Face swapping: Tools such as DeepFaceLab and ReFace swap the victim's face onto existing explicit videos or photos with high realism and synchronized expressions.

How extortion attacks using AI-generated fake images unfold

  • Step 1: Hunting. Hackers scan publicly accessible Instagram/TikTok accounts, looking for clear photos of faces or full-body shots.
  • Step 2: Crafting. The attacker uses AI to "remove" clothing from the original photos, or finds pornographic videos featuring performers with a similar body type and uses deepfake tools to superimpose the victim's face onto the footage.
  • Step 3: Psychological attack. The attacker sends the fake photos (partially blurred or in full) to the victim, often including a list of the victim's friends to prove they are capable of actually distributing the material.
  • Step 4: Payment demand. The attacker requests a transfer via cryptocurrency or a mule account.

Identification methods

  • Abnormal anatomical details: AI-generated images often have errors in the fingers (extra/missing fingers), navel, teeth, or areas of skin adjacent to hair/clothing that appear blurry.
  • Lighting and Shadows: The direction of light on the face (original image) does not match the direction of light on the body (composite image).
  • Source of the image: The victim may recognize that the background and head pose are identical to an ordinary photo they previously posted.
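Beyond eyeballing the telltale signs above, simple image forensics can help flag composites. One common heuristic, added here as an illustration (it is not mentioned in the article), is noise inconsistency: a pasted-in region often carries a different level of sensor noise than the rest of the photo. The sketch below is a minimal pure-Python version that treats an image as a 2D list of grayscale values; the function name and the 8×8 block size are arbitrary choices for the example.

```python
import statistics

def block_noise_levels(pixels, block=8):
    """Estimate per-block noise as the variance of horizontal
    first differences (a crude high-pass filter). Blocks whose
    noise level differs sharply from their neighbors are
    candidates for spliced or AI-redrawn regions."""
    h, w = len(pixels), len(pixels[0])
    levels = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            diffs = [pixels[y][x + 1] - pixels[y][x]
                     for y in range(by, by + block)
                     for x in range(bx, bx + block - 1)]
            levels[(by, bx)] = statistics.pvariance(diffs)
    return levels
```

Real forensic tools (and real images) are far noisier than this toy version, so treat a flagged block as a reason to look closer, not as proof of tampering.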

Prevention and response

  • Prevention: Limit posting personal photos in "Public" mode. Consider using stickers to cover sensitive areas or adding a watermark with your name to photos before posting.
  • Dealing with blackmail:
    • Don't transfer money: Paying won't solve the problem; the hacker will simply demand more.
    • Stay calm and block: Block communication and temporarily lock your social media accounts.
    • Report: Report the content to the platform (Facebook/Telegram) and to the authorities.
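The name-watermark advice above is easy to automate before uploading. Below is a minimal sketch using the Pillow imaging library (an assumption; the article names no specific tool), tiling a semi-transparent name across the photo so clean crops are harder to reuse. The function name, tiling density, and opacity are illustrative choices.

```python
from PIL import Image, ImageDraw

def add_name_watermark(img, name, opacity=128):
    """Tile a semi-transparent name across the image as a
    reuse deterrent. Returns a new RGB image."""
    base = img.convert("RGBA")
    layer = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(layer)
    # Repeat the name on a coarse grid (roughly 3 x 4 tiles).
    step_x = max(base.width // 3, 1)
    step_y = max(base.height // 4, 1)
    for y in range(0, base.height, step_y):
        for x in range(0, base.width, step_x):
            draw.text((x, y), name, fill=(255, 255, 255, opacity))
    return Image.alpha_composite(base, layer).convert("RGB")
```

A visible watermark won't stop a determined attacker, but it lowers a photo's value as raw material and makes any manipulated copy easier to trace back to the original post.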
