Prevent deepfake fraud attacks

Deepfakes are artificially created images, videos, and audio clips designed to imitate real people. They are produced with a form of artificial intelligence (AI) called deep learning, and bad actors can leverage the technology to commit identity theft, blackmail, and fraud. The best-known type of deepfake is the face-swap video, which transposes one person's facial movements onto someone else's features. Another is voice cloning, which copies a person's unique vocal patterns to digitally recreate and alter their speech.

Deepfakes in 2025: What’s New

AI video tools now let anyone generate realistic clips that mimic ordinary social media videos, often built from scanned faces or "cameo"-style avatars. Because visual and audio quality can be high, treat convincing media as unverified until proven otherwise. The safest approach is to verify the source and context, not the pixels.

Deepfake Techniques

  • Deepfake vishing, a type of narrowcast threat, uses cloned voices for social engineering phone calls. The technique has many applications, including identity theft, imposter scams, and fraudulent payment schemes, and a well-crafted operation can exploit the call recipient's trust in the impersonated party. Current technology enables realistic voice cloning that can be controlled in real time with keyboard inputs.
  • Fabricated private remarks, a type of broadcast threat, are deepfake video or audio clips that falsely depict a public figure making damaging comments behind the scenes. 
  • Synthetic social botnets, another broadcast threat, are a primary concern. Fake social media accounts can be constructed from synthetic photographs and text and operated by AI, enabling a range of financial harms to companies, markets, and regulators.

Scenarios Targeting Individuals

Identity Theft:

  • A deepfake-synthesized voice initiating a fraudulent wire transfer.
  • Deepfake audio or video used to open bank accounts under false identities, facilitating money laundering.
  • Deepfakes used in social engineering campaigns to gain unauthorized access to large databases of personal information; for example, a company official might receive a deepfake phone call asking for their username and password.

Imposter Scams:

  • Criminals impersonate “the government, a relative in distress, a well-known business, or a technical support expert” to pressure the victim into paying money.

Cyber Extortion:

  • Scammers often manipulate victims by threatening imminent harm unless money is paid—for example, claiming the victim or a loved one faces criminal charges or suspension of government benefits. 
  • Scammers might clone the voice of a specific individual, such as a victim’s relative or a prominent government official known to many victims. 

Remote Work Positions:

  • The use of voice spoofing, or potentially voice deepfakes, during online interviews of job applicants. In these interviews, the actions and lip movements of the person seen on camera are not fully synchronized with the audio of the person speaking.
  • The use of deepfakes and stolen Personally Identifiable Information (PII) to apply for remote positions. Some of these roles include access to customer PII, financial data, IT databases, and proprietary information.

Scenarios Targeting Companies

Payment Fraud:

  • Criminals often hack or spoof an email account of a chief executive officer (CEO) and then contact a financial officer to request an urgent wire transfer or gift card purchase. 
  • Criminals may also masquerade as trusted suppliers (using false invoices) or employees (diverting direct deposits).  
  • Deepfakes could make phone calls used in business email compromise schemes sound more authentic. 

How to assess a suspicious video or audio clip (provenance first)

  • Check provenance: Look for authenticity labels (e.g., Content Credentials/C2PA) and identify where the clip first appeared. Seek reputable corroboration (a verification sketch follows this list).
  • Verify out-of-band: If the clip requests money, credentials, or urgent action, call a known number (not in the message) to confirm.
  • Use a code word: Agree on a family or company code word in advance for urgent requests.
  • Challenge live: On live calls, ask for spontaneous actions (turn the head, move an object, say a random phrase). Real-time deepfake tools struggle with novel, multi-step prompts.
  • Then look for artifacts: mouth–consonant mismatch, odd room acoustics, inconsistent reflections/shadows, hair/jewelry edges, sudden frame resets.
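
One way to check for Content Credentials programmatically is with the open-source c2patool CLI from the Content Authenticity Initiative. The sketch below shells out to it from Python; it is a minimal illustration, assuming c2patool is installed and on the PATH, and its output format can vary by version. Remember that a missing manifest does not prove a clip is fake.

```python
import subprocess
import sys

def check_content_credentials(path: str) -> None:
    """Look for a C2PA manifest (Content Credentials) embedded in a media file.

    A valid manifest is positive evidence of provenance; a missing one
    proves nothing on its own -- fall back to out-of-band verification.
    """
    try:
        # Bare invocation prints the manifest store, if any, as JSON.
        result = subprocess.run(
            ["c2patool", path], capture_output=True, text=True, timeout=30
        )
    except FileNotFoundError:
        sys.exit("c2patool not found; install the Content Authenticity "
                 "Initiative CLI first")

    if result.returncode == 0 and result.stdout.strip():
        print("Content Credentials found; inspect the signer and edit history:")
        print(result.stdout)
    else:
        print("No Content Credentials found. Treat the clip as unverified and "
              "verify the request out-of-band before acting on it.")

if __name__ == "__main__":
    check_content_credentials(sys.argv[1])
```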

Payments Risk: Don’t Act from a Link or a Clip

  • Never pay from a link in a text/DM/email you didn’t expect. Navigate to the site or app yourself.
  • First-time recipient = extra caution: Pause and verify by phone using a known contact number.
  • If crypto is requested: Treat it as high-risk. Crypto transfers are typically irreversible; call your bank first.

For Companies: Stop Deepfake-Driven Payment Fraud

  • Out-of-band verification: Require phone verification using directory numbers for any payment change or urgent request—even if there’s a “video from the CEO.”
  • First-time recipient controls: Use caps or short holds for new payees on RTP/FedNow/Zelle/wires (a minimal sketch follows this list).
  • Staff scripts: Give frontline teams empathetic language to pause pressured payments and involve a specialist.
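
A first-time-recipient control can be as simple as a cap-or-hold rule in the payment pipeline. The sketch below is a minimal illustration: the function name, thresholds, and dispositions are hypothetical stand-ins for an institution's actual risk policy, not any payment network's API.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds -- real values come from the institution's risk policy.
NEW_PAYEE_CAP = 1_000                  # max instant release for a first payment
NEW_PAYEE_HOLD = timedelta(hours=24)   # review hold above the cap

def screen_payment(payee_id: str, amount: float,
                   known_payees: set[str], now: datetime) -> str:
    """Disposition an instant-payment request (RTP/FedNow/Zelle/wire).

    Established payees pass through; first-time payees get a cap or a
    short hold, buying time to verify a pressured, deepfake-driven
    request out-of-band before the funds move irrevocably.
    """
    if payee_id in known_payees:
        return "release"
    if amount <= NEW_PAYEE_CAP:
        return "release-with-alert"    # notify the customer, log for review
    release_at = now + NEW_PAYEE_HOLD
    return f"hold until {release_at:%Y-%m-%d %H:%M} pending phone verification"

# Example: a first payment of $5,000 to an unknown payee gets held.
print(screen_payment("payee-123", 5_000.0, {"payee-001"}, datetime.now()))
```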

How to prevent deepfake fraud

  • Vigilance: Pay close attention to suspicious voice messages or calls that may sound like someone familiar yet feel slightly off.
  • Machine learning and advanced analytics: Fight fire with fire by deploying deepfake detection and analysis.
  • Layered fraud prevention strategy: Combine checks such as identity verification, device ID, behavioral analytics, and document verification so a deepfake must defeat several independent controls at once (see the sketch after this list).
  • Zero Trust: Verify whatever you see. Double- and triple-check the source of the message, and do a reverse image search to find the original, if possible.
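
To make the layered idea concrete, the sketch below combines several independent risk signals into one decision. It is an illustrative pattern, not a product: the signal names, the 0.5 threshold, and the dispositions are all assumptions that a real deployment would tune against measured fraud rates.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    """Hypothetical per-session risk scores in [0, 1]; higher = more suspicious."""
    voice_liveness: float   # output of a deepfake/liveness detector
    device_risk: float      # device-ID reputation
    behavior_risk: float    # behavioral analytics (typing cadence, navigation)
    document_risk: float    # document-verification result

def layered_decision(s: Signals, threshold: float = 0.5) -> str:
    """Combine independent checks so no single forged artifact
    (a cloned voice, a synthetic photo) can pass the whole pipeline."""
    flags = sum(score > threshold for score in
                (s.voice_liveness, s.device_risk,
                 s.behavior_risk, s.document_risk))
    if flags == 0:
        return "allow"
    if flags == 1:
        return "step-up"            # require an additional out-of-band check
    return "deny-and-review"

# Example: a cloned voice alone triggers step-up, not outright denial.
print(layered_decision(Signals(0.9, 0.1, 0.2, 0.1)))
```

The design choice here is that one suspicious signal escalates rather than blocks, which keeps false positives manageable while still stopping an attacker who has forged only a single channel.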

