North Korea–Linked Hackers Use Deepfake Video Calls to Target Crypto Workers

Hackers are using AI-generated video calls to impersonate trusted contacts and trick crypto workers into installing malware.

1/27/2026 · 5 min read

New Hacking Scheme

Cybersecurity experts report a new attack method used by hackers linked to North Korea. The attackers use AI-driven deepfake technology to create fake video calls, deceiving cryptocurrency company employees into installing malware.

These fabricated video calls look highly realistic, letting the hackers convincingly impersonate a victim's trusted contacts. Once they gain access, the attackers deploy malicious programs to steal confidential data and user funds.

This tactic is particularly dangerous because it relies on advanced artificial intelligence technologies. Moreover, hackers associated with North Korea are among the most active and sophisticated cyber threats facing the crypto industry.

How to Protect Yourself?

Experts recommend:

  • Be vigilant when receiving any suspicious video calls, even if they appear to be from known contacts
  • Verify the caller's identity and the source of the contact through a separate, trusted channel
  • Use multi-factor authentication to access corporate and personal systems
  • Regularly update software and antivirus databases

Cyberattacks using AI-generated deepfake video calls pose a serious threat to the crypto sector. Vigilance, caution, and the use of modern cybersecurity measures are key to protecting against such sophisticated attacks.
