New Hacking Scheme
Cybersecurity experts report a new attack method used by hackers linked to North Korea. The attackers use deep-learning "deepfake" technology to fabricate video calls, deceiving cryptocurrency company employees into installing malware.
These fabricated video calls look highly realistic, allowing the hackers to successfully impersonate the victims' trusted contacts. Once they gain access, the criminals deploy malicious programs to steal confidential data and user funds.
This tactic is particularly dangerous because it relies on advanced artificial intelligence. Moreover, the hacking groups associated with North Korea are among the most active and sophisticated threat actors targeting the crypto industry.
How to Protect Yourself?
Experts recommend:
- Treat unexpected video calls with suspicion, even if they appear to come from known contacts
- Verify the caller's identity and the origin of the contact request through a separate, trusted channel
- Use multi-factor authentication to access corporate and personal systems
- Regularly update software and antivirus databases
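The multi-factor authentication recommendation above typically means pairing a password with a time-based one-time password (TOTP). As a minimal sketch of how TOTP codes are generated per RFC 6238 (using only the Python standard library; the secret below is the RFC's published test key, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, at=None):
    """Compute an RFC 6238 time-based one-time password.

    secret_b32: shared secret, base32-encoded (as shown in authenticator QR codes)
    interval:   time step in seconds (30 is the common default)
    digits:     length of the resulting code
    at:         Unix timestamp to evaluate at (defaults to the current time)
    """
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of whole time steps since the Unix epoch
    counter = int((at if at is not None else time.time()) // interval)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): low nibble of the last byte picks an offset
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_totp(secret_b32, candidate, at=None):
    """Constant-time comparison of a user-supplied code against the expected one."""
    return hmac.compare_digest(totp(secret_b32, at=at), candidate)

# RFC 6238 test vector: key "12345678901234567890" (base32 below) at time 59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # → 287082
```

Even with TOTP in place, the code protects logins, not judgment: an employee convinced by a deepfake caller can still be talked into installing malware, which is why identity verification over a separate channel remains essential.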
Cyberattacks using AI-generated deepfake video calls pose a serious threat to the crypto sector. Vigilance, caution, and the use of modern cybersecurity measures are key to protecting against such sophisticated attacks.