The Danger of Deepfake Scams

The rise of artificial intelligence (AI) has enhanced our lives in many ways. In cybersecurity, AI has strengthened defenses: machine learning algorithms, improved anomaly detection, and automated response mechanisms help identify and neutralize threats quickly. However, threat actors are also using AI maliciously. One popular use case among cybercriminals is deepfake technology, which uses AI and machine learning to create realistic, deceptive videos and audio recordings. Threat actors often use deepfake videos to impersonate high-profile individuals such as executives or celebrities, convincing victims that they are communicating with a legitimate source. These convincing simulations can power many scams, such as persuading individuals to transfer funds or disclose sensitive information.
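To make the anomaly-detection idea concrete, below is a minimal, hypothetical sketch in Python (not any particular vendor's product): it flags time windows whose event counts deviate sharply from a rolling baseline, which an automated response mechanism could then act on. The event source, window size, and threshold are illustrative assumptions.

```python
# Minimal, illustrative sketch of statistical anomaly detection:
# flag time windows whose event count deviates sharply from the
# recent baseline. The data and thresholds are assumptions, not a
# production detector.
from statistics import mean, stdev

def flag_anomalies(counts_per_window, z_threshold=3.0, baseline=20):
    """Return indices of windows whose count is an outlier versus the
    preceding `baseline` windows (rolling z-score)."""
    anomalies = []
    for i in range(baseline, len(counts_per_window)):
        history = counts_per_window[i - baseline:i]
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            continue
        if (counts_per_window[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Example: failed-login counts per 5-minute window; the spike at the
# end is flagged and could trigger an automated response (e.g., lock
# the account).
counts = [4, 5, 6, 5, 4, 5, 6, 4, 5, 5, 6, 4, 5, 6, 5, 4, 5, 6, 5, 4, 48]
print(flag_anomalies(counts))  # -> [20]
```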

Recently, an advertisement featuring Ripple CEO Brad Garlinghouse has been circulating on X (formerly Twitter) and YouTube. In the ad, the deepfaked CEO tells viewers to send their XRP, the cryptocurrency and native token of Ripple, to a specific address to have it doubled as thanks for supporting the company. Garlinghouse has made no announcements about any XRP airdrop and has confirmed that the ad is a scam. Surprisingly, YouTube's scam-detection algorithms did not flag it.

Another recent deepfake scam impersonated Elon Musk. In the video, the Tesla CEO promotes the launch of an investment platform called 'Quantum AI.' The video was shared on Facebook, and the attached link leads to a website called 'financial advisors.' The deepfake appears to have been built from an interview with Musk on CNBC's YouTube channel. There is no credible news of Musk launching any 'Quantum AI' platform.

Deepfakes can be difficult to detect because of their realistic appearance, so mitigating these AI-driven scams relies on technological solutions, awareness, and vigilance. Unnatural facial expressions or inconsistencies in the audio can be a clear giveaway. If you spot a suspicious advertisement video, check that it comes from a verified profile, and implement strong authentication measures where applicable. For cryptocurrency and phishing more broadly, follow established best practices: never provide personal information in response to an unsolicited request, contact the financial institution yourself, and never give out your password over the phone. If you believe you have fallen victim to a phishing attempt and money has changed hands, your first step toward recovering lost assets is to deactivate or flag the payment method used and contact local law enforcement.
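As one concrete illustration of verifying the source, here is a minimal Python sketch that checks whether a promoted link points to a domain on a user-maintained allowlist of official sites. The allowlist entries and the lookalike domain in the example are illustrative assumptions, and passing this check alone does not prove legitimacy (as the YouTube example above shows); always confirm offers through an organization's official channels.

```python
# Minimal sketch: check a promoted link against an allowlist of
# official domains before trusting it. The allowlist below is an
# illustrative assumption; maintain your own from official sources.
from urllib.parse import urlparse

OFFICIAL_DOMAINS = {"ripple.com", "tesla.com"}

def is_trusted_link(url: str) -> bool:
    """Return True only if the URL's host is an allowlisted domain
    or a subdomain of one (e.g., www.ripple.com)."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_DOMAINS)

# Example: a lookalike domain promoted in a deepfake ad fails the check.
print(is_trusted_link("https://www.ripple.com/xrp"))        # True
print(is_trusted_link("https://ripple-airdrop-claims.io"))  # False (hypothetical scam domain)
```

The dot-boundary check in the subdomain test is deliberate: it keeps lookalike hosts such as "evilripple.com" from matching the "ripple.com" entry.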