The Federal Bureau of Investigation has issued a stark warning about the growing threat of “deepfakes” being used in cyber extortion.
In a recent report, the FBI said that malicious actors are using deepfakes to manipulate photographs or videos, often obtained from social media accounts or the open internet, and create sexually-themed images that appear authentic.
The actors then circulate these images on social media or pornographic websites, either as part of sextortion schemes or to harass the victims.
The FBI noted that improvements in the quality, customizability, and accessibility of artificial intelligence-enabled image generators have further fueled the growth of deepfakes.
The agency said it has received reports from victims, including minors, whose photos or videos were altered to create explicit content that was then publicly circulated.
Many victims were unaware their images had been copied, manipulated, and circulated until someone else brought the content to their attention or they stumbled across it online.
Once the manipulated content is circulated, victims face significant challenges in stopping its continued spread or getting it removed from the internet.
“Malicious actors have used manipulated photos or videos with the purpose of extorting victims for ransom or to gain compliance for other demands (e.g., sending nude photos),” the FBI said.
The federal agency recommended that people exercise caution when posting or direct messaging personal photos, videos, and identifying information on social media, dating apps, and other online sites.
Moreover, people should use discretion when posting images, videos, and personal content online, particularly material involving children or their information, as it can be captured, manipulated, and distributed by malicious actors without the victims' knowledge or consent.
The FBI's other recommendations include applying privacy settings on social media accounts, running frequent online searches for personal information, using reverse image search engines, exercising caution when accepting friend requests or communicating with unknown or unfamiliar individuals, and securing online accounts with complex passwords and multi-factor authentication.
Deepfakes Used to Target Crypto Users
Recently, there have also been instances where deepfakes were used to target unsuspecting crypto users.
For instance, in May, a deepfake of Tesla and Twitter CEO Elon Musk was created to promote a crypto scam. The video contained footage of Musk from past interviews, manipulated to fit the fraudulent scheme.
Scam promoters have long resorted to deepfakes to drum up demand among potential crypto investors.
Scammers impersonate everyone from influencers and high-profile crypto figures to ordinary people in order to gain victims' trust.
Last year, Miranda, an e-commerce worker who asked not to be identified by her real name because her company had not given her permission to speak publicly, was targeted by such an attack: imposters created a deepfake video of the Melbourne woman promoting a crypto scam and published it on her Instagram account.