Crypto investors are urged to keep their eyes peeled for "deepfake" crypto scams, with the digital-doppelgänger technology constantly advancing and making it harder for viewers to separate fact from fiction.

The fast-moving nature of the crypto market means investors are under enormous pressure to quickly verify whether a video message is genuine.

The crypto industry is more "sensitive" to deepfakes than ever because "time is of the essence" in crypto decision-making, leaving less time to verify the veracity of a video, David Schwed, chief operating officer at blockchain security firm Halborn, told Cointelegraph.

According to OpenZeppelin technical writer Vlad Estoup, deepfakes use deep-learning artificial intelligence (AI) to create highly realistic content by manipulating original media, such as swapping faces in videos, photos and audio.

Estoup noted that crypto scammers often use deepfake technology to create fake videos of well-known personalities in order to execute scams.

An example of such a scam was the deepfake video of former FTX CEO Sam Bankman-Fried that circulated in November, in which the scammer used old interview footage and a voice emulator to direct users to a malicious website with the promise of "doubling their cryptocurrency."

Schwed said the volatile nature of crypto makes people nervous and prone to taking a "better safe than sorry" approach, which can make them more susceptible to falling for deepfake scams. He observed:

"If CZ posts a video saying withdrawals will be halted within the hour, would you immediately withdraw your funds, or spend hours trying to figure out if the message is genuine?"
However, Estoup believes that while deepfake technology is advancing rapidly, it is still not "indistinguishable from reality."

How to spot a deepfake: Watch the eyes
Schwed suggests that a useful way to quickly spot a deepfake is to watch how the subject blinks. If the blinking looks unnatural, there is a good chance the video is a deepfake.

This is because deepfakes are generated using image files gathered from the internet, in which the subject's eyes are usually open, Schwed explained, so the subject's blinking has to be simulated in a deepfake.
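As a rough illustration of how blink-based checks work in practice (the article does not name any specific tool; the sketch below is a hypothetical example), automated detectors commonly track the eye aspect ratio (EAR) across video frames: the ratio of eye height to eye width collapses toward zero during a blink, so an unnaturally low or absent blink rate can flag possible synthetic footage.

```python
# Hypothetical sketch: blink detection via the eye aspect ratio (EAR),
# a common liveness/deepfake heuristic. In a real pipeline the six
# (x, y) landmarks per eye would come from a face-landmark model;
# here they are hard-coded example points for illustration.
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered p1..p6.
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops toward 0
    as the eyelid closes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # two vertical distances, normalized by the horizontal eye width
    return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blinks: runs of at least min_frames consecutive frames
    where the EAR falls below the threshold."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks

# Example landmark sets (made up for demonstration)
open_eye = [(0, 0), (2, 3), (4, 3), (6, 0), (4, -3), (2, -3)]
closed_eye = [(0, 0), (2, 0.4), (4, 0.4), (6, 0), (4, -0.4), (2, -0.4)]

ears = [eye_aspect_ratio(open_eye)] * 5 \
     + [eye_aspect_ratio(closed_eye)] * 3 \
     + [eye_aspect_ratio(open_eye)] * 5
print(count_blinks(ears))  # one dip below the threshold -> 1 blink
```

A video with zero detected blinks over many seconds of footage would be suspicious, matching Schwed's observation that simulated blinking is a weak point of deepfakes. The threshold and frame counts here are illustrative, not calibrated values.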

"The best thing to do, of course, is to ask questions that only the real person could answer, such as 'What restaurant did we meet at for lunch last week?'" Schwed said.

Estoup said there is also AI software available that can detect deepfakes, and he suggested the field should see major technological improvements over time.

He also offered some age-old advice: "If it seems too good to be true, it probably is."

Related: ‘Yikes!’ Elon Musk Warns Users About Latest DeepFake Crypto Scam

Last year, Binance chief communications officer Patrick Hillmann revealed in an August blog post that scammers had created a sophisticated deepfake of him.

Hillmann noted that the deepfake, created using footage of his past news interviews and TV appearances, had "fooled some of the most intelligent crypto community members."

He only realized what had happened when he began receiving online messages thanking him for talking to project teams about potentially listing their assets on Binance.com.

Earlier this week, blockchain security firm SlowMist reported that there were 303 blockchain security incidents in 2022, with 31.6% of them caused by phishing, rug pulls and other scams.

Source: Cointelegraph
