Advancements in AI have made it possible to create deepfake videos and cloned voices, which scammers script in attempts to illegally obtain others’ crypto.
American venture capitalist Tim Draper issued a warning on social media that scammers are attempting to con crypto users using an artificial intelligence (AI) voice generator.
In an Oct. 19 post on X (formerly Twitter), Draper warned his roughly 254,000 followers to be mindful of “thieves” using AI to create an approximation of his voice. According to the venture capitalist, “AI is getting smarter,” with followers reportedly receiving messages in which an imitation of Draper asked them to send cryptocurrency.
Apparently AI is getting smarter, and people are using my voice to try to get you to send money (crypto). Please know that I will never ask the public X followers for money. If anyone asks, it is not me. They are thieves.
— Tim Draper (@TimDraper) October 19, 2023
Recent advancements in AI have made it easy for the average person to generate audio of their favorite celebrity’s voice or a video of a politician saying whatever they want. Following the collapse of FTX in November 2022, scammers created a deepfake video of former CEO Sam Bankman-Fried offering compensation to affected users. A similar situation occurred with a deepfake of Tesla CEO Elon Musk in May 2022.
Draper, who once predicted that the price of Bitcoin (BTC) would hit $250,000 by 2023, was an early investor in the cryptocurrency. Despite losing roughly 40,000 BTC when Mt. Gox collapsed in 2014, he has continued to be an advocate for digital assets and the space as a whole.