Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’ | CNN

A finance worker at a multinational firm was tricked into paying out $25 million to fraudsters using deepfake technology to pose as the company’s chief financial officer in a video conference call, according to Hong Kong police.
I’m highly doubtful that scammers could get enough real video of multiple employees in the same company to train an AI to pull this off convincingly. Celebrities, yes. Regular people, no.
However, Occam’s Razor tells me this employee knows exactly where that money went and plans to quietly slip away to a tropical island to retire, after getting fired for being “gullible.”
Insider threat. My organization archives our town halls with the President. There are hours of video available on the internal site.
And if not that, you also have internal meetings where managers love to turn their cameras on.
It also doesn’t have to be perfect, just good enough.
Same. Not just town halls, but our internal video archive has thousands of hours of recordings of training, project kickoffs and retros, across hundreds of teams.
You’d be surprised. Roop isn’t perfect, but it can be quite convincing sometimes, and it doesn’t require much training at all. It’ll take a single face picture. Same with voice cloning: you only need 15-30 seconds of source audio to make a decent clone. You can use even less, but the quality won’t be as good. Just those two pieces of source material could be easily obtained by anyone working near the CFO’s office, or anyone who knows his routine before/after work (e.g. going to a cafe, the gym, etc.).
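For anyone skeptical of the “a few seconds of audio is enough” claim, here’s a rough sketch of what short-sample voice cloning looks like with Coqui’s open XTTS v2 model. This is just my illustration (not what these scammers necessarily used), and the file names are made up; the model name and calls are the standard Coqui TTS usage as far as I know:

```python
# Sketch: clone a voice from a short reference clip using Coqui TTS (XTTS v2).
# Assumes `pip install TTS` and a ~15-30 second WAV of the target speaker.
from TTS.api import TTS

# Load the multilingual XTTS v2 voice-cloning model (downloads weights on first run).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary text in the reference speaker's voice.
tts.tts_to_file(
    text="Please wire the funds to the account we discussed on the call.",
    speaker_wav="cfo_sample.wav",   # hypothetical 15-30 s reference clip
    language="en",
    file_path="cloned_output.wav",
)
```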
Not to mention CFOs are most likely going to have public interviews and similar footage readily available to threat actors.
There are lots of stupid people.
You need lots and lots of real video of a person to train an AI to make fake videos of that person. So, unless the CFO and the other allegedly faked employees are all YouTubers, there’s very good reason to consider more plausible explanations.
To your point, you are correct. There are lots of stupid people. This includes people that will blindly believe that AI can just magically do anything and not even consider simpler explanations for things like this.
I think it was just last year that there was a story about some school official claiming to have been duped into paying scammers millions from the school’s funds, only to later be caught making the whole thing up in an attempt to steal the money. (Maybe somebody remembers enough to find a link.) So it’s not remotely far-fetched to think that’s what could be happening here.
Why even use AI? Mocap + features reconstructed from some photo + your acting under that mask + one spam call to steal a voice sample + a pitch shifter to slightly modify your voice. We have fucking Hololive, where vtubers are anime girls streaming in real time. If it means at least $1 mil, you can find the right people to do a good deepfake on a budget. And they probably won’t use an LLM if they’re worth employing, as it’s unreliable and, as you said, needs training data, processing power, time, etc. You’d be surprised how many things people do as shitposts in AE and Blender. Making Biden say the N-word to Obama is easier than you think when you know who to ask.
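On the pitch-shifter point: shifting a recorded voice by a couple of semitones really is a one-liner with an off-the-shelf audio library. A minimal sketch (file names are hypothetical, librosa/soundfile assumed installed):

```python
# Sketch: lower a voice recording by two semitones with librosa.
# Assumes `pip install librosa soundfile` and an input file voice.wav (hypothetical).
import librosa
import soundfile as sf

y, sr = librosa.load("voice.wav", sr=None)                    # load at native sample rate
shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=-2)   # shift pitch down 2 semitones
sf.write("voice_shifted.wav", shifted, sr)
```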
It’s not an LLM problem; it’s their stupid security and verification processes, and this person being a gullible idiot.
I spend hours and hours a week in video meetings. If the scammers had access to the footage, it’s easily done. Easier, even, given that all the available footage would be from the right context.
[off topic?]
Similar story from back in the 1980s. Reagan’s deregulation of banks led to a Wild West atmosphere where bankers felt encouraged to take big risks. [https://en.wikipedia.org/wiki/Neil_Bush] [https://en.wikipedia.org/wiki/Savings_and_loan_crisis]
One day, a smooth looking customer walks into a Texas bank. He had on the right suit, with the proper Texas power broker Stetson hat and ostrich skin cowboy boots. He had a thick business plan in a beautiful portfolio. He told the bank VP that his company was planning a huge expansion and wanted a bank that could expedite their plans. He asked if the bank could OK a $1 million loan in one business day. The VP assured him they could, and the next morning they presented him with a check. He walked out and was never heard from again.
Thank you for the additional read. I always wondered if I could do something like that. Just for the sake of it, with research, dressing the part, acting, and managing the right speech/behavior patterns to keep their suspicions down. I’d honestly buy a game about that, like a reverse LA Noire. I think many millennials and zoomers would love to play-pretend they aren’t themselves for a moment; see the popularity of Mafia in their generation, at least in the ex-USSR: https://en.m.wikipedia.org/wiki/Mafia_(party_game)
One of the most savage takedowns of the corporate mindset in American history is the Broadway musical “How To Succeed In Business Without Really Trying.”
The hero’s first step to the top is to find a company that is so big that no one person has any idea of what is actually going on.
I can think of a dozen recent cases where someone simply lied their way into a billion dollars; the cryptocurrency bank guy and the blood test lady spring to mind.