How sophisticated deepfake scams aim to extract millions from construction companies

Image: terovesalainen via AdobeStock

If you haven’t heard the term ‘deepfake’ before, then the chances are you soon will.

Bosses at engineering firm Arup in Hong Kong certainly have. In February, they revealed that criminals had used the technology in a sophisticated scam, convincing a hapless employee to pay millions of dollars into bogus accounts.

The criminals allegedly achieved this by impersonating multiple senior figures at the company in an online call. Although details of exactly how the scam worked have not been confirmed, it’s thought that the voices of company management were ‘cloned’, allowing the criminals to sound familiar to the employee. It’s not thought that video - which requires more sophisticated technology - was involved.

The cloned voices – if that is indeed how the suspected criminals pulled off the scam – are an example of deepfake technology, which, hand-in-hand with artificial intelligence (AI), is developing at a breakneck pace.

Deepfakes are defined as synthetic media that have been digitally manipulated to replace one person’s likeness convincingly with that of another, whether their voice, their appearance, or both. The technology behind them is complex.

New tech, old tricks

But as online security experts Richard Hughes and Hodei Lopez of global IT services firm A&O IT Group explain, while deepfake scams themselves are relatively new, they rely on a tried-and-tested formula: social engineering.

Email phishing, with which most people are now familiar, follows the same format. Just as an email that appears to come from a trusted company or contact asks the recipient to click on a dodgy link or fill in a form, a deepfake heightens that sense of familiarity by creating a plausible impersonation of a real person known to the victim, then exploits their trust and goodwill to get the criminals what they want.

“We have slowly developed the ability to not trust phishing because it is so ubiquitous nowadays,” says Lopez. “So the latest approach goes one step further.”

If the idea of criminals being able to work out how specific people within a company look and sound and then create convincing likenesses of them seems far-fetched or difficult to pull off, then consider how much information about individuals is now available online.

“Our personas are very public these days, particularly if you are at the higher levels of C-suite,” says Hughes. “You are likely to have done presentations or speaking engagements on video. Your voice will have been heard. The more images you can get of a person, the better the video quality will be. The more samples of their speech, the better that audio quality will be. I could have a conversation with you and capture your voice. It’s not too difficult.”

While not necessarily hugely complex, deepfake scams do have to be well organised, he says. “A lot of ransomware gets sent out to a million people and one or two may respond – it’s not at all targeted. With deepfakes, criminals have to have a plan that they put into action and it certainly is targeted. That’s the difference.”

Construction companies are a target

If the attack on Arup is not proof enough, construction companies are at least as likely to be a target for deepfake scammers as companies in any other sector.

Criminals are really only concerned with how likely their plan is to succeed and go looking for weak links, says Lopez.

“They will target environments that might be technologically less advanced, like construction for example,” he explains. “Within that environment they will try to find people who are possibly older or less technologically aware or they may target regions that are less technologically inclined.”

And in a fitting analogy for this sector, he adds, “If security is a brick wall, our job is to maintain the entirety of the wall. That’s difficult because there are lots of bricks - if they find only a single brick loose then they just pull it out.

“They might target a person who is close to a position of power – a secretary who might be a bit less technologically aware and willing to assist with a query of some sort. Then they use that empathy to steamroll the person.”

How to be on your guard

When it comes to how to protect your company against deepfake attacks, Hughes asserts that prevention is better than cure.

Lopez points out, “The big thing with all these sorts of social engineering attempts is they are putting you in a pressure situation: ‘I need this done now or we will lose business.’ Being told, ‘We need this done now’ makes you lose all the critical thinking skills that you have developed.”

Employees need to be encouraged to question what they are being asked to do. Meanwhile, management needs to make sure it is approachable enough that people feel they can query such requests.

Image: weerapat1003 via AdobeStock

When it comes to video at least, deepfakes may still offer telltale signs that they aren’t real, for now. Because a deepfake video involves overlaying a fake face on top of a real person in front of a camera, blurring or artifacts appear when the subject touches their head or face.

“But for less technical people, they are seeing a person they recognise and that all goes out of the window,” says Hughes.
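For the more technically minded, that telltale blurring can even be quantified. The sketch below is a minimal illustration in Python using OpenCV; the video file name, the face-detection model and the 50% drop threshold are all assumptions for demonstration, and a crude heuristic like this is no substitute for a proper detection tool.

```python
import cv2  # OpenCV: pip install opencv-python

# Illustrative heuristic only - a crude sharpness check, not a real
# deepfake detector. It measures the variance of the Laplacian (a
# standard focus/blur measure) inside the detected face region, frame
# by frame, and flags sudden drops such as the blurring that can
# appear when a hand passes over an overlaid fake face.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_sharpness(frame):
    """Return a sharpness score for the first detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    return cv2.Laplacian(gray[y:y + h, x:x + w], cv2.CV_64F).var()

cap = cv2.VideoCapture("call_recording.mp4")  # hypothetical file name
prev, frame_idx = None, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    score = face_sharpness(frame)
    # Flag a drop of more than half - an arbitrary demo threshold
    if score is not None and prev is not None and score < 0.5 * prev:
        print(f"Frame {frame_idx}: face sharpness fell {prev:.0f} -> {score:.0f}")
    if score is not None:
        prev = score
    frame_idx += 1
cap.release()
```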

In any case, because voice cloning (either text-to-voice or live voice-to-voice) is more mature and less technologically intensive, scams are more likely to involve voice than live video, which is more complicated to pull off successfully. Cloned audio is also much harder to spot, particularly on a phone call, where audio quality is generally lower or background noise has been added.

Hughes advocates regular training on what to look out for. “We tend to think six months is about as long as you can go without needing some reinforcement,” he says. Longer than that and people tend to forget about the risks, he warns.

He also recommends posters around canteens and kitchens to reinforce the message, and even messages on company desktop computer backgrounds, changing every few weeks, if the business is large enough to manage those centrally.

Lopez also recommends reinforcing company processes with security in mind. “For example, if you are making a large financial transaction then maybe there should be an extra mechanism where it has to be verified by multiple people. Or perhaps there should be an extra step that a person in a leadership position has to take, like confirming by SMS, before it can go through.”
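As a rough illustration of what such a control could look like in code, the Python sketch below enforces dual sign-off above a threshold; the threshold value, role names and PaymentRequest structure are illustrative assumptions rather than anything Lopez prescribes.

```python
from dataclasses import dataclass, field

# Illustrative threshold: payments above this need two sign-offs
APPROVAL_THRESHOLD = 10_000

@dataclass
class PaymentRequest:
    payee: str
    amount: float
    requested_by: str
    approvals: set = field(default_factory=set)

    def approve(self, approver: str) -> None:
        # Separation of duties: the person who raised the request
        # cannot also authorise it, so a single pressured (or fooled)
        # employee cannot push a payment through alone.
        if approver == self.requested_by:
            raise PermissionError("Requester cannot approve their own payment")
        self.approvals.add(approver)

    def can_execute(self) -> bool:
        # Small payments need one approval; large ones need two
        # distinct approvers, ideally verified out-of-band (a call-back
        # on a known number, or the SMS step Lopez suggests).
        required = 2 if self.amount > APPROVAL_THRESHOLD else 1
        return len(self.approvals) >= required

# Example: a deepfaked 'CFO' urgently requests a large transfer
req = PaymentRequest(payee="Unknown Vendor Ltd", amount=250_000,
                     requested_by="finance_clerk")
req.approve("finance_manager")
print(req.can_execute())  # False - still needs a second, independent approver
req.approve("finance_director")
print(req.can_execute())  # True - two independent sign-offs recorded
```

The design point is separation of duties: no single employee, however convincingly a caller impersonates the boss, can both raise and authorise a large transfer.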

In the case of criminals asking victims to make payments, once the money has hit their account it’s almost certainly too late, says Hughes. “These guys are pretty organised. If you pay cash into a bank account then it is probably gone within seconds of hitting that account.

“So it’s a question of first educating people to spot the threat and then second of all reviewing your processes to make sure that if you are being encouraged to make some form of transaction that you think might be a bit dodgy, then it’s harder to do straight away.”

He concludes, “It’s only going to get worse as technology gets better. Anyone who has a reasonable gaming machine can run a deepfake. We saw a massive upward trend in social engineering overall during Covid and it hasn’t really settled down. It’s a business at the end of the day - one with a massive return on investment.”

Arup employee falls victim to US$25 million deepfake video call scam
Engineering firm Arup has confirmed that one of its employees in Hong Kong fell victim to a deepfake video call that led them to transfer HK$200 million (US$25.6 million) of the company’s money to criminals.
