
The Very Real Dangers of Deepfakes for Banks


The term "viral" nowadays is associated with images, audio and video that captivate a social media audience and spread like wildfire. Their traction is usually driven by captivating, comedic or controversial content. Imagine if a video of you reached such status, but it depicted you doing something you didn't actually do – resulting in embarrassment or worse. You might proclaim your innocence, but if the person "captured on camera" looks like you, talks like you and displays similar mannerisms, it may well be an uphill battle.

That's what society is facing with deepfakes. From public figures to lawmakers to regular Joes, numerous people have been victimized by deepfake technology, wherein artificial intelligence and machine learning capabilities allow fraudsters to manipulate content, making it appear that someone did or said something they actually did not. The speed of technological advancement has made this possible, and its proliferation is raising the ire and awareness of governments, business executives and financial institutions.

What is the origin of deepfakes?

While polished, hard-to-detect deepfakes are fairly new (the term was coined in 2017), the underlying concept has been around since the dawn of the internet era. The overriding goal has always been deception, much as it is with photoshopped images.

Deepfakes, however, are much more sophisticated and convincing, leveraging computer graphics, synthetic media and neural networks so that their creations appear real. From Queen Elizabeth to Tom Cruise, to former Presidents Barack Obama and Donald Trump, people all around the world have been the subject of deepfakes, resulting in headaches and havoc for those affected.

"There are already proven instances of market-level disinformation campaigns using deepfakes,"

Financial institutions (FIs) and other business entities are recognizing the problems this technology could create, and many anticipate it will.

Sultan Meghji, co-founder and CEO of banking solutions firm Neocova, told BAI that deepfakes are becoming increasingly pervasive.

"There are already proven instances of market-level disinformation campaigns using deepfakes," Meghji warned. "Market or individual institutional manipulation has been seen as part of larger disinformation or manipulation campaigns by well-known actors. We've also seen instances of nefarious actors attempting to steal from banks using simple deepfakes."

Meghji further noted that when deepfake technologies are combined with scamming ploys, they may make it easier for bad actors to execute fraudulent transactions, in effect stealing identities through tactics tantamount to virtual forgery.

It could also cause panic and confusion for banking customers. Writing for ATM Marketplace, Technologent Senior Solutions Architect Jason DeJong noted that one potential scenario could involve a faked video in which a bank CEO announces they're closing branches by a certain date.

"A video of this nature could cause irreparable damages to the reputation of the individual or the business," DeJong wrote.

Given the increased number of branch closures in recent years, such a scenario wouldn't be out of the realm of possibility, so customers might easily buy into it.

Can deepfakes be spotted?

Technology is in a near-constant state of advancement, and as a result, deepfakes will become that much more difficult to detect as attackers and hackers refine their approaches. But according to security experts at the Massachusetts Institute of Technology, there are a few things you can look for to determine whether the subject of a video is authentic or fraudulent:

Keep an eye on facial expressions

Most people recognize others through their faces, which is why deepfake purveyors focus much of their attention on mimicking facial features. But their efforts aren't foolproof, particularly when designing a subject's cheeks and forehead. The skin may appear unusually smooth or excessively wrinkled. The dimensions of the face — such as the distance between the nose and the mouth — may also be off.

Excessive blinking — or none at all

On average, people blink between 15 and 20 times per minute. Deepfake creations often blink considerably more than this, or do not blink at all. Either way, blinking frequency can be a telltale sign.
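For teams that want to put this heuristic into practice, the sketch below shows one illustrative way to estimate a subject's blink rate from a video clip using the widely known eye-aspect-ratio technique. It is not a method prescribed by the MIT researchers cited above; it assumes the open-source OpenCV and dlib libraries are installed, that dlib's 68-point landmark model file ("shape_predictor_68_face_landmarks.dat") is available locally, and that the threshold value and example file name are rough, tunable placeholders.

```python
# Minimal sketch: estimate blinks per minute from a video clip using the
# eye aspect ratio (EAR). A clip whose subject blinks far outside the
# roughly 15-20 per minute range deserves a closer look.
import cv2
import dlib
import numpy as np

EAR_THRESHOLD = 0.21  # below this, the eye is treated as closed (assumed, tunable)

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(pts):
    # pts: six (x, y) landmarks around one eye; the ratio drops sharply on a blink
    vertical = np.linalg.norm(pts[1] - pts[5]) + np.linalg.norm(pts[2] - pts[4])
    horizontal = np.linalg.norm(pts[0] - pts[3])
    return vertical / (2.0 * horizontal)

def blinks_per_minute(video_path):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    blinks, eye_closed, frames = 0, False, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray)
        if not faces:
            continue
        shape = predictor(gray, faces[0])
        coords = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)])
        # Landmarks 36-41 and 42-47 outline the right and left eyes
        ear = (eye_aspect_ratio(coords[36:42]) + eye_aspect_ratio(coords[42:48])) / 2.0
        if ear < EAR_THRESHOLD and not eye_closed:
            blinks += 1  # count the transition from open to closed
            eye_closed = True
        elif ear >= EAR_THRESHOLD:
            eye_closed = False
    cap.release()
    minutes = frames / fps / 60.0
    return blinks / minutes if minutes else 0.0

# Hypothetical usage:
# print(blinks_per_minute("statement_video.mp4"))
```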

Awkward or unusual facial hair

Again, deepfake technology is in its relative infancy, so some facial features may not look natural. Facial hair is one of them. Goatees, sideburns, beards and mustaches may have a boxy or bulky look to them, rather than a natural appearance.

Be aware of glare

When light hits a glass surface, such as a window or pair of glasses, it reflects. Be cognizant of how light is depicted in relation to the individual onscreen. For example, does the angle of the glare shift as you'd expect it to when the person does? Is the glare unusually bright? Subtleties like these can speak volumes.

Clearly, addressing the risk posed by deepfakes will soon become one more component of your FI's overall security plan. From supporting your protocols to supplying the proper tools to protect your facility and personnel, BranchServ Convergint has what you need to build a better bank. Contact us today to learn more about our service solutions.