How employers can spot a deepfake interviewee (according to an expert)
Interview with Rob Leslie, founder of Sedicii

As we speak, North Korean hackers are using deepfake technology to worm their way through interviews with remote-first UK companies.
We asked an expert how to spot them.
His name is Rob Leslie. He’s the founder and CEO of the identity verification company, Sedicii.
But that’s not the only reason why he’s an expert.
Last month, two industrious North Korean gentlemen got to the final round of interviews with Sedicii.
If they can do that with the experts, what about the layman?
Rob said:
“Trust your gut”
It starts with instinct, something that machines don’t have.
Rob said, “Humans are definitely better than AI at spotting anomalies that do not fit easy patterns.”
Science backs it up, too: in one benchmark, humans beat machines at spotting deepfakes by 50%.
“Something didn’t feel right,” Rob said. “My instincts told me that right from the start of the interview.”
The hackers, posing as Estonian software engineers, had all the right credentials and references. But something was off.
“The person looked and sounded Chinese, but I was expecting a Caucasian Estonian man called Pavel.”
He spoke with a Chinese-sounding accent, too.
“Pavel” explained that his parents moved from Malaysia to Estonia when he was 5. Rob thought, wouldn’t he speak English with an Estonian accent?
“If you were born in Malaysia,” said Rob, “it’s unlikely your name would be Pavel.”
There’s more than that, though. Rob said we must:
“Be paranoid and look for signals”
Some of these are obvious.
📻 High latency (a sign of deliberate distortion or a bad connection)
💡 Inconsistent lighting
📺 Synthetic video feed & background
💋 Out of sync lips
🛂 Inconsistencies in the passport photograph
“The latency was bad, which isn’t usual for Estonia,” he said.
This might have been a deliberate attempt to distort the screen. In reality, “The person was in a place with bad internet.” North Korea’s internet isn’t exactly world-class.
The real giveaway, though, was the lips. “The lips on the person behind the screen were not syncing with what they were saying.”
The passport was AI-generated, too.
How to spot an AI-generated passport
Most AI-generated deepfakes pass KYC checks now. One Polish researcher created one using GPT-4, and it worked.
Rob had seen some high-quality AI-generated deepfakes in his time. “These looked similar in how they were laid out.”
Rob explained that hackers stage photos of fake passports on a sweater on a bed. The AI mimics the way real people photograph their passports, laid on bedding or clothing before taking a snap, so the image looks ‘normal’ to the eye.
This was one of them.
Here are the two passports:

The shadow along the left edge doesn’t match the shape of the page.
The same is true for this passport:

Notice that the shadow of the passport on the clothing doesn’t match the angles of other shadows.
In the end, said Rob:
“Don’t trust anything you receive”
We all have blind spots.
Rob explained we must check and re-check our work.
“Get second and third-level confirmations.”
During the interview, Rob asked for references from their last employer.
These were the email addresses of successful CEOs in reputable companies.
The trouble was that they were Gmail accounts. “They should have been official company email accounts.”
Rob contacted the last referenced employer on his CV. The CEO didn’t know the man.
“At that point, I had confirmation that this was bogus and a scam.”
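The webmail check above can be automated as a first pass. Here’s a minimal sketch: it flags referee email addresses that use free webmail providers instead of an official company domain (the domains and addresses below are made up for illustration).

```python
# Sketch: flag referee email addresses that use free webmail providers
# instead of the company's official domain. All domains are illustrative.

FREE_WEBMAIL = {"gmail.com", "yahoo.com", "outlook.com", "hotmail.com"}

def referee_email_red_flags(email: str, company_domain: str) -> list[str]:
    """Return a list of red flags for a referee's email address."""
    flags = []
    domain = email.rsplit("@", 1)[-1].lower()
    if domain in FREE_WEBMAIL:
        flags.append(f"{email}: free webmail account, not a company address")
    elif domain != company_domain.lower():
        flags.append(f"{email}: domain does not match {company_domain}")
    return flags

# Example: a 'CEO' reference supplied from a Gmail account
print(referee_email_red_flags("ceo.acme@gmail.com", "acme-corp.example"))
```

A clean result here isn’t a pass, of course; as Rob did, you still need to contact the referee through the company’s official channels.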
What technology is out there?
Deepfakes are getting better. But technology isn’t keeping up.
The criminals who make them know how they’re built; the rest of us don’t.
Intel developed FakeCatcher, which detects whether blood is pumping in the face. In a test, the BBC found it caught the fakes, but it also flagged genuine videos as fake.
Test a deepfake using some simple steps
If you’re concerned that a remote interviewee is a deepfake, consider the following options:
🤭 You could ask them to pass their hand over their face. This disrupts facial landmark tracking.
🤣 Tell them to move their head from side to side. This causes the tracking system to glitch because it can’t maintain accurate landmark positioning.
I admit, it’ll make for an uncomfortable interview.
But O, let me tell you, ye glorious beasts, making a North Korean deepfake hacker dance is something to tell the grandkids.