Russian ATM Scans Credit Applicants to Determine if They’re Lying
A new ATM for a Russian bank turns money machines into truth machines, using fingerprint recognition, 3-D face scans and voice analysis to determine whether customers are worthy of applying for credit cards.
The Russian bank Sberbank plans to install the ATMs in bank branches and locations like malls, the New York Times says.
We’ve seen ATMs that scan fingerprints instead of magnetic cards, with a handful deployed in Poland last summer. But face scanning plus voice recognition takes it to a new level. In this case, it’s not necessarily intended to prevent identity theft — although fingerprint and face scans would help with that — but rather to prevent fraud by people with bad credit.
As the New York Times points out, it’s something we can imagine in the files of the KGB: It uses software to determine whether someone is telling the truth in response to questions like “At this moment, do you have any other outstanding loans?” It detects nervousness or “emotional distress,” the Times says, which could be indications that a credit card applicant is not being forthright. It can supposedly detect involuntary nervous reactions, much like a polygraph.
A firm called the Speech Technology Center developed its algorithms by analyzing law enforcement recordings of people lying during police interrogations. Perhaps fittingly, the Federal Security Service, the Russian descendant of the KGB, is one of the company’s clients.
The Times story notes that Russians already expect to be snooped on, so they may not be as hesitant as American consumers to bank with a truth-sniffing ATM.
Sberbank said the ATM’s judgment is merely advisory — indeed, someone applying for a line of credit may have legitimate reasons to be nervous. A bank executive said it is no more invasive than checking someone’s credit history.
But is it better to trust a machine instead of a person when making a determination of human honesty?