Credit-card fingerprint readers have been around for a few years. MasterCard began trials in 2017, and British bank NatWest followed in 2019, but the technology doesn't seem to have caught on with card issuers or cardholders. Samsung hopes its chipset will finally spur widespread adoption of biometric-enabled credit cards.

The appeal seems to be that the chipset combines the fingerprint sensor, secure element and central processor into a single unit, rather than using three separate chips as some other biometric payment cards do. "With the three key functions integrated in a single chip, the S3B512C [integrated circuit] can help card manufacturers reduce the number of chips required and optimize card design processes for biometric payment cards," Samsung said in a blog post.

This applies only to "chipped" EMV payment cards, which have been in wide use in most of the world for more than 15 years. The U.S. began shifting to EMV cards around 2015, but the older magnetic-stripe swipe cards are still widely supported.

In the same blog post, Samsung Electronics Vice President of System LSI Marketing Kenny Han pointed out that the chipset "is primarily designed for payment cards but can also be used in cards that require highly secured authentications such as student or employee identification, membership or building access."

Fingerprint-enabled credit cards are intended to cut down on theft and impersonation: the fingerprint supposedly verifies the card user's identity and "removes the need to enter a PIN on a keypad" or, in the U.S., a signature. Many fingerprint readers can be fooled by rubber fingerprint overlays, but Samsung said that its chip's "anti-spoofing technology prevents unauthorized users from circumventing the security system with illegitimate methods such as artificial fingerprints."
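
To picture how a fingerprint check can stand in for a PIN, here is a minimal, purely illustrative Python sketch of the "match-on-card" idea: the enrolled fingerprint template stays inside the secure element, a live scan is compared against it on the card itself, and a passing match takes the place of PIN entry. Every name, data format and threshold below is a hypothetical stand-in for illustration, not Samsung's firmware or the EMV specification.

```python
# Conceptual sketch of match-on-card cardholder verification.
# All names, formats and thresholds are hypothetical illustrations,
# not Samsung's actual firmware or the EMV standard.

from dataclasses import dataclass


@dataclass
class SecureElement:
    """Stands in for the tamper-resistant store holding the enrolled template."""
    enrolled_template: bytes


def match_score(candidate: bytes, template: bytes) -> float:
    """Placeholder matcher: real chips run a proprietary fingerprint-matching
    algorithm on-die; here we just count matching bytes for illustration."""
    same = sum(a == b for a, b in zip(candidate, template))
    return same / max(len(template), 1)


def verify_cardholder(se: SecureElement, live_scan: bytes,
                      threshold: float = 0.95) -> bool:
    """The key property: the template never leaves the card.
    A successful match takes the place of entering a PIN."""
    return match_score(live_scan, se.enrolled_template) >= threshold


# Usage: enroll once, then verify at the payment terminal.
se = SecureElement(enrolled_template=b"demo-fingerprint-template")
print(verify_cardholder(se, b"demo-fingerprint-template"))  # True -> approve
print(verify_cardholder(se, b"spoofed-or-wrong-finger!!"))  # False -> decline
```

The design point the sketch highlights is that verification happens on the card, so the biometric data never reaches the terminal or the network; only a pass/fail result does.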