Dating apps fight back against romance scammers

Michael Steinbach, head of global fraud detection at Citi and a former executive assistant director of the FBI’s national security branch, says fraud has broadly moved from “massive card theft, or simply getting as much information as possible very quickly, to more sophisticated social engineering, where scammers spend more time on surveillance.” He adds that dating apps are only one part of global fraud, and that large-scale scams still happen elsewhere. But for scammers, he says, “the reward is much greater if you can put in the time to gain your victim’s trust.”

Steinbach says he advises consumers, whether on a banking app or a dating app, to approach certain interactions with a healthy dose of skepticism. “We have a saying: don’t take the call, make the call,” Steinbach says. “Most scammers, no matter how they operate, reach out to you unsolicited.” Be honest with yourself: if someone seems too good to be true, they probably are. And keep communicating on the platform itself, in this case the dating app, until genuine trust is established. According to the Federal Trade Commission, about 40% of romance scam loss reports with a “detailed description” (at least 2,000 characters long) mention moving the conversation to WhatsApp, Google Chat, or Telegram.

Dating app companies have responded to the surge in scams by deploying both manual and AI-driven tools designed to detect potential problems. Some Match Group apps now use photo or video verification features that prompt users to capture images of themselves directly in the app; those captures are then run through machine learning tools to help determine whether the account is genuine, as opposed to someone uploading a previously captured photograph that may have been stripped of its telling metadata. (A WIRED report on dating app scams from October 2022 noted that Hinge did not offer this verification feature at the time, though Tinder did.)
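One weak signal behind that distinction is whether an uploaded photo still carries camera metadata: a freshly taken in-app capture usually does, while an image saved from the web has often had it stripped. The sketch below is purely illustrative, not any app's actual verification pipeline; it uses the open-source Pillow library, and the file name and the `looks_freshly_captured` heuristic are assumptions for the example.

```python
# Illustrative only: missing EXIF metadata is one weak signal that a photo was
# re-used from elsewhere rather than captured in-app. This is NOT how Match Group
# (or any dating app) actually verifies accounts; it is a sketch using Pillow.
from PIL import Image
from PIL.ExifTags import TAGS


def exif_summary(path: str) -> dict:
    """Return a human-readable dict of whatever EXIF tags the image still carries."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}


def looks_freshly_captured(path: str) -> bool:
    """Hypothetical heuristic: camera captures usually keep make/model/timestamp,
    while screenshots and web downloads are typically stripped of them."""
    tags = exif_summary(path)
    return any(key in tags for key in ("Make", "Model", "DateTime"))


if __name__ == "__main__":
    # "profile_upload.jpg" is a placeholder path for the example.
    print(exif_summary("profile_upload.jpg"))
    print("Fresh capture?", looks_freshly_captured("profile_upload.jpg"))
```

In practice a signal like this would only ever be one feature among many, since legitimate users also upload stripped images; that is presumably why the apps combine it with in-app capture and machine learning rather than relying on metadata alone.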

For an app like Grindr, which caters predominantly to LGBTQ men, the tension between privacy and security is greater than on other apps, says Alice Hunsberger, vice president of customer experience at Grindr, whose responsibilities include overseeing trust and safety. “We don’t require a photo of every person’s face on their public profile, because a lot of people don’t feel comfortable having a public photo of themselves linked to an LGBTQ app,” Hunsberger says. “That’s especially important for people in countries that aren’t always accepting of LGBTQ people, or where it’s even illegal to be part of the community.”

For large-scale bot scams, Hunsberger says, the app uses machine learning to process metadata at sign-up, relies on phone verification via SMS, and then looks for patterns of accounts sending messages faster than a real person could. When users upload photos, Grindr can also detect when the same image is being used over and over across different accounts. And it encourages people to use video chat within the app itself to avoid falling for catfishing or pig-butchering scams.
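Spotting the same photo reused across many accounts is commonly done with perceptual hashing, which gives visually similar images similar fingerprints even after resizing or recompression. The snippet below is a generic sketch of that idea using the open-source `imagehash` and Pillow packages, not a description of Grindr's actual system; the distance threshold, account IDs, and file names are assumptions.

```python
# Generic sketch of reused-photo detection via perceptual hashing.
# Not Grindr's real pipeline; uses the open-source ImageHash + Pillow packages.
from PIL import Image
import imagehash

# Hypothetical in-memory store mapping a perceptual hash to the accounts that uploaded it.
seen_hashes: dict[imagehash.ImageHash, list[str]] = {}

# Hamming-distance threshold: small distances mean "visually the same photo",
# even after resizing or recompression. The value 5 is an assumption.
MAX_DISTANCE = 5


def check_upload(account_id: str, photo_path: str) -> list[str]:
    """Return other accounts that appear to have uploaded the same photo."""
    new_hash = imagehash.phash(Image.open(photo_path))
    matches = []
    for known_hash, accounts in seen_hashes.items():
        if new_hash - known_hash <= MAX_DISTANCE:  # Hamming distance between hashes
            matches.extend(a for a in accounts if a != account_id)
    seen_hashes.setdefault(new_hash, []).append(account_id)
    return matches


if __name__ == "__main__":
    # The second call would flag account_456 if its upload is a re-encoded
    # copy of the first account's selfie (placeholder file names).
    print(check_upload("account_123", "selfie_original.jpg"))
    print(check_upload("account_456", "selfie_recompressed.jpg"))
```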

Tinder’s Kozoll says some of the company’s “hardest work” happens in machine learning, though he declined to share details about how the tools work, because attackers could use that information to get around the systems. “The moment someone signs up, we’re trying to understand: Is this a real person? Is this a person with good intentions?”

Ultimately, though, AI can only do so much. Scammers are human, Steinbach says, and so is the weak link on the other side of the scam. “For me, it comes down to one principle: you have to be aware of the situation. I don’t care what app it is, you can’t just rely on the tool itself.”
