Last week, Tinder rolled out a new feature that allows users to pay $2.50 to check whether matches have certain types of criminal records: a sex offender registry listing, or arrests or convictions for “violent or harmful behavior.”
The company partnered with public record aggregator Garbo to help users make more informed choices about who they interact with online.
Some experts are concerned that use of this specific data may not be the best way to improve user safety. Sarah Lageson, an associate professor at Rutgers University-Newark School of Criminal Justice, shared her concerns with Marketplace’s Kimberly Adams.
Sarah Lageson: Garbo has done the enormous task of collecting criminal record data from probably over 3,000 different county-level jurisdictions. And they’ve used an algorithm to code these criminal records and flag the types of criminal records that they think their users would be the most concerned about, so violent or sexually violent offenses. When you run a person’s name, you’ll get results categorized as records that are the best matches, and perhaps the most accurate, all the way down to records that are a looser match and perhaps less accurate. And so it offers users a way to sort of screen potential matches, but it is using data from the criminal legal system, which has its own set of problems.
Kimberly Adams: What are those problems?
Lageson: Criminal record data is notoriously incomplete. Different jurisdictions categorize crimes differently based on their state law. Unfortunately, a lot of states don’t have the infrastructure for great data matching techniques. So sometimes people’s records don’t show up when they do have a record or other people’s records will show up in a search for their name or their birth date. And then there’s a problem of records that have been sealed or expunged still showing up on court records or showing up on private platforms like this. Bigger picture, you know, we have a lot of biases in the criminal legal system that are reflected in the data. And that comes from the over-policing and over-prosecution of certain communities, especially communities of color. And so the problems that we might recognize in the criminal legal system, more broadly, are definitely reflected in the data that we use in these other settings, such as online dating.
Adams: And how is Garbo addressing those concerns?
Lageson: Garbo is not going to report types of low-level offenses that are unrelated to dating safety, like marijuana offenses or loitering offenses. My research has shown, though, that people can sometimes have a “violent arrest record” or a “violent charge” because of police decision-making or overcharging by prosecutors as a way to push a person toward a plea bargain. And so when we’re using non-conviction data, we run the risk of using data that reflects the legal system’s decision-making rather than the actual safety of a person. And that could mean that people put a lot of faith in these background checks without really understanding how the legal system works.
Adams: Garbo is considered an aggregator and not a background check company, necessarily. How does that affect the regulations that are associated with it?
Lageson: Yeah, that’s sort of a tricky question that the courts are struggling with right now. And so a background checking company, of the type that you might use if you have applied to a job, for instance, is regulated under the Fair Credit Reporting Act. So there’s a set of rights that the subject of the record will get, such as getting a copy of it or being able to contest inaccurate or outdated information. And that’s because we know that that record is being used to make a decision about somebody in like a business context. Now in the era of big data and the availability of public records, another set of companies have used records as a data aggregation tool. And there they’re governed under Section 230 (of the Communications Decency Act). The idea is that they’re taking data that’s coming directly from other sources. And it’s up to those third parties to make their data as accurate as possible, not the data aggregator. And so this is why people in my field have concerns about the accuracy of data being aggregated at this sort of scale and what that means for the individual person who’s the subject of the check.
Adams: Do you anticipate any sort of privacy pushback from users of these dating apps?
Lageson: Garbo has said that they won’t publish the home address or other sort of identifying details on their platform that you often see on people search websites, where the address and the mugshot and all sorts of other information about a person are posted alongside their criminal records. And those sorts of websites certainly pose a lot of privacy problems and leave people vulnerable to doxxing or stalking or harassment. So Garbo is thinking about privacy in that way.
Adams: How do you think adding this type of feature will impact user safety, even on other dating apps that might roll out similar features in the future?
Lageson: I think that in general, in the information age, we really wish we could have perfect information about everybody that we interact with. And so when we have tools like this, it sort of helps, you know, scratch that itch. The problem is that, you know, this is not a silver bullet. And Garbo has been pretty clear about that on the platform. You know, reminding people that there might be missing data, reminding people that most people who are sexually violent do not have a criminal record and so they won’t show up on the platform. And so we can’t just rely on a background check showing whether or not someone has a record to necessarily keep us safe.
Related links: More insight from Kimberly Adams
My colleague Meghan McCarty Carino covered this story as well, specifically looking into why this is more than just a safety feature, especially for women.
Mashable has a round-up on Tinder’s new feature here.
Garbo has its own statement on the types of offenses included in the background checks and the ones that will get left out of a report. The company says it has tried to strike “a balance between privacy and protection.”