Sarah: Think about the last time you signed up for a dating app. Maybe you were just there to play, check out the field, swipe left and right for fun? Maybe you felt excited or hopeful about the people you’d meet? The romance or hookup on the horizon? Maybe you felt vulnerable? The point, after all, is to kind of put yourself out there. But for queer dating app users outside of North America, the word “vulnerable” is a massive understatement. Their lives are being put at risk. A new report from cybersecurity company Recorded Future found that LGBTQ people face targeting, surveillance, and censorship in Russia, Eastern Europe, the Middle East, Asia, Latin America, and Africa. Among the tools enabling all of this? Data, harvested from popular dating apps, usually without the user’s consent. Recorded Future found security risks in apps including Tinder, OkCupid, Grindr, Scruff, and Her, which build data sharing into their business models. Even as these apps start to offer things like notifications when a user is entering a country with anti-homosexuality laws, information can be widely shared without their knowledge and used for nefarious purposes. I’m Sarah Boesveld, sitting in for Jordan Heath-Rawlings, and this is The Big Story. To help us unpack all of this, we’re talking with BuzzFeed News reporter Jane Lytvynenko, who wrote about this report last week. Jane, thanks so much for being with us today.
Jane: Hi, thank you so much for having me.
Sarah: Okay. So many of us by now are at least somewhat aware that we hand over a whole lot of data that we don’t voluntarily share in an active sense, but what is going on in the entire marketplace? Like, what’s sort of underneath that surface level sharing that we’re really not aware of day to day?
Jane: Right, so a lot of this report focused on dating apps. So let’s look at those in particular. And this report looked at dating apps that were not specifically for queer people but that queer people use, so like OkCupid, for example, non-queer people use it all the time, and same as Tinder. But then there’s Grindr, there’s Her, there’s Scruff that they’ve also dug into. What the base knowledge is, what we need to understand, is that for most of these apps, their business model is to sell advertisements, just like it is with anything else on the web, with Facebook, with Twitter, with Google, right? You get advertisements that are targeted to you as a person, but in order for those advertisements to be targeted effectively, they need to collect a fair bit of data in terms of who you are as a person, what your interests might be, who might be wanting to sell things to you in the first place. And it’s that data collection that really concerns the authors of the report at Recorded Future. That data collection means that anything you provide to a dating app will likely be accessible to advertisers. For instance, I think on OkCupid, they ask you questions about yourself to determine how likely you are to match with somebody. And it’s fun, people have tried to hack that algorithm in all kinds of different ways, but these questions can be anything from like, what’s your favorite movie, to like, have you used hard drugs before, right? That data can then be passed on to advertisers and data brokers. Part of the reason why that is so dangerous is because data brokers who collect this data, collect it from many different places. They collect it from many different apps, many different places on the internet, in order to be able to target you better as a person.
So even if what you provide to the dating app is extremely minimal, you know, you haven’t linked it to any other social media, you haven’t provided your image, you haven’t provided your phone number or your email address, the data that’s collected can still be used to identify you, right? And that is done through a process– this is going to be a technical term, but bear with me– called inferred data, which essentially means, okay, here’s everything that we know about a person, and here’s who we think they are.
Sarah: Wow. So they get to make a composite of you in some ways that is not exactly, maybe, perfectly you, but it’s their idea of who you are?
Jane: Right, exactly. And inferred data is not data that you have provided. It’s not something that you’ve consented to handing over. It’s something that advertisers, as they collect all of these digital trails that you’re leaving behind on the Internet, they’re collecting it. And they’re saying, okay, here’s who we think this person is, in order to be able to target ads at you. Now this becomes really, really dangerous when we talk about vulnerable populations. And this report in particular focused on LGBTQIA populations. But of course you can see how other vulnerable populations can be targeted with this as well. And what they found is that some apps will scramble your location data, which means they won’t know quite where you are. Tinder has introduced a little pop up to say, hey, you’re in a country where queer people don’t have the best rights. Be careful. But of all the apps that they’ve examined, it was only Scruff, I believe, that didn’t collect enough data to put its users at risk.
Sarah: Wow. There’s a whole lot there. I would love for you to back up just a little bit more and talk with me a bit about the way that they gather information from the dating sites specifically, because I feel like, you know, maybe it even takes people that extra leap to join a dating site and app, you know, download it on your phone. It’s a vulnerable experience, I mean, emotionally. And then you do need to provide the right information in order to get that match that you’re looking for. Maybe that is facial recognition, which, if people have an updated phone, is something they want you to do at the very beginning, right? When you set up your phone? Tell me about the sort of intimacy of some of that information that they use over a dating app that will end up putting you possibly in a vulnerable place thanks to these dark web actors, I guess, or not even as dark as–
Jane: Not, yeah, not, no– we’ll touch on the dark web later, but that’s a whole other thing. So right now let’s stay with the advertisers, which is, you know, a legal business, there’s nothing about it that is against the law. But usually when you sign up to a dating site, and everybody’s experience is different, but I know that when you sign into Tinder, for example, it asks you to link to your Instagram account or your Facebook account, which means that Tinder now has all of your friends, right? It has your entire friend network. So it’s tricky to know how the apps show you the people that you’re matched with. Usually this is done through proprietary algorithms, and we don’t know how those matches are made. But what we do know is that if you consent to providing your friends to Tinder, that is probably a data point that they’re going to use. And same with OkCupid, same with any other app. And from there, if you upload photos, the app will have those photos forever. If you change your bio to explain who you are, that bio will also, you know, be collected. We can think about it on an individual level, and what we can do to protect ourselves. But I think one of the things that’s really striking in this report is, no matter how safe you’re trying to be, it’s nearly impossible. It’s nearly impossible to use the web in an anonymous way. And so the example that really stuck with me when I spoke to the researchers about this, is if you provide your HIV status to the app, that information could be collected as well. And I don’t know if that’s a piece of information that people want sold to advertisers at that point.
Sarah: Well, speaking of the intimacy of it, when you were talking, I was thinking a lot about how social media, and the Internet, and maybe even dating apps for sure, have allowed people to live out loud in a way that was really a little bit more difficult or different before, you know, you are– your identity is so much of an online presence. And it is striking to me that you can’t even necessarily be anonymous, even if you don’t do that online, you can still be found out in so many ways. It’s kind of like, for queer communities, nonconsensual outing to actors who might not have your best interests at heart, right?
Jane: It totally could be. It totally could be. And even if you provide, you know, a brand spanking new email address, you don’t use your name, you don’t use your own photo, if, for example, you don’t use a VPN, which would hide which IP address you’re logging on from, then advertisers can use that IP address as well. The IP address can be used to say, okay, what else has this IP user done on the web? So if you’re logging on from your home or your wifi network, that is also information that you need to be really careful with.
Sarah: So could they find like porn sites that you might’ve been to or other people that you’ve connected with, that they have more information about?
Jane: It’s tricky to know. You know, we in many cases just don’t know what data these advertisers have. I believe in the report, they say that some apps share this data with over a hundred different data aggregators. And we just have no idea what data they have collected on us. And a lot of that data could be wrong, right? A lot of that data could misidentify who you are, could misidentify what it is that you do on the web. The information that they hold, it’s very, very obscure. And it’s incredibly tricky to figure out what it is that they know. And if you do want to find out, it is a multistep process, and I’m not quite sure what that process is, to be honest with you.
Sarah: Well and Jane, you’re somebody who is well versed in this area. And to be at least aware of a multistep process is probably a lot more than most people know. It sounds like you have to be, like, a devoted detective and spend a lot of time and have a lot of connections and resources to figure this out. It sounds horrible.
Jane: You need to be fairly tech savvy, right? And if you’re somebody who doesn’t necessarily understand the full, like, structure of the web and the complexities of it, and how different apps talk to each other, or how different data providers talk to each other, there are a lot of speed bumps on the way if you want to understand what is known about you on the internet. Something that stood out to me when I was speaking with the researchers was about the app Grindr, which is primarily targeted at gay and queer men. And Grindr is a US-based app that at one point was sold to a company in China. And this freaked the US out so much that they essentially said, no, you cannot be owned by China because it’s a national security risk. And something that one of the researchers said to me was, if the US government is worried about how much data is being collected on you, then you should be worried too.
Sarah: Let’s drill down on exactly what the risks are for LGBTQ people, given that they are a vulnerable group and that these apps are really not equipped to kind of help rein in a lot of this data.
Jane: Yeah, so in order to understand the worst case scenario that we are looking at, the researchers compiled different cases around the world of what LGBTQ people have had to deal with, both from their home countries and from just bad actors on the web. One of the scary things to me is that a lot of these apps were used to entrap queer people: bad actors would essentially pose as a friendly face on the app, when in reality it wasn’t, and target queer people all around the world. And there’s no area in the world where this hasn’t happened. This has happened in Eastern Europe, in the Middle East, it’s happened in Asia, and in Latin America. This is a real, real risk.
Sarah: So entrapment. Essentially, it’s kind of like catfishing, the way we understand that they create a fake profile, chat you up, lead that relationship in a certain direction, and then do they offer to meet in person and then it’s a police officer or something? Is that what entrapment looks like?
Jane: Yeah, yeah. That’s essentially the process. And, you know, with a lot of this stuff, we don’t know what we don’t know. And so there’s a lot of sort of nightmare scenarios, where there’s a combination of the risks that queer people face online, the risks that their data is being collected and viewed by governments, and there’s a possibility that governments will target them.
Sarah: And they can be arrested, beaten, what happens on the other end of A) that entrapment, and B) that data harvest?
Jane: It varies from country to country, how deep the threats are to queer people online. So with Russia, we know that the government has been targeting LGBTQ activists online. We also know that the gay purges in Chechnya were partially because of social media posts, and partially because gay people’s phones were seized and their contacts were gone through and used to bring people in for beatings, for torture, for imprisonment. As a matter of fact, I’ve spoken to a gay man who escaped from Chechnya, who that happened to. His phone was seized, and the contacts that he had on the phone were targeted, as was he, with imprisonment and torture. But Russia, of course, is not the only country where this has happened. Recorded Future pointed to a study by a human rights organization called Article 19 that showed that in Egypt, Lebanon, and Iran, apps and social media have all been used to track and entrap queer users. Back in 2014, the Israeli army used a mix of surveillance and real life entrapment to target queer Palestinians who they felt they could then blackmail into being informants for them. So the situation is quite grim, and it’s only getting worse, because the data processing technology for figuring out who’s who on the internet, and facial recognition technology, are all getting more powerful, and more and more widely used. And this is true within North America and outside of North America. Earlier this year, we saw the first case of facial recognition technology falsely pointing to a man as the perpetrator of a crime that he did not commit. And he was a Black man. So these technologies are just getting worse and worse, and vulnerable populations are particularly at risk.
Sarah: Wow. I mean, facial recognition is also just becoming so the norm, you know, I think a lot of new phones just have facial recognition when you sign up. I have not– I’m not letting them have my face at all. No thank you. But it’s like, it’s a convenience thing. They market it as a convenience thing. Just give us your face, you know, and then you don’t have to tap in your passcode or whatever.
Jane: And I would be– you know, I think that on the subject of facial recognition, the twofold issue is, when we join these apps and these social media websites, we’re not quite sure what we’re consenting to. I don’t know a single person who’s read the full terms and conditions, right?
Sarah: They make it really easy, don’t they?
Jane: Yeah, yeah. And then if we do go the roundabout route of not consenting to, say, facial recognition technology, that doesn’t mean that another app can’t go in and still use facial recognition technology on us. And this is true in Canada as well. A lot of this extremely powerful data gathering, extremely powerful data processing is done without our consent.
Sarah: All the time. So nonconsensual activity, human rights being infringed upon, is there anything that we can do about this? Because it feels so futile, particularly given that it still happens, even if you don’t consent, right? It’s out of our hands in that way. What can be done, just on user level, but also legislatively and what companies should be doing to lock down on this?
Jane: So on an individual level, beyond being extremely, extremely careful and mindful of what you put online, there’s a limited amount of things we can do. So one thing to be aware of is, if you put a disappearing photo, say on Instagram stories, it’s not disappearing, right? Instagram still has that photo. And same with Snapchat, for example. So be careful with what you put online, and be careful with who you put online. I know a lot of parents opt out of putting their kids’ photos online. If you’re, say, at a protest, or if you’re in the company of people who are vulnerable, just be careful with putting their faces, their names, on the internet. But you’re right to ask about legislation. In the US there are several cities that have banned facial recognition technologies. San Francisco is sort of the primary example of that. And it really is on the legislative level that I think we need to be thinking about this. Facial recognition technology is often wrong, just like the data that we talked about is often wrong. So as people are thinking about what they can do, especially in this moment of extreme democratic engagement, think about queer people, think about people of colour, think about vulnerable populations, and maybe ask whether your lawmakers are addressing this issue, are studying this issue, are looking at this issue. And one thing that I want to mention, because this report focuses outside of North America and I don’t– I want to be really mindful of that, we really need to be thinking about asylum laws and how we protect vulnerable populations that need a safe place to go as well. There’s very little we can do to help queer people overseas or queer people in other countries. But those asylum laws are really key to being able to offer a safe haven to anybody who needs it.
Sarah: Yeah. You know, Canada likes to sort of position itself as a welcoming place, but we can really take action on this and look at our data legislation. Thank you so much for coming on Jane.
Jane: I’m sorry it’s a bit of a grim picture today.
Sarah: Take care.
Jane: Thanks for having me.
Sarah: Jane Lytvynenko is a reporter with BuzzFeed News. That was The Big Story. For more from us, you can visit our website, thebigstorypodcast.ca, or you can follow us on Twitter at @thebigstoryFPN. And now you can write to us to let us know what you think or tell us what you’d like us to tackle in future episodes. Our email address is thebigstorypodcast@rci.rogers.com. I’m Sarah Boesveld. I’ll talk to you soon.