Jordan: There is a terrifying thing that happens to our technology in an abusive relationship. All those devices that people use in the course of their everyday lives, from home security cameras to internet search histories, to GPS data, and find my phone features. Those become not useful information or helpful tech, but potential weapons for an abuser and the smartphone is potentially the worst of all. The latest report from Citizen Lab, a Canadian research center, calls it the predator in your pocket, and the organization has issued a warning about what’s known as stalkerware and how helpful it can be in facilitating cases of technology fueled violence and abuse. So what is stalkerware? How easy is it for an abuser to install it on the victim’s phone? How can you tell if it’s been installed on yours? What options do the police and the courts have to protect victims? And how are these programs able to be sold on app stores when they are marketed specifically at people looking to keep watch over their partners without their knowledge?
Jordan: I’m Jordan Heath Rawlings, and this is The Big Story. Kate Robertson is a research fellow with Citizen Lab and also a criminal lawyer. Thank you for coming in, Kate.
Kate: Thanks for having me.
Jordan: Can you start by explaining: what is stalkerware? What does the term refer to?
Kate: Well, it’s probably easiest to understand stalkerware as a spectrum of different types of technology that can be used in an abusive manner. Statistics tend to suggest that it is predominantly abusive men who use technology either to surveil the target, predominantly women, girls, and children, without their knowledge, or to use technology in a more overt and brazen way, in a harassing, threatening, or exploitative manner.
Jordan: What does that actually look like? Can you give me some examples of what we’re talking about?
Kate: We’ve been looking at stalkerware from a number of angles, but when you think about it as a spectrum, it can be a surprising range of different types of technology that we see and use in our day-to-day lives and don’t necessarily associate with abuse or harm. When I say that, I think about the different apps or functions on our phones, like a Find My Friends function, which we think of as something that helps us keep track of our phone, quite literally. But even helpful and legitimate technologies like that can be repurposed for a more malicious end. In the context of a Find My Friends function, for instance, that could be geolocation tracking, essentially monitoring the very fine movements of an individual throughout the city: where they’re going, what friends they’re associating with, what restaurants they’re going to, whether they’re visiting a shelter to receive support or a lawyer’s office to get legal advice, and even what kind of lawyer’s office. So you can imagine that our location is a very revealing type of data, and in the context of an ongoing form of surveillance or harassment, tracking it can be a very damaging and pernicious activity. But on the other end of the spectrum, we’ve been looking at a more malicious and powerful form of malware, essentially a program or application, an app as we refer to them, which can be quite easily installed on someone’s phone, perhaps without their knowledge, and which allows the person operating it to remotely monitor essentially all of that person’s activity on the phone.
You think about the way we interact with our phones in day-to-day life, and it’s completely seamless: our texts, phone calls, emails, photographs, videos, all kinds of very personal and private data coming and going on our phones. These apps are particularly serious and disturbing in nature because they allow almost unlimited access to this data without the person’s knowledge or consent.
Jordan: You mentioned it’s pretty easy. How easy? If I gave you my phone, how long would it take you? What would you have to do to put some of this stuff on my phone?
Kate: Well, we’ve been looking at a handful of commonly available apps sold at a consumer level, either online or in the common app stores that we all know and use. When you think about cybersecurity, we often think about, you know, the hacker in a foreign country who’s somehow found a hole into our phone and is looking for some way to exploit our data. But phones and other cybersecurity measures don’t necessarily account for the threat from within your own home. So if someone has essentially a two-minute window of opportunity, say while you’re in the shower or in another room, that’s probably about the time they would need to download and install one of these programs, if they have access to unlock your phone.
Jordan: How detectable are they? Would I know? Like, usually when I download a new app, I see it on my phone’s home screen.
Kate: There’s a range of different types of programs, but with some of the programs we’ve been looking at in particular, one of the essential problems with the business model and practices relating to these products is that they really go around what we understand to be commonly accepted laws and norms relating to data protection and data access, namely that we need to have given consent before our data is shared with a third party. These apps operate in a manner, and are marketed in a manner, that gives the remote surveillor access to data without giving any notice, so the apps can operate in the background of your phone. You never necessarily get a pop-up letting you know that your data is being sent to another phone and watched in that way. But certainly there are steps that can be taken, and this is often how some of these stories come to light: perhaps the app is being used in a brazen way, where the abuser is expressly telling the person, “I have all your emails, I know where you’ve been,” and you start to suspect through a variety of clues that someone might have access to your data, because how else, for example, would they come to know certain pieces of information? Technically speaking, there are ways for someone looking at the phone to detect signs that an app might be operating in the background in that manner.
Jordan: Like looking at the battery, stuff like that, I guess.
Kate: It’s hard to summarize a one-size-fits-all fix for detection, and that’s certainly a complicated area. But signs that you might have something operating in the background include checking whether other devices have had access: looking in the settings of the various apps you use to see whether an unknown device has been accessing them, and whether your phone has been jailbroken. Steps that can be taken to reduce the risk include updating your operating system and enabling two-factor authentication, which essentially means that whenever your data is accessed, it has to be confirmed on a second device you control before access is granted.
Jordan: So you mentioned that some of the apps you’re looking at are marketed this way. Is that legal? This is why we’re talking to you. It doesn’t seem like it; I mean, some of the stuff you’ve already discussed is outside the law. So how are these apps created? Who’s making them?
Kate: We certainly haven’t exhaustively looked at every single app provider in the marketplace, but in the Citizen Lab reporting and research we’ve looked in particular at eight companies that offer services like this. Part of the findings of our work has been that, quite often, in fact with six of the eight companies, the advertising expressly included monitoring of a spouse or intimate partner in some form as part of the package. So that’s certainly not the entire market, but it’s certainly something that companies seem to offer as part of their service. And that raises a number of very important questions that you picked up on regarding the legality of their products, as well as how they’ve been made available in the marketplace. We had two companion reports released this week, and one of those looks at the ecosystem itself, including, for example, the companies’ advertising practices, but also a really holistic picture of the Canadian laws that might apply to this type of commercial activity. It’s funny, because we set out to identify all the different types of laws that could apply to this, and it wound up being a much bigger endeavour than we’d initially envisioned, because there are truly so many laws in Canada that have the potential to apply: from a criminal perspective, from the perspective of civil law remedies that might be available to the victims of this type of behaviour, and through the role of various regulators in Canada who might have a mandate over product safety and consumer issues, as well as privacy legislation.
Jordan: So you mentioned that all these things exist. Are any of them being used or applied now to this kind of malware?
Kate: That has been one of the major findings: even though we didn’t actually identify very many gaps in the law itself, there’s a huge disparity between what the law says about this type of service, if you could call it that, or these types of products, and how the law is responding. And I would say that it’s responding little, if at all.
Jordan: I feel like this is a common theme when we talk about crimes that occur online and the Canadian legal system.
Kate: Well, that’s a fair comment, and I would say that over the years we’ve definitely seen movement towards responding to the ways in which crimes and abuse can be perpetrated in digital landscapes, and that’s not necessarily unique to the criminal law. Often the law will try to keep up with technological change, but with modest success. When it comes to, for example, this very specific type of malware that we’ve been looking at, we haven’t been able to find any case that has landed in a criminal court relating to the use, the development, or the sale of this type of product. And that’s certainly not because the criminal laws in Canada wouldn’t clearly apply. We have outlined in our report how those laws have the clear potential to apply not only to individuals who use these apps in a malicious way, but also to the companies that develop and create the programs, as well as those who offer them for sale and profit off of them.
Jordan: Did you find any cases where it was brought to the attention of the authorities, and then no prosecution happened?
Kate: That was part of the purpose in embarking on this area of research. Over the years we’ve increasingly come to know of the extent of the problem, though perhaps you could call our knowledge the tip of the iceberg, and it’s really hard to say for sure how big the iceberg is, because we’ve heard about it through, for example, investigative journalists, other civil society actors who’ve been doing research in the area, and in particular front-line support workers who have worked with, for example, women in women’s shelters, where they go for support in the context of domestic violence and abuse. Anecdotally, we’ve heard frequent reports of these types of concerns being raised and sometimes being brought to the police, but then I have to skip to the end of the story, because we don’t know everything that happens in between: we don’t have known cases showing how they’ve been dealt with. There are any number of reasons that could explain why a case may or may not be acted on, but it’s not something I can tell you or your audience, because we just don’t know. That’s something we’re hoping to shed light on in this research, because part of the issue is certainly making sure that people understand that this is part of a broader, more systemic problem with respect to technology-facilitated abuse against women and girls, and gender-based violence more generally. So while we can certainly look to different technological fixes and ways the private sector can mitigate the problem, it’s something that needs to be looked at holistically, not only by law enforcement but by other policymakers, legislatures, and regulators in Canada.
Jordan: When you talk to people like shelter workers and other folks who are on the front lines of domestic violence, partner violence in Canada, what do they tell you about this problem specifically? Is it something they’re seeing more of? Is there a general awareness of it? Because I don’t feel like there is in the general population.
Kate: Well, the statistics and research that we do know of in Canada per se have come from front-line workers themselves in the support context. There have been some studies in Canada collecting the experiences reported through front-line workers, and certainly that’s an indicator of the problem; it’s also an indicator that further research would be useful and necessary in order to understand the full scope of the problem, how these cases are being handled, and the prevalence in Canadian society. But what these workers report, I would say primarily, is that they struggle with feeling quite overwhelmed by the problem, because numerous people are coming to them with this type of issue. They’re holding a phone, they don’t know whether they can trust their own phone or how to fix and solve the problem, and certainly support workers and social workers are not trained to be tech experts, so they themselves don’t necessarily know what advice to give. So it’s certainly an access-to-justice problem, in that we don’t have obvious solutions. When you think about it, when we need some type of support or help in our day-to-day life, one of the first places we go is our phone, and so it can create a real catch-22, in a sometimes even dangerous way: if you’re Googling, or making calls, or going to locations that reflect your growing awareness of the problem, it has the potential to put you at risk if the person is monitoring you. That’s something support workers have identified that they struggle with: what to recommend individuals do in that situation.
Jordan: So what did you and Citizen Lab come out of this research recommending? Were there some key things that really need to happen?
Kate: One area, certainly, is that phone providers and designers, as well as app providers and designers, have their own responsibility as the people creating these programs and software to turn their minds to cybersecurity from the perspective of individuals who are in vulnerable communities and who are vulnerable to this type of harm and abuse. When we think about cybersecurity, we can’t only think about it from the perspective of the foreign hacker, but also the surveillor from within our own home, and there are a number of ways that apps and phones can be designed to better protect us from this type of malware, for example. So the private sector has an important responsibility, and that certainly includes the companies who are expressly making these products available in the marketplace without, for example, even identifying on their websites how people at the targeted and victimized end of their own products can remedy the situation. That’s a clear oversight, and we’ve identified a number of ways that their business practices violate Canadian laws in that regard, including privacy legislation. In terms of other actors in the legal system, law enforcement and other regulators have the opportunity, and the laws available to them, to investigate this type of issue not only at the one-on-one individual level but also systemically, and we’ve identified a number of areas that might work towards the goal of cutting the problem off more at the source, at the company level. For example, there’s even the idea that class proceedings could be looked at as a way of securing some of the civil remedies that might be available in this context, where you have some of that first-mover problem in which the victims of this type of activity may not even know that they’re victims.
And then, you know more broadly rather than a specific fix or recommendation one of the most important ways that we identified moving forward and working towards fixing the problem is recognizing that the problem exists and connecting it to that broader source of gender based harm that does exist in our society, and so recognizing the problem is, as they say, the first step to recovery.
Jordan: Is this something you’ve seen outside of Citizen Lab as a criminal lawyer? Is it showing up in the courts now?
Kate: It certainly is. I’ve had a number of different opportunities to observe this problem in the course of my career so far, and a number of ways that the courts are slowly starting to respond. For example, when I first started out I worked as a Crown prosecutor, and there would sometimes be instances of technology-facilitated abuse that would come up in the context of a set of facts relating to another, more traditional type of offence, like assault or threats, and these facts would essentially be read in as what they call aggravating facts. But they wouldn’t necessarily be the subject of more specific charges. So you can see from that perspective a window into how other types of domestic abuse, whether it’s harassment or domestic violence and assault, often go together with technology-facilitated abuse. And while we haven’t seen criminal cases relating more specifically to this type of malware that we’ve been researching, we have seen other types of technology-facilitated offences that have been increasingly acted upon by law enforcement as well as prosecutors and the courts. A good example of that is the offence of non-consensual distribution of intimate images, which is a bit of a mouthful. But it’s commonly….
Jordan: AKA revenge porn?
Kate: That’s correct, and that label itself has a number of problems associated with it but it’s certainly how it’s commonly known, and that’s something that was literally not an offence in Canadian law until there was an amendment that did fill that legal gap, and that’s something where we’ve now seen many, many cases going through the Canadian courts.
Jordan: The first thing I thought when reading the report and listening to how you discuss the way these things are made was: is this a problem that stems from a crazily male-dominated field, making apps, marketing apps, the tech sector in general?
Kate: Well, that’s certainly something that we as a society and as security researchers have been trying to shine a light on: diversity in the profession is a very important thing, because issues like this simply aren’t necessarily crossing the minds of individuals who have the privilege of not being afraid of being the target of this type of abuse. So that’s something that certainly needs to be addressed at the company and private-sector level.
Jordan: What should someone who suspects she has malware on her phone, or even someone who confirms it, do next? Would you recommend they go immediately to the police? That they seek help in a shelter? What’s out there?
Kate: Well, I have to give a somewhat nuanced answer to this question, because part of the risk is associated with the fact that they already perceive their phone may be infected, which means some of the data they might create by trying to solve the problem could fall into the hands of the very person they have identified a safety concern about. If, for example, you’re being tracked by geolocation, going to the police station will potentially become a known fact. So risk management is entirely a core part of whatever recommendation individuals might receive when they’re speaking to support workers or going to other sources of support, and that problem can and should be managed, and other actors need to take responsibility for making these pathways to safety available, because we certainly can’t leave the burden of this problem on the shoulders of the individuals who are experiencing this type of harm. But in terms of steps that could be taken: one starts at the phone level, updating your operating system, changing all of the passwords in your apps and on your phone, and enabling, for example, two-factor authentication, which helps mitigate some of the risks that these consumer-marketplace apps present. And I’ve heard some recommend, for example, that at the end of the day, if someone’s safety is at risk, the police need to be a source of support in order to secure that safety, and so that’s something that, at the individual level, people will weigh and measure.
But another step that someone might think about taking is taking the phone to the actual phone company, for example, a Rogers store and getting sources of tech help that might not otherwise be available, and it’s not necessarily an obvious clue why you’re going to the Rogers store, for example, and so creative solutions are unfortunately what are necessary in that type of problem.
Jordan: Thanks for this Kate.
Kate: Thanks for having me.
Jordan: Kate Robertson, a research fellow at Citizen Lab and a criminal lawyer. That was The Big Story. If you want more, you can go to thebigstorypodcast.ca. You can also find us on Twitter @thebigstoryfpn, and us and all our fellow shows at frequencypodcastnetwork.com. Don’t forget to rate us and review us and subscribe for free wherever you get podcasts on Apple, on Google, on Stitcher, on Spotify. Thanks for listening. I’m Jordan Heath Rawlings, we’ll talk tomorrow.