CLIP
You are listening to a Frequency podcast network production.
Jordan Heath Rawlings
A warning before we begin, today’s episode of The Big Story contains discussion of child sex abuse materials.
Jordan Heath Rawlings
I thought I understood where the really ugly and illegal stuff was lurking online. The dark web, that’s where stuff like child sexual abuse material hides. Under layers of anonymity, scrambled to keep it out of sight of the authorities, the kind of place where you have to know where to look. That’s what most of us assume. But if you could see the takedown notices issued by a Canadian project aimed at catching this stuff, you would be shocked. This is not the dark web. This stuff exists on the platforms you use every day. As our guest puts it, it’s hiding in plain sight. And yes, when it gets found, it is usually taken down, but two points about that. The first is that I said usually, which is a whole awful problem unto itself. And the second is that it really does have to be found, by this project or by somebody else, and the takedown notice has to be issued. Until it is, the site hosting this material, which again I won’t describe, but it’s the worst stuff you can think of, that site isn’t technically responsible. So how is that possible? Let’s get into exactly how this works.
Jordan Heath Rawlings
I’m Jordan Heath Rawlings. This is The Big Story. Jacques Marcoux is the Director of Research and Analytics for the Canadian Centre for Child Protection. He oversees and directs organizational research related to the sexual victimization of children. Hello Jacques.
Jacques Marcoux
Hi.
Jordan Heath Rawlings
Why don’t you just start by telling us about Project Arachnid. What is it? When did it start? Why did it start?
Jacques Marcoux
So, Project Arachnid is software to help fight back against the proliferation of child sexual abuse imagery on the internet. It’s a project that my organization launched in 2017. We have a team of developers and software engineers who write code to create crawlers that crawl the open web in search of images and videos of CSAM. By open web, I mean the publicly accessible internet, but specifically the corners of the web known to be havens for child sexual abuse material. So, we’re talking about dark web child abuse forums, these chan-style forums.
Jacques Marcoux
And even the regular web. We receive tips about suspicious links from the public, for example, which are often found on mainstream social media platforms. So, a lot of this stuff actually also hides in plain sight. Generally, the way it works is we deploy these crawlers, and when they encounter imagery, we extract each file’s unique digital fingerprint; every file, just like a human, has a unique signature. We then bounce those fingerprints off of databases of known CSAM that’s been confirmed by law enforcement or by NGOs like us. And when we find a match, our system will issue a takedown notice to the website or hosting provider.
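To make the matching step concrete, here is a minimal sketch of hash-based detection, assuming a simple exact-digest comparison. This is purely illustrative and not Project Arachnid's actual code; the function names and the example hash are hypothetical, and real systems typically also rely on perceptual hashing (PhotoDNA-style fingerprints) so that re-encoded or slightly edited copies still match.

```python
import hashlib

# Hypothetical database of fingerprints for material already confirmed
# by law enforcement or NGOs (the digest below is a placeholder).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(file_bytes: bytes) -> str:
    """Compute an exact digital fingerprint (SHA-256) of a crawled file."""
    return hashlib.sha256(file_bytes).hexdigest()

def check_crawled_file(file_bytes: bytes, source_url: str) -> bool:
    """Return True if the file matches a known fingerprint, False otherwise."""
    if fingerprint(file_bytes) in KNOWN_HASHES:
        # A real system would queue a takedown notice to the hosting
        # provider at this point.
        print(f"Match found, takedown notice queued for {source_url}")
        return True
    return False
```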
Jordan Heath Rawlings
And how often does this project detect these things? And just for anybody who’s not aware, CSAM is child sexual abuse material. How frequent is this?
Jacques Marcoux
Well, it’s a daily occurrence and I think the scale is quite shocking. We’re operating in milliseconds, we’re issuing these all the time, and on any given day we issue between 2,000 and 20,000 takedown notices to service providers worldwide. I haven’t looked at the numbers today, but we’re at about 27 million takedown notices since inception. And these are notices that have gone out to about a thousand companies in a hundred countries worldwide.
Jordan Heath Rawlings
Is that trend increasing, um, or decreasing, or is it just relatively constant?
Jacques Marcoux
That is a good question and a very difficult one to answer, because no one really has a full picture of what’s happening on the internet. We have signals: we have victimization surveys, we see what comes into our tip line from the public, we see information that comes in from law enforcement, and we also see the reports that come in from companies under mandatory reporting laws. For example, last year in the US, the big tech companies filed about 31 million reports with the National Center for Missing and Exploited Children. All of that tells us that this is a growing problem, and it’s not going away.
Jordan Heath Rawlings
You mentioned that some of this material hides in plain sight on mainstream sites. Can you give me some examples of what that looks like and how it works and, and what people might encounter?
Jacques Marcoux
Yeah. There’s this misconception that the dark web is where all of this tends to happen, and that’s really not the case. What we find is that most of this material is distributed by people uploading these images on the regular web, on file hosting services, cloud storage, or posting it directly on social media. The dark web is where they go to congregate and create these communities where they advertise, promote, and direct traffic to where these links are, often using coded language or specific hashtags. So that’s the mechanics of how most of this is happening. And then of course there’s simply, like I said, what’s posted publicly on mainstream social media, and this is often the case with images of adolescent victims. I’m sure we’ve all read about these reports in the media. I’m sure you’re well aware of the widely covered PornHub situation. And then there are also reports, like recently in the New York Times about Twitter, where they have found examples of confirmed CSAM. And like I said earlier, these tech giants themselves are reporting these figures under mandatory reporting laws.
So, for example, in 2021, Facebook filed 22 million reports of suspected CSAM, Instagram three and a half million, Snapchat half a million. So, this is happening on these platforms that we all use daily.
Jordan Heath Rawlings
And so, your software will issue automatic flags and takedown notices. What happens when those go out to these sites and companies?
Jacques Marcoux
Yeah, so in about 60 to 70% of cases, when we issue a takedown notice the content comes down right away, within 24 hours. For the remainder, and that 30 to 40% represents tens of thousands of images, it can take weeks, months, or it never comes down. Some web operators might design their sites in such a way that it’s difficult to notify them or even make them aware of it.
For example, by not having contact info, or maybe they’ll make our emails bounce back, or maybe they sit behind what’s called a content delivery network, which masks details about their server locations. Sometimes websites are just plain old bad actors, and they thumb their nose at us and say, well, we don’t abide by Canadian interpretations of CSAM. Sometimes mainstream services will also, especially with adolescent victims, argue with us about what they perceive as being the true age of the victim, even though we have age confirmation from, for example, law enforcement. We also tackle what we call harmful and abusive content. This is part of our broader child protection vision, because harm goes a long way before you actually reach what is defined as criminal. Other content we tackle could be, for example, still images of the initial frames of a CSAM video where the kid might still be wearing clothes; viewed in isolation this image isn’t actually illegal, but we feel it’s reasonable to ask for it to be removed. And I think the public would generally be very shocked by how many of these online services hold what I’ll call a fierce techno-libertarian view that glorifies a sort of wild west philosophy about content on the internet. And it’s, to be quite frank, completely at the expense of kids and also a lot of adults.
Jordan Heath Rawlings
Is that really why it would happen, because of libertarian views? Like, listen, I think we’re all well steeped by now in questions of free speech on the internet, but this is beyond that. I mean, we’re talking about sexual abuse of kids here. Why would they die on this hill, I guess, is the question.
Jacques Marcoux
Well, I think a lot of this has to do with the general laws of the land of the internet as it relates to content. In general, websites, hosting providers, and social media platforms aren’t liable for content their users upload onto their systems, and this is due to what’s called Section 230, which is part of the Communications Decency Act in the US. Many countries have over time adopted Section 230-esque rules, including Canada, by virtue of our free trade agreement with the US. At a very high level, the platforms that make this content available on their websites aren’t actually violating the law until they become aware of its presence, and that’s the point at which they have an obligation to act. Because there are no real consequences for websites for having users upload this content, as long as they take it down every time they get notified, there’s not a lot of incentive for them to do anything preventative. And again, these aren’t websites that are necessarily in Canada or North America. They’re sometimes far-flung, in Eastern Europe or on islands in the Pacific that have no laws, no regulations, no oversight. So it’s a really complicated web and there are a ton of jurisdictional issues at play. Like I said, especially with adolescent images, some people will put the onus on us and say, well, prove this is a minor. And when you’re dealing with millions of images, we kind of think the onus should be on websites to know what’s being put on their site.
Jordan Heath Rawlings
I guess I assume that a lot of the sites that won’t take down this stuff immediately, or that try to duck notification or whatever, are the more obscure ones. What about the huge sites that you mentioned, you know, the Facebooks and the Instagrams of the world, the tech giants? What are they doing, if anything, to prevent this kind of material from surfacing on their platforms? And, you know, we keep hearing about breakthroughs in programming. This seems like something where technology could be the answer now.
Jacques Marcoux
Well, here’s the thing: there’s a lot of discussion about, for example, the use of AI as a moderation tool. AI is powerful, but AI doesn’t capture a lot of the context of what’s happening. AI can’t tell whether a child is seventeen or eighteen or nineteen. Simply throwing technology at what is a human behavioral problem is not going to solve the issue, and we’ve seen this over and over. Websites need to have strict moderation practices. You have to have age verification tools. There’s a host of things they can be doing. Companies need to be using proactive detection technologies, very similar to what we use in Project Arachnid, to preemptively block the upload of known images by searching for these hashes. And some do, many do, and by many I mean the large tech companies, but they don’t necessarily deploy it on all aspects of the platform. So maybe they’ll have it on public-facing tweets or on a profile, but not in direct messaging. But a lot of this stuff is also happening on lesser websites, not just the big tech giants; it’s on this subsection of the internet, where a lot of people are going, that this happens as well. I mean, there are so many solutions, but there’s not necessarily a will to do it, because a lot of that comes at a cost. There’s a cost of doing business when you have a website that operates with user-generated content, and moderation is expensive and moderation at scale is difficult. But when you see what’s happening, the impact on people, and this is beyond just child protection. This is hate. This is terrorism. This is the impact on BIPOC, on racialized groups, on journalists. This runs the gamut. So if we look at these companies and what they are doing, I think we should expect a lot more of them.
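As a rough illustration of what proactive, upload-time detection means, here is a minimal sketch in the same spirit as the crawler example above: check an uploaded file's hash against a blocklist before it is ever published, and apply that check to every surface where content enters the platform, including direct messages. The handler and blocklist names are hypothetical, and again a production system would use perceptual hashes rather than exact digests.

```python
import hashlib

# Hypothetical blocklist of fingerprints for known abusive images,
# e.g. synced from an industry or NGO hash-sharing programme.
BLOCKED_HASHES: set[str] = set()

def handle_upload(file_bytes: bytes) -> bool:
    """Return True if the upload is accepted, False if it is blocked.

    To be effective, this check has to run wherever user content enters
    the system: public posts, profiles, and direct messages alike.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in BLOCKED_HASHES:
        # Blocked before publication; mandatory reporting obligations
        # may also apply at this point.
        return False
    return True
```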
Jordan Heath Rawlings
You mentioned that a lot of these sites and companies are often outside of Canada’s jurisdiction. What about the ones that are within it? What kind of laws regulating this exist in Canada, and what happens here?
Jacques Marcoux
So, in Canada there is a federal mandatory reporting law that requires online services to report to police when they become aware that they are hosting CSAM, physically, on servers located in Canada. There are also requirements for them to report when they become aware their services are making CSAM accessible to the public, for example, if they operate a forum and people are posting links that point to CSAM hosted elsewhere. So they also have obligations to report this. But like you said, this is an international problem, and the reality is that most of the large companies and hosting providers that Canadians interact with daily aren’t based in Canada. So, while Canadians can access the material, a lot of it isn’t actually within the direct reach of Canadian law or Canadian politics.
Jordan Heath Rawlings
What could be done here? What’s something that you know is in our power, or the government’s power, to change that would make a difference? This problem sounds so widespread, and the scale of it is like nothing I would’ve believed until we spoke. Are there any easy wins we could get?
Jacques Marcoux
Well, from our perspective, there’s a need for regulation. The feds are expected to introduce a bill this coming year; there’s been conversation about an online harms framework. This is happening in Europe, in the UK, in Australia, Germany, France. Everyone’s starting to lean into this. But, you know, if you just think about it, in all other industries in our society we have regulation. If you wanna sell a toy to kids in Canada, guess what? You’re subject to regulations under federal law. You have to prove it’s safe, not addictive, not going to poison your kid, not going to spiral into mental illness. But with social media, there are no recalls if the design of a platform is shown to cause harm or put people at risk. And so, if you and I, Jordan, sat down and had to plan out the design of an app that was safe for our kids, I think it would probably look very different than the apps that kids right now are using across Canada. You know, the other day I was watching my kid, who’s 11 and a half, running around collecting Easter eggs.
She still believes in the Easter Bunny. This is probably her last year, but it struck me that just a year and a half from now she’ll be 13, which is viewed as the age of majority on the internet as far as social media is concerned. And from there on, as it stands now, she’s able to access the same unfettered content that you and I see on social media every day. I think that should really concern us all. We don’t do this in any other area of society, but with technology, we’ve kind of thrown our hands up in the air and said, you know, we can’t control this.
Jordan Heath Rawlings
Even if we wanted to though, could we?
Jacques Marcoux
Absolutely. We are seeing it in the UK, for example: they have introduced regulation, I believe it’s called the Child Design Code, and it imposes, more than laws or regulations exactly, design standards on social media companies. So, for example, it requires prompts that may recommend you take a break, or it may turn off notifications after a certain time. And when the UK introduced this design code, what we saw in the weeks leading up to it receiving royal assent was a flurry of announcements from tech companies that they were now doing this and this and this. Of course, they don’t acknowledge that it’s related to this new regulation, but I think observers clearly recognize that it is. And so, when we see large regions of the world, like the US, the UK, the EU, assert their sovereignty over what happens within their borders and the services that are delivered to their citizens, what we see is kind of a trickle effect elsewhere. If you look at social media platforms across the world, you’ll actually see different safety and design standards in different countries, even though it’s the same platform. So there are ways to force companies to follow, you know, a set of values that we might have within our country.
Jordan Heath Rawlings
So, as I mentioned, I was blown away by the scale of this, and I simply didn’t understand what was going on and how complex the issue is. And you reached out to us, or I probably never would’ve done this episode. What is happening in the media right now, and how can journalists approach this issue differently so that we can understand the danger here and the scale of it?
Jacques Marcoux
You know, I’m a former journalist. I was a journalist for 10 years in Canada, I’ve worked on regional and national stories, and I see the requests that come in from journalists to our organization. What I find difficult is that the discourse is overwhelmingly focused on what parents should do to protect their kids instead of seeking accountability from the companies who are making the incremental design decisions each day that facilitate predation, harm, and addiction on their platforms. Media need to systematically seek comment from companies. They need to ask for specifics about what they’re doing. They need to push them on why they don’t have safety settings on by default, why it’s always an opt-in and not an opt-out. They need to ask questions of their hosting providers and their payment processors when things aren’t changing. With PornHub, for example, the New York Times did this big exposé, and then within days the credit card companies threatened to pull the plug. And then we saw rapid changes with PornHub and how they created a safer environment on their website.
So, you see the impact. There’s an entire chain of technology companies that supports any website; for a website to operate online, it requires the support of many providers. Reporters can put pressure on all of these different players, who can have a positive effect on how, quite frankly, unaccountable companies behave. Reporters really need to realign how they view these platforms. These aren’t just neutral public squares, these are marketing companies. It’s free because their business model is to generate traffic, in an environment built through whatever means or bells and whistles they can provide, and then they put ads in front of your eyes and sell that data to third parties. That’s how all these social media companies operate. These aren’t neutral town squares. So we have to change how we view these actors. And I recognize it’s difficult. I was a reporter based in Winnipeg. Try emailing Facebook, try emailing Snapchat in the US for comment when local police say this happened. Well, you know what, you have to systematically do it. And if they don’t respond, then in your report say: we sent these questions, and they declined to respond. It’s so crucial that we connect the dots.
Jordan Heath Rawlings
Do you ever worry that with Project Arachnid, your work is allowing some of these companies to abdicate that responsibility, because they can basically just wait until they get a takedown notice from you, act on those, and ignore whatever else is happening?
Jacques Marcoux
It’s a great question, and a hundred percent. Like I said, it’s a game of whack-a-mole. These companies know that unless they are aware of it, if they choose to bury their heads in the sand, there are no consequences for them. So they rely on, or wait for, groups like us, watchdogs, to make them aware, and then, yeah, they’ll take it down. We also see companies that sometimes promote child protection groups like ours as a resource for help, for example in the form of an online safety campaign. We’re clearly not going to refuse that or push back, because we’re here to help victims at the end of the day. But you can see how troubling, how insidious, that is: they have environments that facilitate harm, and then when all of this comes crashing down, it gets offloaded onto civil society to try and pick up the pieces. At the end of the day, if groups like ours aren’t acting as watchdogs and calling out this behavior, this issue doesn’t get addressed.
Jordan Heath Rawlings
So, what’s next for Project Arachnid? I know that tech moves fast and you must be constantly running to keep up. What’s coming in the future?
Jacques Marcoux
So, the problem we have now is that Project Arachnid outpaces the rate at which our analysts can review material. It’s too good at finding suspicious content online. We have a backlog of about 40 million images that need to have eyes put on them for review. We keep building a network of NGOs internationally that help us make those assessments, and we’re always looking for more support from the international community, because at the end of the day, Project Arachnid, as far as I’m concerned, is really one of the best success stories out there in this space, and it’s all happening out of Winnipeg, Manitoba.
Jordan Heath Rawlings
Jacques, thank you so much for this, thanks for the conversation. Thanks for bringing this to our attention in the first place.
Jacques Marcoux
No problem. I’m really happy you guys saw the value in this as well.
Jordan Heath Rawlings
That was Jacques Marcoux, Director of Research and Analytics for the Canadian Centre for Child Protection. That was The Big Story. For more, you can head to thebigstorypodcast.ca. You can follow us on Twitter @thebigstoryfpn and you can write to us via email at hello@thebigstorypodcast.ca. You can find The Big Story on every podcast player. If you would like it on your smart speaker, just ask it to play the Big Story podcast. Thanks for listening. I’m Jordan Heath Rawlings. We’ll talk tomorrow.