Mike Prado, Deputy Assistant Director, Homeland Security Investigations

Weekend Interview: Inside DHS's Expanding Fight Against Online Child Exploitation and AI-Driven Threats


This transcript has been edited for length and clarity.

Mike Prado is the deputy assistant director of the Cyber Crimes Center at Homeland Security Investigations, part of the Department of Homeland Security.

Has DHS always investigated cyber crimes?

It has certainly been the law enforcement leader in the federal government when it comes to online child sexual exploitation and abuse. That has since evolved into an expanded mission to investigate and prevent cyber-enabled crime in the form of business email compromise, ransomware, network intrusion, and things of that nature.

HSI is the investigative arm of DHS, so anything with a transnational nexus falls within our investigative jurisdiction, and that includes cyber crime.

Do you only investigate cyber crimes within America or outside as well?

Our primary focus is transnational. So, yes, across borders, including the cyber border. We do domestic investigations, but generally those trace back to a transnational nexus. We work very closely with our partners at the Department of Justice as well.

When it comes to child exploitation and sextortion crimes, is the problem growing?

Online child sexual exploitation and abuse have been around since the inception of the internet. That's where we originally got our foot in the door in the investigation of cybercrime. However, the exponential growth of child exploitation on online platforms has been commensurate with [increased] accessibility online.

This crime is a top priority. It's our second-largest investigative discipline, accounting for more than 1.2 million agent hours per year. Sadly, the crime continues to grow.

To give you some context, we work very closely with the National Center for Missing and Exploited Children. We have agents and analysts embedded with those folks. They receive the lion's share of cyber tip reports from private-industry social media platforms and third-party providers.

When I started in this business decades ago, about 12 million tips were coming into the cyber tip line annually. That has now grown to 36 million just this past year. There are not enough law enforcement agents, detectives, or police officers in the United States or the world to address that type of exponential growth.

Is it growing because the problem is getting worse or because law enforcement is getting better at catching and identifying it?

Bottom line, the problem's growing because as more people get online, it's more accessible. We have seen a shift from what we would call the open web to much more activity within the dark web involving child sexual exploitation and abuse. Because of social media and the increase in smart mobile devices, we've also seen a lot of self-production of child sexual exploitation and abuse material.

We've seen a recent increase in the commercial sale of child sexual abuse material, with cryptocurrency being used to purchase and obtain child exploitation material.

Generative AI is the next big wave for law enforcement. We're right in the middle of experiencing major growth in the use of generative AI tools to create, or alter, previously existing child sexual abuse material.

What are the legal issues when it comes to deepfakes and AI-generated pictures?

I'll start with the legality. There is a misconception out there that AI-generated images of child exploitation are somehow in a gray area or even legal. That could not be further from the truth. We work very closely with the Department of Justice's Child Exploitation and Obscenity Section here in Washington, D.C.

There has been extensive case law established, all the way up to the Supreme Court, determining that AI-generated images of child exploitation are, in fact, illegal and are charged under 18 U.S.C. § 1466A, an existing section on the books in the United States Code. We've worked very closely with every U.S. Attorney's office across the United States to help educate federal prosecutors on how to prosecute those cases to the fullest extent of the law.

When our agents go in and seize material from suspects' devices, not only do we find generative AI material, but we ultimately find known child exploitation material as well. It's a serious crime, and it's a growing crime.

How does AI play a role in this?

We're talking about individuals who are using open-source software, a lot of which is publicly known. It's AI software that can be used to create, from text or a picture, [anything] you can conjure up, depending on the type of software being used.

We are working with industry at the Cyber Crimes Center, including some of the best-known tech companies in the world, to try to come up with technical solutions to mitigate this problem. The fact is, though, once the genie is out of the bottle, and this is open-source information, it becomes very difficult to stop someone who is committed to and interested in producing that type of illegal material. But we will investigate them vigorously and aggressively.

Is this only a deep-fake issue or do illustrations count, and how real does it have to look?

It runs the spectrum. Now, obviously, the majority of what we're focused on here in 2024 is AI-generated. Our primary focus is not on drawings, cartoons, or things like that. It is on what generative AI software has enabled individuals with a sexual interest in children to create: images that you cannot differentiate from real ones.

How do your agents stay mentally healthy when investigating such perverse material?

Wellness and safety are the top priority for our workforce. Having been an agent who worked these cases almost exclusively for eight years at the beginning of my career, you're absolutely correct that we have to plan for trauma.

We have a robust peer-support program that we call the 'Armor program.' We require our agents who are involved in these investigations to talk to professionals about what they're experiencing. We're monitoring our workforce to ensure that they remain healthy, and we provide healthy outlets so agents don't work these cases for too long.

Nobody is forced to work on these types of investigations. It's solely voluntary. But I can tell you there's no shortage of volunteers who work this. Because while it's absolutely heart-wrenching to investigate, there's no higher calling, in my opinion, than what the men and women of our agency and law enforcement in general are doing when it comes to protecting children.

When you save a child from a home where that child is too young to speak or is mentally disabled, the trauma is worth it. It's a tough business to be in, but it's highly rewarding.

How do you find the criminals when their technology continues to advance?

It's a continuous challenge, because the technology in this field of investigation changes rapidly, not only year to year, but almost week to week. 

We're staying on top of it, not just through the purchase of certain hardware systems, but through the licenses we acquire from the private sector and the tools that we develop in-house.

For example, we talk about generative AI from an adversarial standpoint, but we're also working, within DHS and under the president's executive order on the responsible use of AI, to develop systems that can very quickly go through what is sometimes terabytes of data from a suspect. How do we quickly go through that?

To reduce trauma in our workforce, we’re trying to reduce the time that an agent or an examiner has to look at material. We are coming up with automated tools that can do that for the agent.

That's a real game-changer for us, because it enables us to very quickly turn around a forensic report for prosecution. It minimizes the amount of time that our personnel have to view this material. 

What can parents do to protect their children?

Parents and other adults are absolutely key to this. Children are online more and more. That's just the reality of the time that we live in now. 

You're opening your home to a stranger when your child is online. Having conversations with your child about the threats that exist online is absolutely critical. Keep lines of communication with your child [open], so they can come to you.

To address the exponential growth in this activity, we have come up with a first-of-its-kind, government-led public awareness campaign. At Protect.gov, there are safety tips on minimizing the opportunity for an individual to make contact with a child, on how to configure settings on social media platforms and devices, and on what to do if a child is solicited online.

What is the Hero Program?

Our Hero Program is in its 11th year. We just graduated 28 heroes this year, three weeks ago as a matter of fact, at a ceremony where I had the good fortune of speaking. We train the heroes at the Cyber Crimes Center here outside of Washington, DC.

What we do is, through a highly selective process, bring them in, provide them full training, and give them full-time oversight. They become certified computer forensic examiners, primarily dedicated to investigating child exploitation. After a year, those heroes become full-time agency employees as computer forensic examiners.

It's a tremendous program. It goes up to the GS-14 level. As far as the career path goes, there's no shortage of volunteers and applicants for that program.

It's an absolute force multiplier for what I think is probably the single biggest choke point in these investigations: the extraction and analysis of data. Once our agents go into a home or business and seize these devices, they have potential evidence to evaluate.

If an average person comes across these crimes, how should he or she report them?

There are multiple ways. If it's an emergency situation, we want them to report to their local law enforcement. That's first and foremost the way to go. 

They can also go to protect.gov, and there's a link there to provide a tip. 

HSI also has a tip line that is open 24 hours a day, seven days a week. Listeners can report a crime like online child exploitation directly at 1-877-4-HSI-TIP.
