Jessica Melugin, Director, Competitive Enterprise Institute

Weekend Interview: Jessica Melugin Discusses Concerns Over the Kids Online Safety Act

Jessica Melugin is the Director of the Center for Technology and Innovation at the Competitive Enterprise Institute. 

This transcript has been edited for length and clarity.

Federal Newswire: What problem is the Kids Online Safety Act (KOSA) attempting to solve?

Melugin: It springs from the good intentions and concerns that people have about some of the content that kids are encountering online. Also, [from] the psychological effects and the amount of time spent online. There is also a bit of a targeted advertising concern–the idea of kids' data being used to more accurately target them with ads. But I think it's mostly a content question. 

There's probably a large swath of content that we can agree we don't want any kids exposed to. There's a large amount of content I don't want to be exposed to. But there's a lot of stuff that is in the gray area. One kid might find [information] about nutrition or fitness very educational, and then another kid who struggles with body image concerns or has an eating disorder would find it triggering.

That is a perfect example of speech that we can say is obviously constitutionally protected. But it's going to have different impacts on different people because we're individuals.

Federal Newswire: What is the concern over KOSA?

Melugin: I think it's good to give credit to a lot of interested parties here. Keeping kids safe online is a very good impulse. We should all be looking for ways to keep our kids safe. 

I'm a parent. I have two kids. We parents of America are tired...I don't know what I'm going to make for dinner tonight. There's laundry I should be doing. I know you don't need to hear one more thing [you should be doing], but you have to be in charge.

Unfortunately, a top-down solution comes with a lot of bad tradeoffs. There's no government substitute for being connected to your kids and being informed about what's going on in their lives. It's really the parents who are in the best position to manage that challenge. 

Federal Newswire: Would there be unintended consequences from passage of KOSA that are not being discussed?

Melugin: KOSA says that big online platforms have a duty of care. That means they should know which users are minors. A myriad of practical problems come with that. How will they know who is a minor and who is not a minor?

If a company is on the hook legally, [they are] going to want to know. [They will] find a way to figure that out. What that means in practice is that everyone, not just kids, is going to have to prove to these platforms that [they are] not a kid.

Maybe that involves bio[metric] identification, such as a video. Or uploading a government ID. But then there is a privacy red flag.

If we are all concerned about these companies having so much information, why mandate that everyone upload their government ID? If these companies [are required] to know who kids are, [they will be] incentivized to make age verification the norm.

On a theoretical note, [this would be] the end of a lot of anonymous speech online. 

Federal Newswire: Would the bill eliminate anonymous speech?

Melugin: That is the speech that often is in need of the most protection. If you feel like you are in danger, and are in a world where cancel culture is prevalent, do you really want to get rid of anonymous speech? What does that do to whistleblowers?

What does that do to people who have a minority opinion in a group? We don't want to put pressure on that to go away. 

A second consequence is that companies don't want to spend time litigating this stuff in court. When we talk about the easy calls of completely abhorrent content, major platforms aren't showing that. You have to ask for that kind of content. They understand they have a business incentive to keep it a place where you feel good and your kid feels good being there. 

When the calls are less clear, such as fitness or nutrition like we talked about, they're going to take it down just to be better safe than sorry. That includes an enormous amount of constitutionally protected speech that might be really useful to people. 

If I'm trying to lose 15 pounds or get stronger to make the soccer team, I might like some information on nutrition. If there are hard calls, they're just going to say it's not worth the legal cost of getting busted because they showed it to the wrong kid.

I'm not at all diminishing the possibility that a kid has a terrible reaction to something because of what they experience. [However,] legal responsibility for such a vague standard is going to result in companies just shutting down a ton of speech and restricting content. I do not think that is good for the overall health of the internet or our society. 

Federal Newswire: Would KOSA give companies a pretext to remove even more content?

Melugin: Yes. Anytime you increase the liability for these platforms, it is creating incentive for them to take down more content. I wish that was something that more people understood.

I say that regarding both sides. I would not count on my team being able to protect free speech. I would rather count on First Amendment protections to keep free speech flowing, even speech I disagree with.

That's just a much better system than putting the FTC in charge of [deciding] what's harmful and what's not. The FTC is not a child protection specialist organization. 

The other group is state attorneys general. However you feel about your state attorney general, it's a politicized position. I don't think we want people in that position making the calls on what speech is harmful or not. 

Federal Newswire: Does placing more mandates on these companies offer a greater possibility of collusion between the government and the platforms to censor content that the government doesn't like?

Melugin: The Twitter Files showed us that: ‘This is a nice platform you have here; it would be a shame if something happened to it.’

Why increase the possibility of those pressures? We should be pushing those out. We should be minimizing the government’s opportunity to decide what is appropriate for our kids and what is not. 

I have two kids and I shelter them from much of the world. But the point is, that's for me to do. There are a million decisions I make as a parent every day, and online is not any different. And there are lots of tools and lots of services at every level, not just on the platforms. 

We are concerned about our kids and what they're seeing, but it's a knee-jerk reaction to say, ‘could the federal government just fix it?’ I'm here to tell you, ‘no, they cannot.’ 

But it is fixable. It is getting easier every day, because there's such a profit incentive for people to solve this problem for parents. Once you start passing all these laws, though, that virtuous cycle of innovation gets short-circuited. 

People think the government has got it, but they don't, and I would personally like to see more innovation on this that empowers parents to make these decisions.

Federal Newswire: How does AI play a role in this debate?

Melugin: Obviously, like any technology that's ever been invented, there are risks. There will be challenges with AI. In free societies, this is what the innovative process deals with.

Nothing changes with AI and online safety. AI’s potential to help these platforms eliminate content that is inappropriate for children is phenomenal. They are already employing that. Anything we do to deter the progress of AI in a million different areas, but certainly in this one, is really a disservice that we're doing based on fear and the ‘precautionary principle.’

Federal Newswire: What is the “precautionary principle”?

Melugin: The simplest way to explain it is, you can't do anything new until you can prove that nothing bad will happen.

For anyone who's an entrepreneur, anyone who's in science or medical research, that is just the complete opposite of how learning and progress work.

That is not how we live as Americans. You cannot eliminate risk to zero, and we can deal with problems as they come up. 

Federal Newswire: What is the status of the KOSA bill? 

Melugin: It passed the Senate this past July, and then it was voted out of the House Energy and Commerce Committee. The whole party will start up again in the new Congress. I'd be surprised if it isn’t reintroduced. 

It's possible that Congress schedules it in time and holds a vote where members have to be on the record right before an election, but I doubt it.

Federal Newswire: Where can people go to follow your work?

Melugin: Go to CEI.org and Twitter, @melugin_p. 
