FTC investigates impact of AI chatbots on children’s safety

[Photo: Andrew N. Ferguson, Chairman, Federal Trade Commission]

The Federal Trade Commission (FTC) has launched an inquiry into the practices of seven companies that provide AI-powered chatbots, focusing on how these firms address potential risks to children and teenagers. The FTC is seeking information on how these companies measure, test, and monitor the negative impacts of their chatbot technologies.

AI chatbots use generative artificial intelligence to simulate human-like communication and relationships with users. These systems are designed to mimic human emotions and characteristics, often communicating in a way that resembles a friend or confidant. This design may encourage users—especially younger ones—to trust and form bonds with the chatbots.

The agency’s investigation aims to determine what steps have been taken by these companies to evaluate the safety of their products when used as companions, restrict usage by minors, minimize negative effects on young people, and inform both users and parents about associated risks.

“Protecting kids online is a top priority for the Trump-Vance FTC, and so is fostering innovation in critical sectors of our economy,” said FTC Chairman Andrew N. Ferguson. “As AI technologies evolve, it is important to consider the effects chatbots can have on children, while also ensuring that the United States maintains its role as a global leader in this new and exciting industry. The study we’re launching today will help us better understand how AI firms are developing their products and the steps they are taking to protect children.”

The orders were issued under the FTC's Section 6(b) authority, which allows the Commission to conduct broad studies that are not tied to a specific law enforcement action. The companies receiving orders are Alphabet Inc., Character Technologies Inc., Instagram LLC, Meta Platforms Inc., OpenAI OpCo LLC, Snap Inc., and X.AI Corp.

A key focus for the FTC is understanding how these platforms affect children specifically, what measures are being implemented to mitigate negative outcomes, and how the companies comply with regulations such as the Children's Online Privacy Protection Act (COPPA) Rule.

The commission’s inquiry covers several areas: monetization strategies for user engagement; processing of user inputs; character development processes; testing protocols before deployment; mitigation efforts regarding harm—particularly for minors; disclosure practices about features or data handling; monitoring compliance with internal rules like age restrictions; and sharing or use of personal information collected through conversations.

The Commission voted unanimously to issue the orders. Commissioners Melissa Holyoak and Mark R. Meador issued separate statements regarding the action.

Alysa Bernstein and Erik Jones of the FTC's Bureau of Consumer Protection are the staff leads on this matter.

The FTC continues its mission to promote competition while protecting consumers through education efforts available at consumer.ftc.gov. Consumers can report fraud or bad business practices at ReportFraud.ftc.gov.
