MOUNTAIN VIEW, Calif. — The Federal Trade Commission is in advanced stages of an investigation into YouTube’s handling of videos aimed at children, according to two people with knowledge of the inquiry.
The investigation, which could result in fines against YouTube, comes after complaints by parents and consumer groups that the video giant had collected the personal data of young users.
The groups also complained that YouTube allowed harmful and adult content to appear in searches for children’s content, said the two people, who were not authorized to speak about the investigation because it was private.
Misinformation and inappropriate content also appeared in recommendation engines, according to the complaints.
The F.T.C. is pursuing its investigation of YouTube as regulators and lawmakers in Washington are signaling their interest in curbing the power and influence of some of the biggest tech companies.
The House Judiciary Committee announced a broad antitrust investigation into Big Tech earlier this month. And the two top federal antitrust agencies, the Justice Department and the F.T.C., agreed to divide oversight over Apple, Amazon, Facebook and Google as they explore whether the companies have abused their market power to harm competition and consumers.
The Washington Post was the first to report the news of the F.T.C.’s investigation of YouTube.
YouTube’s main site and app are for viewers 13 and older. The company directs younger children to the YouTube Kids app, which contains a filtered set of videos from the main site.
YouTube’s distinction between its main product and YouTube Kids is significant because of the rules on disclosure and parental consent that kick in for sites with “actual knowledge” that they are trafficking in the personal information of children under 13.
But consumer advocacy groups have argued that YouTube, which is owned by Google, is able to collect data on children under 13 through its main site, where cartoons, nursery-rhyme videos and those ever-popular toy-unboxing clips garner millions of views.
Dealing with children’s videos is particularly thorny for YouTube. Children are among the platform’s most avid users, and videos geared toward them are popular. However, YouTube has struggled to keep inappropriate content away from children’s videos, in part because of the sheer volume of videos uploaded to the platform.
In February, YouTube was rocked by a video documenting how pedophiles used the comments on videos of children to guide other predators. After brands announced plans to boycott YouTube, the company said it would disable comments on most videos featuring kids under 13 years old.
Earlier this month, The New York Times published an investigation into how YouTube’s automated recommendation system promoted videos of scantily clad children to people who had watched other videos of young children in compromising positions or sexually themed content.
In response, the company has been considering significant changes to its handling of children’s videos, including how its algorithms treat those videos, according to the two people briefed on the talks. The Wall Street Journal earlier reported the internal discussions.
“We consider lots of ideas for improving YouTube and some remain just that — ideas. Others, we develop and launch, like our restrictions to minors live streaming or updated hate speech policy,” Andrea Faville, a YouTube spokeswoman, said in a statement.
This is a developing story.