AP Explains: YouTube agrees to change what it shows to kids

YouTube is changing what it shows to kids as part of a settlement with the Federal Trade Commission.

The new business practices are in addition to $170 million in fines that YouTube parent Google will pay the FTC and New York state to settle charges of violating children’s online privacy by collecting personal data without parents’ permission.

Some of the new responsibility will fall on video creators themselves, who will have to label videos geared toward kids under 13.

Here’s a look at what’s behind the dispute and what’s changing.

WHAT THE LAW SAYS

The FTC’s complaint is based on a 1998 federal law called the Children’s Online Privacy Protection Act, or COPPA. It bans websites from collecting personal information from children under 13 without their parents’ consent.

Tech companies, however, have long skirted the law by saying they officially exclude kids from their services, even if they don't actually verify users' ages. A group of privacy advocates filed a formal request in April 2018 asking the FTC to investigate YouTube's compliance.

MIXED MESSAGES

YouTube has long said its service is intended for ages 13 and older, a message that theoretically kept it in line with that law.

Ask any kid or their parent, however, and the reality was far different. Younger kids commonly watch videos on YouTube, and many popular YouTube channels feature cartoons or sing-alongs made for children. YouTube acknowledged Wednesday that “the likelihood of children watching without supervision has increased” since its founding because there are more shared devices and a “boom in family content.”

The FTC’s complaint details how Google boasted about its youthful audience when talking to major advertisers. The FTC includes as evidence presentations Google made to toy companies Mattel and Hasbro, in which YouTube is described as the “new Saturday Morning Cartoons” and the “#1 website regularly visited by kids.”

WHAT ACTUALLY CHANGES

Starting early next year, anyone who uploads a video to YouTube will have to designate whether or not that video is directed at children.

If a video is identified as child-focused, such as a cartoon or the “unboxing” of a new toy, Google has agreed not to put up “behavioral” ads — those that cater to specific viewers based on their age and other social characteristics. Google also won’t track the viewers’ online identities. Google says these restrictions will be in place regardless of the viewer’s age.

But Google will still show generic ads, as well as “contextual” ads — those that cater to the type of content rather than the specific viewer. These typically don’t bring in as much money as viewer-specific ads.

And Google is stopping short of seeking parental consent on its main service, even for kids-focused video. The law doesn’t require it to, as long as there’s no data collection. Google already gets parental consent for its kids-focused service, YouTube Kids.

NEW ONUS ON CREATORS

Google says the changes to the main service will happen in four months to give video creators a chance to adjust. In taking this approach, Google is putting much of the responsibility on video creators themselves, though the company says it will also use artificial intelligence to flag content that targets children but wasn’t properly identified as such.

Those who consider the settlement too weak are already concerned about what happens when video creators try to cheat the new system.

Democratic FTC Commissioner Rebecca Kelly Slaughter, in a dissenting opinion, says high-profile companies like Hasbro and Mattel will likely comply, as they won’t want to run afoul of federal rules even if it means fewer kids seeing their toy promotions.

But she says it’s less clear how the settlement will curb abuses by the millions of others who post videos on YouTube — especially those outside the United States who are beyond the FTC’s “practical reach.”
