Children’s Groups Want F.T.C. to Ban ‘Unfair’ Online Manipulation of Kids

My Talking Tom, an animated video game featuring a pet cat, is one of the most popular apps for young children. To advance through the game, youngsters must care for a wide-eyed virtual cat, earning points for each task they complete.

The app, which has been downloaded more than a billion times from the Google Play Store, also bombards children with marketing. It is crowded with ads, constantly offers players extra points in exchange for viewing ads and encourages them to buy virtual game accessories.

“Every screen has multiple traps for your little one to click on,” Josiah Ostley, a parent, wrote in a critical review of the app on the Google Play Store last month, adding that he was deleting the app.

Now some prominent children’s advocacy, privacy and health groups want to ban user-engagement techniques that, they say, unfairly steer the behavior of minors and hijack their attention. On Thursday morning, a coalition of more than 20 groups filed a petition asking the Federal Trade Commission to prohibit video games like My Talking Tom, as well as social networks like TikTok and other online services, from employing certain attention-grabbing practices that may hook children online.

In particular, the groups asked regulators to prohibit online services from offering unpredictable rewards — a technique that slot machines use — to keep children online.

The groups also asked the agency to prohibit online services from using social-pressure techniques, like displaying the number of likes that children’s social media posts garner, and endless content feeds that can cause children to spend more time online than they intended.

Some popular game apps like My Talking Tom offer children virtual rewards like extra points in exchange for watching ads.

The petition to federal regulators warned that such practices might foster or exacerbate anxiety, depression, eating disorders or self-harm among children and teenagers.

“Design features that maximize minors’ time and activities online are deeply harmful to minors’ health and safety,” the children’s activists wrote in the petition. “The F.T.C. can and must establish rules of the road to clarify when these design practices cross the line into unlawful unfairness, thus protecting vulnerable users from unfair harms.”

The coalition was led by Fairplay, a nonprofit children’s advocacy group, and the Center for Digital Democracy, a children’s privacy and digital rights group. Other signatories included the American Academy of Pediatrics and the Network for Public Education.

Outfit7, the developer of My Talking Tom, did not immediately respond to an email seeking comment.

The F.T.C. petition comes at a moment when legislators, regulators and health leaders in the United States and abroad are increasingly scrutinizing the online tracking and attention-hacking practices of popular online platforms — and trying to mitigate the potential risks to children. In doing so, they are challenging the business model of apps and sites whose main revenue comes from digital advertising.

Online services like TikTok, Instagram and YouTube routinely employ data-harvesting techniques and compelling design elements — like content-recommendation algorithms, smartphone notifications or videos that automatically play one after another — to drive user engagement. The more time people spend on an app or site, the more ads they are likely to view.

Now, legislators, regulators and children’s groups are taking a new approach to try to curb the use of such attention-hacking practices on minors. They are trying to hold online services to the same kinds of basic safety standards as the automobile industry — essentially requiring apps and sites to install the digital equivalent of seatbelts and airbags for younger users.

Last year, for instance, Britain instituted comprehensive online safeguards for young people, known as the Children’s Code. The new rules require social media and video game platforms likely to be used by minors to turn off, by default, certain features that could be detrimental to younger users — like barraging them with notifications at all hours of the night.

Before the British rules went into effect, TikTok, YouTube, Instagram and other popular services bolstered their safeguards for younger users worldwide.

In September, California also enacted a law requiring sites and apps likely to be used by minors to install wide-ranging safeguards for users under 18. Members of Congress have introduced two bills — the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act — intended to bolster online protections for youngsters.

Civil liberties experts have argued that the safeguards could also have deleterious consequences. The measures, they argue, could subject children to increased surveillance, potentially deterring vulnerable young people from finding online resources on sensitive issues like reproductive health or gender identity.

Young people themselves report mixed feelings about their online activities. In a survey of roughly 1,300 teenagers in the United States, published on Wednesday by the Pew Research Center, 80 percent said social media made them feel more connected to their friends’ lives. Yet about 30 percent said they felt that social media had a negative effect on people their age.

With Congress split after the midterm elections, the Federal Trade Commission may end up regulating attention-hacking techniques on children before federal legislators do.

In August, the agency posted a notice asking the public to weigh in on whether new rules were needed to protect consumers from “commercial surveillance” — that is, software services that amass data on individual consumers and use it to try to steer their behavior. The notice posed a series of questions specifically related to children.

“Do techniques that manipulate consumers into prolonging online activity,” such as quantifying the number of likes on social media posts, “facilitate commercial surveillance of children and teenagers?” the regulators asked. “Is it an unfair or deceptive practice when a company uses these techniques despite evidence or research linking them to clinical depression, anxiety, eating disorders or suicidal ideation among children and teenagers?”
