CDD Calls for Cross-Platform Safeguards to Protect Children and Adolescents
Last February, Google launched its YouTube Kids (YTK) app in the U.S., which it described as a “safe” environment for children five and younger. The app enables children to access a wide range of video content on their mobile devices. But in creating YTK, Google ignored the need to incorporate the programming and advertising safeguards for children that have long been required of U.S. television providers (both broadcast and cable). The app is filled with advertising directly integrated into the flow of programming, including long-form (30-minute) product-placement pitches and endless pitches for junk food, even though Google promised YTK would not include such ads. It also carries content inappropriate for young children, such as videos “that model unsafe behaviors such as playing with lit matches, shooting a nail gun, juggling knives, tasting battery acid, and making a noose.” CDD and other consumer groups have filed a series of complaints with the FTC about YTK. Nevertheless, the app has been extremely successful and recently launched in the U.K., Ireland, and Canada.
With children increasingly viewing content on a range of devices, especially mobile phones, there is a global digital gold rush by leading commercial providers (Amazon, Netflix, and companies from the EU) who now offer new forms of child-focused cross-platform programming. Young people, for example, generate or influence $1.2 trillion a year in revenues. But while most countries have limits and rules governing the delivery of video content to children on television, few such policies apply when the same content is delivered via mobile phones, apps, or social media. Privacy is also a concern, given the ubiquitous data collection across devices and applications. Companies such as Google may claim they are not directly gathering information on young children (to comply with COPPA, the U.S. children’s privacy law, for example). But an array of sophisticated analytic and measurement techniques is at work to help document and “monetize” how children interact with digital content.
Another issue is that parental consent can trigger ongoing data profiling and targeting, including real-time and location-aware practices whose consequences few parents understand. For example, continuous behavioral targeting can ensue if a parent agrees to have online content “personalized” for a particular child, or “free” content may be offered in exchange for the data gathering that such personalization requires. A range of powerful advertising tactics, when directed at children, requires safeguards beyond mere parental consent.
For adolescents in the U.S., there are no data or advertising safeguards: once they turn 13, teens are treated by online marketers as if they were adults. In the EU, the forthcoming GDPR raises the age requiring parental consent to 16, although Member States may set a lower age (no younger than 13). Some observers, including those backed by industry, have criticized this provision, concerned that teens will be denied access to important content and services (especially on social media).
Full article attached.