CDD

Publishings: Digital Youth

  • Reining In Meta’s Digital ‘Wild West’ as FTC Protects Young People’s Safety, Health and Privacy

Contacts:
Jeff Chester, CDD, 202-494-7100
David Monahan, Fairplay, 781-315-2586

Children’s advocates Fairplay and the Center for Digital Democracy respond to today’s announcement that the FTC proposes action to address Facebook’s privacy violations in practices impacting children and teens. See also the important new information compiled by Fairplay and CDD, linked below.

Josh Golin, executive director, Fairplay:

The action taken by the Federal Trade Commission against Meta is long overdue. For years, Meta has flouted the law and exploited millions of children and teens in its efforts to maximize profits, with little care for the harms faced by young users on its platforms. The FTC has rightly recognized that Meta simply cannot be trusted with young people’s sensitive data and has proposed a remedy in line with Meta’s long history of abuse of children. We applaud the Commission for its efforts to hold Meta accountable and for taking a huge step toward creating the safe online ecosystem every young American deserves.

Jeff Chester, executive director, Center for Digital Democracy:

Today’s action by the Federal Trade Commission (FTC) is a long-overdue intervention into what has become a huge national crisis for young people. Meta and its platforms are at the center of a powerful commercialized social media system that has spiraled out of control, threatening the mental health and wellbeing of children and adolescents. The company has not done enough to address the problems caused by its unaccountable data-driven commercial platforms.
Amid a continuing rise in shocking incidents of suicide, self-harm and online abuse, as well as exposés from industry “whistleblowers,” Meta is unleashing even more powerful data-gathering and targeting tactics fueled by immersive content, virtual reality and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards. Parents and children urgently need the government to institute protections for the “digital generation” before it is too late. Today’s action by the FTC limiting how Meta can use the data it gathers will bring critical protections to both children and teens. It will require Meta/Facebook to engage in a proper “due diligence” process when launching new products targeting young people, rather than its current “release first and address problems later” approach. The FTC deserves the thanks of U.S. parents and others concerned about the privacy and welfare of our “digital generation.”

NEW REPORTS:

META HAS A LONG HISTORY OF FAILING TO PROTECT CHILDREN ONLINE (from Fairplay)

META’S VIRTUAL REALITY-BASED MARKETING APPARATUS POSES RISKS TO TEENS AND OTHERS (from CDD)
  • Advocates Fairplay, Eating Disorders Coalition, Center for Digital Democracy, and others announce support of the newly reintroduced Kids Online Safety Act

Contact: David Monahan, Fairplay (david@fairplayforkids.org)

Advocates pledge support for landmark bill requiring online platforms to protect kids and teens with a “safety by design” approach

BOSTON, MA and WASHINGTON, DC — May 2, 2023 — Today, a coalition of leading advocates for children’s rights, health, and privacy lauded the introduction of the Kids Online Safety Act (KOSA), a landmark bill that would create robust online protections for children and teens. Among the advocates pledging support for KOSA are Fairplay, the Eating Disorders Coalition, the American Academy of Pediatrics, the American Psychological Association, and Common Sense.

KOSA, a bipartisan bill from Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), would make online platforms and digital providers abide by a “duty of care” requiring them to eliminate or mitigate the impact of harmful content on their platforms. The bill would also require platforms to default to the most protective settings for minors and enable independent researchers to access “black box” algorithms to assist in research on algorithmic harms to children and teens.

The reintroduction of the Kids Online Safety Act coincides with a rising tide of bipartisan support for action to protect children and teens online amid a growing youth mental health crisis.
A February report from the CDC showed that teen girls and LGBTQ+ youth are facing record levels of sadness and despair, and another report from Amnesty International indicated that 74% of youth check social media more than they’d like.

Fairplay Executive Director Josh Golin:

“For far too long, Big Tech have been allowed to play by their own rules in a relentless pursuit of profit, with little regard for the damage done to the children and teens left in their wake. Companies like Meta and TikTok have made billions from hooking kids on their products by any means necessary, even promoting dangerous challenges, pro-eating disorder content, violence, drugs, and bigotry to the kids on their platforms. The Kids Online Safety Act stands to change all that. Today marks an exciting step toward the internet every young person needs and deserves, where children and teens can explore, socialize and learn without being caught in Big Tech crossfire.”

National Alliance for Eating Disorders CEO and EDC Board Member Johanna Kandel:

“The Kids Online Safety Act is an integral first step in making social media platforms a safer place for our children. We need to hold these platforms accountable for their role in exposing our kids to harmful content, which is leading to declining mental health, higher rates of suicide, and eating disorders. As both a CEO of an eating disorders nonprofit and a mom of a young child, these new laws would go a long way in safeguarding the experiences our children have online.”

Center for Digital Democracy Deputy Director Katharina Kopp:

“The Kids Online Safety Act (KOSA), co-sponsored by Senators Blumenthal and Blackburn, will hold social media companies accountable for their role in the public health crisis that children and teens experience today. It will require platforms to make better design choices that ensure the well-being of young people.
KOSA is urgently needed to stop online companies from operating in ways that encourage self-harm, suicide, eating disorders, substance use, sexual exploitation, patterns of addiction-like behaviors, and other mental and physical threats. It also provides safeguards to address unfair digital marketing tactics. Children and teens deserve an online environment that is safe. KOSA will significantly reduce the harms that children, teens, and their families experience online every day.”

Children and Screens: Institute of Digital Media and Child Development Executive Director Kris Perry:

“We appreciate the Senators’ efforts to protect children in this increasingly complicated digital world. KOSA will allow access to critical datasets from online platforms for academic and research organizations. This data will facilitate scientific research to better understand the overarching impact social media has on child development.”

###

kosa_reintro_pr.pdf
  • Statement from Children’s Advocacy Groups on New Social Media Bill by U.S. Senators Schatz and Cotton

Washington, D.C., April 26, 2023 – Several children’s advocacy groups expressed concern today with parts of a new bill intended to protect kids and teens from online harms. The bill, “The Protecting Kids on Social Media Act,” was introduced this morning by U.S. Sens. Brian Schatz (D-HI) and Tom Cotton (R-AR).

The groups, including Common Sense Media, Fairplay, and the Center for Digital Democracy, play a leading role on legislation in Congress to ensure that tech companies, and social media platforms in particular, are held accountable for the serious and sometimes deadly harms related to the design and operation of these platforms. They said the new bill is well-intentioned in the face of a youth mental health crisis and has some features that should be adopted, but that other aspects of the bill take the wrong approach to a serious problem.

The groups said they support the bill’s ban on algorithmic recommendation systems for minors, which would prevent platforms from using minors’ personal data to amplify harmful content to them. However, they object that the bill places too many new burdens on parents, creates unrealistic bans, and institutes potentially harmful parental control over minors’ access to social media. By requiring parental consent before a teen can use a social media platform, vulnerable minors, including LGBTQ+ kids and kids who live in unsupportive households, may be cut off from access to needed resources and community. At the same time, kids and teens could pressure their parents or guardians to provide consent. And once young users make it onto a platform, they will still be exposed to addictive or unsafe design features beyond algorithmic recommendation systems, such as endless scroll and autoplay.
The bill’s age verification measures also introduce troubling implications for the privacy of all users, given the requirement for covered companies to verify the age of both adult and minor users. Despite its importance, there is currently no consensus on how to implement age verification without compromising users’ privacy.

The groups said they strongly support other legislation that establishes important guardrails on platforms and other tech companies to make the internet a healthier and safer place for kids and families, such as the Kids Online Safety Act (KOSA) and COPPA 2.0, bipartisan bills that were approved last year by the Senate Commerce Committee and are expected to be reintroduced this year.

“We appreciate Senators Schatz and Cotton's effort to protect kids and teens online and we look forward to working with them as we have with many Senators and House members over the past several years. But this is a life or death issue for families and we have to be very careful about how to protect kids online. The truth is, some approaches to the problem of online harms to kids risk further harming kids and families,” said James P. Steyer, founder and CEO of Common Sense Media. “Congress should place the onus on companies to make the internet safer for kids and teens and avoid placing the government in the middle of the parent-child relationship. Congress has many good policy options already under consideration and should act on them now to make the internet healthier and safer for kids.”

“We are grateful to Senators Schatz, Cotton, Britt and Murphy for their efforts to improve the online environment for young people but are deeply concerned their bill is not the right approach,” said Josh Golin, Executive Director of Fairplay. “Young people deserve secure online spaces where they can safely and autonomously socialize, connect with peers, learn, and explore.
But the Protecting Kids on Social Media Act does not get us any closer to a safer internet for kids and teens. Instead, if this legislation passes, parents will face the same conundrum they face today: Do they allow their kids to use social media and be exposed to serious online harms, or do they isolate their children from their peers? We need legislative solutions that put the burden on companies to make their platforms safer, less exploitative, and less addictive, instead of putting even more on parents’ plates.”

“It’s critical that social media platforms are held accountable for the harmful impacts their practices have on children and teens. However, this bill’s approach is misguided. It places too much of a burden on parents, instead of focusing on platforms’ business practices that have produced the unprecedented public health crisis that harms our children’s physical and mental well-being. Kids and teens should not be locked out of our digital worlds, but be allowed online where they can be safe and develop in age-appropriate ways. One of the unintended consequences of this bill will likely be a two-tiered online system, where poor and otherwise disadvantaged parents and their children will be excluded from digital worlds. What we need are policies that hold social media companies truly accountable, so all young people can thrive,” said Katharina Kopp, Ph.D., Deputy Director of the Center for Digital Democracy.

schatz-cotton_bill_coalition_statement.pdf
  • Citing research that illustrates a number of serious risks to children and teens in the Metaverse, advocates say Meta must wait for more research and root out dangers before targeting youth in VR.

BOSTON, MA, WASHINGTON, DC and LONDON, UK — Friday, April 14, 2023 — Today, a coalition of over 70 leading experts and advocates for health, privacy, and children’s rights is urging Meta to abandon plans to allow minors between the ages of 13 and 17 into Horizon Worlds, Meta’s flagship virtual reality platform. Led by Fairplay, the Center for Digital Democracy (CDD), and the Center for Countering Digital Hate (CCDH), the advocates underscored the dearth of research on the impact of time spent in the Metaverse on the health and wellbeing of youth, as well as the company’s track record of putting profits ahead of children’s safety.

The advocates’ letter maintained that the Metaverse is already unsuitable for use by children and teens, citing March 2023 research from CCDH which revealed that minors already using Horizon Worlds were routinely exposed to harassment and abuse—including sexually explicit insults and racist, misogynistic, and homophobic harassment—and other offensive content.
In addition to the existing risks present in Horizon Worlds, the advocates’ letter outlined a variety of potential risks facing underage users in the Metaverse, including magnified risks to privacy through the collection of biomarkers, risks to youth mental health and wellbeing, and the risk of discrimination, among others.

In addition to Fairplay, CDD, and CCDH, the 36 organizations signing on include Common Sense Media, the Electronic Privacy Information Center (EPIC), Public Citizen, and the Eating Disorders Coalition. The 37 individual signatories include Richard Gephardt of the Council for Responsible Social Media, former Member of Congress and House Majority Leader; Sherry Turkle, MIT Professor and author of Alone Together and Reclaiming Conversation; and social psychologist and author Jonathan Haidt.

Josh Golin, Executive Director, Fairplay:

“It's beyond appalling that Mark Zuckerberg wants to save his failing Horizon Worlds platform by targeting teens. Already, children are being exposed to homophobia, racism, sexism, and other reprehensible content on Horizon Worlds. The fact that Mr. Zuckerberg is even considering such an ill-formed and dangerous idea speaks to why we need Congress to pass COPPA 2.0 and the Kids Online Safety Act.”

Katharina Kopp, PhD, Deputy Director, Center for Digital Democracy:

“Meta is demonstrating once again that it doesn’t consider the best interest of young people when it develops plans to expand its business operations. Before it considers opening its Horizon Worlds metaverse operation to teens, it should first commit to fully exploring the potential consequences. That includes engaging in an independent and research-based effort addressing the impact of virtual experiences on young people’s mental and physical well-being, privacy, safety, and potential exposure to hate and other harmful content.
It should also ensure that minors don’t face forms of discrimination in the virtual world, which tends to perpetuate and exacerbate ‘real life’ inequities.”

Mark Bertin, MD, Assistant Professor of Pediatrics at New York Medical College, former Director of Developmental Behavioral Pediatrics at the Westchester Institute for Human Development, and author of The Family ADHD Solution, Mindful Parenting for ADHD, and How Children Thrive:

“This isn't like the panic over rock and roll, where a bunch of old folks freaked out over nothing. Countless studies already describe the harmful impact of Big Tech products on young people, and it’s worsening a teen mental health crisis. We can't afford to let profit-driven companies launch untested projects targeted at kids and teens and let families pick up the pieces after. It is crucial for the well-being of our children that we understand what is safe and healthy first.”

Imran Ahmed, CEO of the Center for Countering Digital Hate:

“Meta is making the same mistake with Horizon Worlds that it made with Facebook and Instagram. They have prioritized profit over safety in their design of the product, failed to provide meaningful transparency, and refused to take responsibility for ensuring worlds are safe, especially for children.

“Yet again, their aim is speed to market in order to achieve monopoly status – rather than building truly sustainable, productive and enjoyable environments in which people feel empowered and safe.

“Whereas, to some, ‘move fast and break things’ may have appeared swashbuckling from young startup entrepreneurs, it is a brazenly irresponsible strategy coming from Meta, one of the world’s richest companies. It should have learned lessons from the harms its earlier products imposed on society, our democracies and our citizens.”

horizonletter.pdf
  • Reports indicate FTC plans to advance case against Amazon for violation of kids’ privacy after advocates’ 2019 complaint.

BOSTON, MA and WASHINGTON, DC — Friday, March 31, 2023 — Following a groundbreaking investigation of Amazon’s Echo Dot Kids by Fairplay and the Center for Digital Democracy (CDD), the Federal Trade Commission is preparing to refer a case against Amazon to the Department of Justice over the company’s violations of children’s privacy law. According to new reporting from Politico, the case centers on Amazon’s violations of the Children’s Online Privacy Protection Act (COPPA) through its Alexa voice assistant.

In 2019, privacy advocates Fairplay and CDD called for the FTC to take action against Amazon after an investigation of the company’s Echo Dot Kids smart home assistant, a candy-colored version of Amazon’s flagship home assistant with Alexa voice technology. The investigation revealed a number of shocking illegal privacy violations, including Amazon’s indefinite retention of kids’ sensitive data even after parents requested that it be deleted. Now, reports indicate that the FTC is acting on the advocates’ calls for investigation.

“We’re thrilled that the Federal Trade Commission and Department of Justice are close to taking action against Amazon for its egregious violations of children’s privacy,” said Josh Golin, Executive Director of Fairplay. “We know it’s not just social media platforms and apps that misuse children’s sensitive data. This landmark case would be the first time the FTC sanctioned the maker of a voice-enabled device for flouting COPPA. Amazon and its Big Tech peers must learn that COPPA violations are not just a cost of doing business.”

“It is time for the FTC to address the rampant commercial surveillance of children via Internet of Things (IoT) devices, such as Amazon’s Echo, and enforce existing law,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy.
“Children are giving away sensitive personal data on a massive scale via IoT devices, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits. These data practices violate children’s privacy, manipulate children into being interested in harmful products, undermine their autonomy, and perpetuate discrimination and bias. Both the FTC and the Department of Justice must hold Amazon accountable.”

[see attached for additional comments]

ftc_amazon_investigation_statement_fairplay_cdd.pdf
  • Walmart Deceptively Marketing to Kids on Roblox, Consumer Advocates Urge Action

MADISON, CONN. January 23, 2023 – A coalition of advocacy groups led by ad watchdog truthinadvertising.org (TINA.org) is urging the Children’s Advertising Review Unit (CARU) – part of BBB National Programs – to immediately audit the Walmart Universe of Play advergame, a recent addition to the self-regulatory group’s COPPA Safe Harbor Program and bearer of one of the Program’s certification seals. According to a letter from TINA.org, Fairplay, the Center for Digital Democracy and the National Association of Consumer Advocates, a copy of which was sent to Walmart, Roblox and the FTC, the retail giant is exposing children to deceptive marketing on Roblox, the online gaming and creation platform used by millions of kids on a daily basis.

Walmart’s first foray into the Roblox metaverse came last September, when it premiered two experiences, Walmart Universe of Play and Walmart Land, which collectively have been visited more than 12 million times. Targeted at – and accessible to – young children on Roblox, Universe of Play features virtual products and characters from L.O.L. Surprise!, Jurassic World, Paw Patrol, and more, and is advertised as letting kids play with the “year’s best toys” and make a “wish list” of toys that can then be purchased at Walmart.

As the consumer groups warn, Walmart completely blurs the distinction between advertising content and organic content, and simultaneously fails to provide clear or conspicuous disclosures that Universe of Play (or content within the virtual world) is advertising.
In addition, as kids’ avatars walk through the game, they are manipulated into opening additional undisclosed advertisements disguised as surprise wrapped gifts. To make matters worse, Walmart is using the CARU COPPA Safe Harbor Program seal to convey the false message that its children’s advergame complies not only with COPPA (the Children’s Online Privacy Protection Act) but also with CARU's Advertising Guidelines and truth-in-advertising laws, and to serve as a shield against enforcement action.

“Walmart’s brazen use of stealth marketing directed at young children who are developmentally unable to recognize the promotional content is not only appalling, it’s deceptive and against truth-in-advertising laws. We urge CARU to take swift action to protect the millions of children being manipulated by Walmart on a daily basis.” – Laura Smith, TINA.org Legal Director

“Walmart's egregious and rampant manipulation of children on Roblox – a platform visited by millions of children every day – demands immediate action. The rise of the metaverse has enabled a new category of deceptive marketing practices that are harmful to children. CARU must act now to ensure that children are not collateral damage in Walmart's digital drive for profit.” – Josh Golin, Executive Director, Fairplay

“Walmart’s and Roblox’s practices demonstrate that self-regulation is woefully insufficient to protect children and teens online. Today, young people are targeted by a powerful set of online marketing tactics that are manipulative, unfair, and harmful to their mental and physical health. Digital advertising operates in a ‘wild west’ world where anything goes in terms of reaching and influencing the behaviors of kids and teens.
Congress and the Federal Trade Commission must enact safeguards to protect the privacy and well-being of a generation of young people.” – Katharina Kopp, Director of Policy, Center for Digital Democracy

To read more about Walmart’s deceptive marketing on Roblox see: /articles/tina-org-urges-action-against-walmarts-undisclosed-advergame-on-roblox

About TINA.org (truthinadvertising.org): TINA.org is a nonprofit organization that uses investigative journalism, education, and advocacy to empower consumers to protect themselves against false advertising and deceptive marketing.

About Fairplay: Fairplay is the leading nonprofit organization committed to helping children thrive in an increasingly commercialized, screen-obsessed culture, and the only organization dedicated to ending marketing to children.

About the Center for Digital Democracy: The Center for Digital Democracy is a nonprofit organization using education, advocacy, and research into commercial data practices to ensure that digital technologies serve and strengthen democratic values, institutions, and processes.

About the National Association of Consumer Advocates: The National Association of Consumer Advocates is a nonprofit association of more than 1,500 attorneys and consumer advocates committed to representing consumers’ interests.

For press inquiries contact: Shana Mueller at 203.421.6210 or press@truthinadvertising.org.

walmart_caru_press_release_final.pdf
  • Josh Golin, executive director, Fairplay:

The FTC’s landmark settlement against Epic Games is an enormous step toward creating a safer, less manipulative internet for children and teens. Not only is the Commission holding Epic accountable for violating COPPA by illegally collecting the data of millions of under-13-year-olds, but the settlement is also a shot across the bow against game makers who use unfair practices to drive in-game purchases by young people. The settlement rightly recognizes not only that unfair monetization practices harm young people financially, but that design choices used to drive purchases subject young people to a wide array of dangers, including cyberbullying and predation.

Today’s breakthrough settlement underscores why it is so critical that Congress pass the privacy protections for children and teens currently under consideration for the Omnibus bill. These provisions give teens privacy rights for the first time, address unfair monetization by prohibiting targeted advertising, and empower regulators by creating a dedicated youth division at the FTC.

Jeff Chester, executive director, Center for Digital Democracy:

Through this settlement with Epic Games, the FTC has used its vital power to regulate unfair business practices to extend long-overdue and critically important online protections to teens. This tells online marketers that from now on, teenagers cannot be targeted using unfair and manipulative tactics designed to take advantage of their young age and other vulnerabilities. Kids should also have their data privacy rights better respected through this enforcement of the federal kids’ data privacy law (COPPA). Gaming is a “wild west” when it comes to data gathering and online marketing tactics, placing young people, among the half of the US population who play video games, at especially great risk.
While today’s FTC action creates new safeguards for young people, Congress has a rare opportunity to pass legislation this week ensuring all kids and teens have strong digital safeguards, regardless of what online service they use.
  • A coalition of more than 100 organizations is sending two letters to Congress urging action. A letter addressed to Senate Majority Leader Chuck Schumer and Minority Leader Mitch McConnell, from 145 organizations, urges them to advance KOSA and COPPA to full Senate votes. A letter addressed to House Energy and Commerce Chair Frank Pallone and Ranking Member Cathy McMorris Rodgers, from 158 organizations, urges them to introduce a House companion bill to KOSA.

The advocates state in the letter to the Senate: “The enormity of the youth mental health crisis needs to be addressed as the very real harms of social media are impacting our children today. Taken together, the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act would prevent online platforms from exploiting young users’ developmental vulnerabilities and targeting them in unfair and harmful ways.”

kosa_coppa_senate_leadership_letter_final_9.12.22-1.pdf, eandc_leadership_kosa_letter_final_9.12.22-1.pdf, kosa_coppa_rally_press_release_embargo_to_9_13.pdf
  • Press Statement regarding today’s FTC Notice of Proposed Rulemaking on Commercial Surveillance and Data Security

Katharina Kopp, Deputy Director, Center for Digital Democracy:

Today, the Federal Trade Commission issued its long-overdue advance notice of proposed rulemaking (ANPRM) regarding a trade regulation rule on commercial surveillance and data security. The ANPRM aims to address the prevalent and increasingly unavoidable harms of commercial surveillance. Civil society groups, including civil rights groups, privacy and digital rights advocates, and children’s advocates, had previously called on the commission to initiate this trade regulation rule to address the commission’s decades-long failure to rein in predatory corporate practices online. CDD has called on the commission repeatedly over the last two decades to address the out-of-control surveillance advertising apparatus that is the root cause of increasingly unfair, manipulative, and discriminatory practices harming children, teens, and adults, and which has a particularly negative impact on equal opportunity and equity.

The Center for Digital Democracy welcomes this important initial step by the commission and looks forward to working with the FTC. CDD urges the commission to move forward expeditiously with the rulemaking and to ensure fair participation of stakeholders, particularly those that are disproportionately harmed by commercial surveillance.

press_statement_8-11fin.pdf
  • CDD Comments to FTC for "Stealth" Marketing Inquiry

The Center for Digital Democracy (CDD) urges the FTC to develop and implement a set of policies designed to protect minors under 18 from being subjected to a host of pervasive, sophisticated and data-driven digital marketing practices. Children and teens are targeted by an integrated set of online marketing operations that are manipulative, unfair, invasive and can be especially harmful to their mental and physical health. The commission should make abundantly clear at the forthcoming October workshop that it understands that the many problems generated by contemporary digital marketing to youth transcend narrow categories such as “stealth advertising” and “blurred content.” Nor should it propose “disclosures” as a serious remedy, given the ways advertising is designed using data science, biometrics, social relationships and other tactics. Much of today’s commercially supported online system is purposefully developed to operate as “stealth”—from product development, to deployment, to targeting, tracking and measurement. Age-based models of children’s cognitive capacity to deal with advertising, largely based on pre-digital (especially TV) research, simply don’t correspond to the methods used to market to young people today. CDD calls on the commission to acknowledge that children and teenagers have been swept into a far-reaching commercial surveillance apparatus.

The commission should propose a range of safeguards to protect young people from the current “wild west” of omnichannel marketing directed at them.
These safeguards should address, for example: the role that market research and testing of child- and teen-directed commercial applications and messaging play in the development of advertising; how neuromarketing practices designed to leverage a young person’s emotions and subconscious are used to deliver “implicit persuasion”; the integration by marketers and platforms of “immersive” applications, including augmented and virtual reality, designed to imprint brand and other commercial messages; the array of influencer-based strategies, including the extensive infrastructure used by platforms and advertisers to deliver, track and measure their impact; the integration of online marketing with Internet of Things objects, including product packaging and the role of QR codes (experiential marketing), and digital out-of-home advertising screens; as well as contemporary data marketing operations that use machine learning and artificial intelligence to open up new ways for advertisers to reach young people online. AI services increasingly deliver personalized content online, further automating the advertising process to respond in real time.

It is also long overdue for the FTC to investigate and address how online marketing targets youth of color, who are subjected to a variety of advertising practices little examined by privacy and other regulators.

The FTC should use all its authority and power to stop data-driven surveillance marketing to young people under 18; end the role sponsored influencers play; enact rules designed to protect the online privacy of teens 13-17, who are now subjected to ongoing tracking by marketers; and propose policies to redress the core methods employed by digital advertisers and online platforms to lure both children and teens. For more than 20 years, CDD and its allies have urged the FTC to address the ways digital marketing has undermined consumer protection and privacy, especially for children and adolescents.
Since the earliest years of the commercial internet, online marketers have focused on young people, both for the revenues they deliver as well as to secure loyalty from what the commercial marketing industry referred to as “native” users. The threat to their privacy, as well as to their security and well-being, led to the complaint our predecessor organization filed in 1996, which spurred the passage of the Children’s Online Privacy Protection Act (COPPA) in 1998. COPPA has played a modest role in protecting some younger children from experiencing the totality of the commercial surveillance marketing system. However, persistent failures of the commission to enforce COPPA; the lack of protections for adolescents (despite decades-long calls by advocates for the agency to act on this issue); and a risk-averse approach to addressing the methods employed by the digital advertising industry, even when applied to young people, have created ongoing threats to their privacy, consumer protection and public health. In this regard, we urge the commission to closely review the comments submitted in this proceeding by our colleague Fairplay and allies. We are pleased Fairplay supports these comments. If the FTC is to confront how the forces of commercial digital surveillance impact the general public, the building blocks to help do so can be found in this proceeding. Young people are exposed to the same unaccountable forces that are everywhere online: a largely invisible, ubiquitous, and machine-intelligence-driven system that tracks and assesses our every move, using an array of direct and indirect techniques to influence behaviors. If done correctly, this proceeding can help inform a larger policy blueprint for what safeguards are needed—for young people and for everyone else. The commission should start by reviewing how digital marketing and data-gathering advertising applications are “baked in” at the earliest stages of online content and device development.
These design and testing practices have a direct impact on young people. Interactive advertising standards groups assess and certify a host of approved ad formats, including for gaming, mobile, native advertising, and streaming video. Data practices for digital advertising, including ways that ads are delivered through the behavioral/programmatic surveillance engines, as well as their measurement, are developed through collaborative work involving trade organizations and leading companies. Platforms such as Meta, as well as ad agencies, adtech companies, and brands, also have their own variations of these widely adopted formats and approaches. The industry-operated standards process for identifying new methods for digital advertising, including the real-world deployment of applications such as “playable” ads or the ways advertisers can change their personalized messaging in real time, has never been seriously investigated by the commission. A review of the companies involved shows that many are engaged in digital marketing to young people. Another critical building block of contemporary digital marketing to address when dealing with youth-directed advertising is the role of “engagement.” As far back as 2006, the Interactive Advertising Bureau (IAB) recognized that to effectively secure the involvement of individuals with marketing communications, at both the subconscious and conscious levels, it was necessary to define and measure the concept of engagement.
IAB initially defined “Engagement… [as] turning on a prospect to a brand idea enhanced by the surrounding context.” By 2012, there were more elaborate definitions identifying “three major forms of engagement… cognitive, physical and emotional.” A set of corresponding metrics, or measurement tools, was used, including those tracking “attention” (“awareness, interest, intention”); emotional and motor functioning identified through biometrics (“heart palpitations, pupil dilation, eye tracking”); and omnipresent tracking of online behaviors (“viewability and dwell time, user initiated interaction, clicks, conversions, video play rate, game play”). Today, research and corresponding implementation strategies for engagement are an ongoing feature of the surveillance-marketing economy. This includes conducting research and implementing data-driven and other ad strategies targeting children 11 and younger—known as “Generation Alpha”—and teens—“Generation Z.” We will briefly highlight some crucial areas this proceeding should address: Marketing and product research on children and adolescents: An extensive system designed to ensure that commercial online content, including advertising and marketing, effectively solicits the interest and participation of young people is a core feature of the surveillance economy. A host of companies are engaged in multi-dimensional market research, including panels, labs, platforms, streaming media companies, studios and networks, that have a direct impact on the methods used to advertise and market to youth. CDD believes that such product testing, which can rely on a range of measures designed to promote “implicit persuasion,” should be considered an unfair practice generally. Since CDD and U.S.
PIRG first urged the commission to investigate neuromarketing more than a decade ago, this practice has expanded in ways that enable it to play a greater role influencing how content and advertising is delivered to young people. For example, MediaScience (which began as the Disney Media and Advertising Lab) serves major clients including Disney, Google, Warner Media, TikTok, Paramount, Fox and Mars. It conducts research for platforms and brands using such tools as “neurometrics (skin conductivity and heart rate), eye tracking, facial coding, and EEGs,” among others, that assess a person’s responses across devices. Research is also conducted outside of the lab setting, such as directly through a subject’s “actual Facebook feed.” It has a panel of 80,000 households in the U.S., where it can deliver digital testing applications using a “variety of experimental designs… facilitated in the comfort of people’s homes.” The company operates a “Kids” and “Teens” media research panel. Emblematic of the far-reaching research conducted by platforms, agencies and brands, in 2021 TikTok’s “Marketing Science team” commissioned MediaScience to use neuromarketing research to test “strong brand recall and positive sentiment across various view durations.” The findings indicated that “ads on TikTok see strong brand recall regardless of view duration…. Regardless of how long an ad stays on screen, TikTok draws early attention and physiological engagement in the first few seconds.” NBCUniversal is one of the companies leveraging the growing field of “emotional analytics” to help advance advertising for streaming and other video outlets.
Comcast’s NBCU is using “facial coding and eye-tracking AI to learn an audience’s emotional response to a specific ad.” Candy company Mars recently won a “Best Use of Artificial Intelligence” award for its “Agile Creative Expertise” (ACE) tool, which “tracks attentional and emotional response to digital video ads.” Mars is partnering with neuromarketer Realeyes to “measure how audience’s attention levels respond as they view Mars' ads. Knowing what captures and retains attention or even what causes distraction, generated intelligence that enabled Mars to optimize the creative itself or the selection of the best performing ads across platforms including TikTok, Facebook, Instagram and YouTube.” TikTok, Meta/Facebook, and Google have all used a variety of neuromarketing measures. The Neuromarketing Science and Business Association (NMSBA) includes many of the leading companies in this field as members. There is also an “Attention Council” within the digital marketing industry to help advance these practices, involving Microsoft, Mars, Coca-Cola, AB/InBev, and others. A commercial research infrastructure provides a steady drumbeat of insights so that marketers can better target young people on digital devices. Children’s streaming video company Wildbrain, for example, partnered with Ipsos for its 2021 research report, “The Streaming Generation,” which explained that “Generation Alpha [is] the most influential digital generation yet…. They have never known a world without digital devices at their fingertips, and for Generation Alpha (Gen A), these tech-first habits are now a defining aspect of their daily lives.” More than 2,000 U.S. parents and guardians of children 2-12 were interviewed for the study, which found that “digital advertising to Gen A influences the purchasing decisions of their parents….
Their purchasing choices, for everything from toys to the family car, are heavily influenced by the content kids are watching and the ads they see.” The report explains that among the “most popular requests” are toys, digital games, clothing, tech products and “in-game currencies” for Roblox and Fortnite. Measures of children’s and teens’ “brand love,” such as “Kidfinity” and “Teenfinity” scores—“proprietary measures of brand awareness, popularity and love”—are regularly provided to advertisers. Other market researchers, such as Beano Studios, offer a “COPPA-compliant” “Beano Brain Omnibus” website that, through “games, quizzes, and bespoke questions” for children and teens, “allows brands to access answers to their burning questions.” These tools help marketers better identify, for example, the sites—such as TikTok—where young people spend time. Among the other services Beano provides, which reflect many other market-research companies’ capabilities, are “Real-time UX/UI and content testing—in the moment, digital experience exploration and evaluation of brands websites and apps with kids and teens in strawman, beta or live stages,” and “Beano at home—observing and speaking to kids in their own homes. Learning how and what content they watch.” Adtech and other data marketing applications: In order to conduct any “stealth” advertising inquiry, the FTC should review the operations of contemporary “Big Data”-driven ad systems that can impact young people. For example, Disney has an extensive and cutting-edge programmatic apparatus called DRAX (Disney Real-Time Ad Exchange) that is delivering thousands of video-based campaigns. DRAX supports “Disney Select,” a “suite of ad tech solutions, providing access to an extensive library of first-party segments that span the Disney portfolio, including streaming, entertainment and sports properties…. Continuously refined and enhanced based on the countless ways Disney connects with consumers daily.
Millions of data inputs validated through data science…. Advertisers can reach their intended audiences by tapping into Disney’s proprietary Audience Graph, which unifies Disney’s first party data and audience modeling capabilities….” As of March 2022, Disney Select contained more than 1,800 “audience segments built from more than 100,000 audience attributes that fuel Disney’s audience graph.” According to Disney Advertising, its “Audience Graph” includes 100 million households, 160 million connected TV devices and 190 million device IDs, which enables modeling to target households and families. Children and teens are a core audience for Disney, and millions of their households receive its digital advertising. Many other youth-directed leading brands have developed extensive internal adtech applications designed to deliver ongoing and personalized campaigns. For example, Pepsi, Coca-Cola, McDonald’s, and Mondelez have in-house capabilities and extensive partnerships that create targeted marketing to youth and others. The ways that “Big Data” analytics affect marketing, especially how insights can be used to target youth, should be reviewed. Marketers will tell the FTC that they target only people 18 and over, but examining their actual targets, and requesting the child-related brand-safety data they collect, should give the agency a robust response to such claims. New methods to leverage a person’s informational details and then target them, especially without “cookies,” require the FTC to address how they are being used to market to children and teens. This review should also be extended to “contextual” advertising, since that method has been transformed through the use of machine learning and other advanced tactics—called “Contextual 2.0.” Targeting youth of color: Black, Hispanic, Asian-American and other “multicultural” youth, as the ad industry has termed it, are key targets for digital advertising.
An array of research, techniques, and services is focused on these young people, whose behaviors online are closely monitored by advertisers. A recent case study to consider is the McDonald’s U.S. advertising campaign designed to reverse its “decline with multicultural youth.” The goal of its campaign involving musician Travis Scott was to “drive penetration by bringing younger, multicultural customers to the brands… and drive immediate behavior too.” As a case study explains, “To attract multicultural youth, a brand… must have cultural cachet. Traditional marketing doesn’t work with them. They don’t watch cable TV; they live online and on social media, and if you are not present there you’re out of sight, out of mind.” It’s extremely valuable to identify some of the elements involved in this case, which are emblematic of the integrated set of marketing and advertising practices that accompany so many campaigns aimed at young people. These included working with a celebrity/influencer who is able to “galvanize youth and activate pop culture”; offering “coveted content—keepsakes and experiences to fuel the star’s fanbase, driving participation and sales”; employing digital strategies through a proprietary (and data-collecting) “app to bring fans something extra and drive digital adoption”; and focusing on “affordability”—to ensure “youth with smaller wallets” would participate. To illustrate how expenditures for paid advertising are much less relevant with digital marketing, McDonald’s explains that “Before a single dollar had been spent on paid media, purely on the strength of a few social posts by McDonald’s and Travis Scott, and reporting in the press, youth were turning up at restaurants across the country, asking for the Travis Scott meal.” This campaign was a significant financial success for McDonald’s.
Its partnership with this influencer was effective as well in terms of “cultural response: hundreds of thousands of social media mentions and posts, fan-art and memes, unboxing videos of the meal…, fans selling food and stolen POS posters on eBay…, the multi merch drops that sold out in seconds, the framed receipts.” Online ads targeted to America’s diverse communities of young people, who may also be members of at-risk groups (due to finances, health, and the like), have long required an FTC investigation. The commission should examine the data-privacy and marketing practices on these sites, including those that communicate via languages other than English. Video and Video Games: Each of these applications has developed an array of targeted advertising strategies to reach young people. Streaming video is now a part of the integrated surveillance-marketing system, creating a pivotal new place to reach young people, as well as generate data for further targeting. Children and teens are viewing video content on Smart TVs, other streaming devices, mobile phones, tablets, and computers. Data about households where young people reside, amplified through the use of a growing number of “identity” tools that permit cross-device tracking, enables an array of marketing practices to flourish. The commission should review the data-gathering, ad-formatting, and other business practices that have been identified for these “OTT” services and how they impact children and teens. There are industry-approved ad-format guidelines for digital video and Connected TV. Digital video ads can use “dynamic overlays,” “shoppable and actionable video,” “voice-integrated video ads,” “sequential CTV creative,” and “creative extensions,” for example.
Such ad formats and preferred practices are generally not vetted in terms of how they impact the interests of young people. Advertisers have strategically embedded themselves within the video game system, recognizing that it’s a key vantage point to surveil and entice young people. One leading quick-service restaurant chain that used video games to “reach the next generation of fast-food fans” explained that “gaming has become the primary source of entertainment for the younger generation. Whether playing video games or watching others play games on social platforms, the gaming industry has become bigger than the sports and music industries combined. And lockdowns during the global pandemic accelerated the trend. Gaming is a vital part of youth culture.” Illustrating that marketers understand that traditional paid advertising strategies aren’t the most effective to reach young people, the fast-food company decided to “approach gaming less like an advertising channel and more like an earned social and PR platform…. [V]ideo games are designed as social experiences.” As Insider Intelligence/eMarketer reported in June 2022, “there’s an ad format for every brand” in gaming today, including interstitial ads, rewarded ads, offerwalls, programmatic in-game ads, product placement, advergames, and “loot boxes.” There is also an “in-game advertising measurement” framework, recently released for public comment by the IAB and the Media Ratings Council. This is another example where leading advertisers, including Google, Microsoft, PepsiCo and Publicis, are determining how “ads that appear within gameplay” operate. These guidelines will impact youth, as they will help determine the operations of such ad formats as “Dynamic In-Game Advertising (DIGA)—Appear inside a 3D game environment, on virtual objects such as billboards, posters, etc.
and combine the customization of web banners where ads rotate throughout the play session”; and “Hardcoded In-Game Ad Objects: Ads that have not been served by an ad server and can include custom 3D objects or static banners. These ads are planned and integrated into a video game during its design and development stage.” Leading advertising platforms such as Amazon sell packages of video ads that reach both streaming TV and gaming audiences. The role of gaming and streaming should be a major focus in October, as well as in any commission follow-up report. Influencers: What was once largely celebrity-based or word-of-mouth style endorsement has evolved into a complex system including “nano-influencers” (between 1,000 and 10,000 followers); micro-influencers (between 10,000 and 100,000); macro-influencers (between 100,000 and a million); and mega or celebrity influencers (1 million-plus followers). According to a recent report in the Journal of Advertising Research, “75 percent of marketers are now including social-media influencers in their marketing plans, with a worldwide market size of $2.3 billion in 2020.” Influencer marketing is also connected to social media marketing generally, where advertisers and others have long relied on a host of surveillance-related systems to “listen,” analyze and respond to people’s social online communications. Today, a generation of “content creators” (aka influencers) is lured into becoming part of the integrated digital sales force that sells to young people and others. From “unboxing videos” and “virtual product placement” in popular content, to “kidfluencers” like Ryan’s World and “brand ambassadors” lurking in video games, to favorite TikTok creators pushing fast food, this form of digital “payola” is endemic online. Take Ryan’s World. His “more than one billion views” on YouTube, as well as a Nickelodeon show, have “catapulted him... to a global multi-category force,” notes his production and licensing firm.
The deals include a “preschool product line” in multiple categories, “best in class” partnerships, and a “Tag with Ryan” app that garnered 16 million downloads. Brands seeking help selling products, says Ryan’s media agency, “can connect with its kid fanbase of millions that leverages our world-class portfolio of kid-star partners to authentically and seamlessly connect your brand with Generation Alpha across YouTube, social media, mobile games, and OTT channels—everywhere kids tune in!... a Generation Alpha focused agency that delivers more than 8 BILLION views and 100 MILLION unique viewers every month!” (its emphasis). Also available is a “custom content and integrations” feature that can “create unique brand experiences with top-tier kid stars.” Ryan’s success is not unique, as more and more marketers create platforms and content, as well as merge companies, to deliver ads and marketing to children and teens. An array of influencer marketing platforms that offer “one-stop” shopping for brands to employ influencers, including through the use of programmatic marketing-like data practices (to hire people to place endorsements, for example), is a core feature of the influencer economy. There are also software programs so brands and marketers can automate their social influencer operations, as well as social media “dashboards” that help track and analyze social online conversations, brand mentions and other communications. The impact of influencers is being measured through a variety of services, including neuromarketing. Influencers are playing a key role in “social commerce,” where they promote the real-time sales of products and services on “shoppable media.” U.S. social commerce sales are predicted to grow to almost $80 billion in 2025 from their 2022 estimated total of $45.74 billion. Google, Meta, TikTok, Amazon/Twitch and Snapchat all have significant influencer marketing operations.
As Meta/Facebook recently documented, there is also a growing role for “virtual” influencers that are unleashed to promote products and services. While there may be claims that many promotions and endorsements should be classified as “user generated content” (UGC), we believe the commission will find that the myriad influencer marketing techniques often play a role in spurring such product promotion. The “Metaverse”: The same forces of digital marketing that have shaped today’s online experience for young people are already at work organizing the structure of the “metaverse.” There are virtual brand placements, advertisements, and industry initiatives on ad formats and marketing experiences. Building on work done for gaming and esports, this rapidly emerging marketing environment poses additional threats to young people and requires timely commission intervention. Global Standards: Young people in the U.S. have fewer protections than they do in other countries and regions, including the European Union and the United Kingdom. In the EU, for example, protections are required for young people until they are 18 years of age. The impact of the GDPR, the UK’s Design Code, the forthcoming Digital Services Act (and even some self-regulatory EU initiatives by companies such as Google) should be assessed. In what ways do U.S.-based platforms and companies provide higher or more thorough safeguards for children when they are required to do so outside of this country? The FTC has a unique role to ensure that U.S. companies operating online are in the forefront—not in the rear—of protecting the privacy and interests of children. The October Workshop: Our review of the youth marketing landscape is just a partial snapshot of the marketplace. We have not discussed “apps” and mobile devices, which pose many concerns, including those related to location, for example.
But CDD hopes this comment will help inform the commission about the operations of contemporary marketing and its relationship to young people. We call on the FTC to ensure that this October, we are presented with an informed and candid discussion of the nature and impact of today’s marketing system on America’s youth.

ftcyouthmarketing071822.pdf
    Jeff Chester