CDD Filings
-
Comments submitted by Fairplay, the Center for Digital Democracy, and the American Academy of Pediatrics in response to the COPPA Notice of Proposed Rulemaking issued in December 2023 by the Federal Trade Commission
-
Advocates File Amicus in Support of the California Age-Appropriate Design Code Act
Groups explain to court in NetChoice case the ways commercial surveillance marketers track & target kids
Today a coalition of groups and individuals filed an amicus brief in support of the CAADCA, including Fairplay Inc., the Center for Digital Democracy, Common Sense, 5Rights Foundation, Children’s Advocacy Institute, Accountable Tech, Beyond the Screen, Children & Screens, Design It For Us, The Tyler Clementi Foundation, the Becca Schmill Foundation, Arturo Béjar, and Frances Haugen. -
September 18, 2023
Comment on the 2023 Merger Guidelines
Center for Digital Democracy
FTC-2023-0043
The Center for Digital Democracy (CDD) urges the U.S. Department of Justice (DoJ) and the Federal Trade Commission (FTC) to adopt the proposed merger guidelines. The guidelines are absolutely necessary to ensure the U.S. operates a 21st century antitrust regime and doesn’t keep repeating the mistakes of the last several decades. The failure to understand and address contemporary practices, especially those related to data assets, has brought further consolidation in key markets, including digital media.
Over the decades, CDD has been at the forefront of NGOs sounding the alarm on the consolidation of the digital marketing and advertising industry, including our opposition to such transactions as the Google/DoubleClick merger, Facebook/Instagram, Google/YouTube, Google/AdMob, and Oracle/BlueKai and Datalogix, among others. Regulatory approval for these deals has accelerated the consolidation of the online media marketplace, where a tiny handful of companies—Alphabet (Google), Meta and Amazon—dominate in terms of advertising revenues and online marketing applications. It has also helped deliver today’s vast commercial surveillance marketplace, with its unrelenting collection and use of information from consumers, small businesses and other potential competitors.
The failure to address effectively the role that data assets and processing capabilities play in merger transactions has had unfortunate systemic consequences for the U.S. public. Privacy has been largely lost as a result, since by permitting these data-related deals both agencies signaled that policymakers approved unfettered data-driven commercial surveillance operations. It has also led the largest commercial entities and brands, across all market verticals, to adopt the “Big Data” and personalized digital marketing applications developed by Google, Meta and Amazon—furthering the commercial surveillance stranglehold and helping fuel platform dominance. It has likewise had a profound and unfortunate impact on the structure of contemporary media, which have embraced the data-driven commercial surveillance paradigm with all its manipulative and discriminatory effects. In this regard, the failure to ensure meaningful antitrust policies has had consequences for the health of our democracy as well.
The proposed guidelines should help regulators better address specific transactions, their implications for specific markets, and the wider “network effects” that such digitally connected mergers trigger. An overall guideline for antitrust authorities should be an examination of the data assets assembled by each entity. Over the last half-decade or so, nearly every major company—regardless of the “vertical” market served—has become a “big data” company, using both internal and external assets to leverage a range of data and decision intelligence designed to gather, process and generate “actionable” insights. Such affordances are regularly used for product development, supply, and marketing, among other uses. Artificial intelligence and machine learning applications are also “baked in” to these processes, extending the affordances across multiple operations.
Antitrust regulators should inventory the data and digital assets of each proposed transaction entity, including data partnerships that extend capabilities; analyze them in terms of specific market capabilities and industry-wide standards; and review how a given combination might further anti-competitive effects (especially through leveraging data assets via cloud computing and other techniques). As markets further converge in the digital era, where, for example, data-driven marketing operations affect multiple sectors, we suggest that regulators will need to be both creative and flexible in addressing potential harms arising from cross-sectoral impacts. This point relates to Guideline 10 and “multi-sided” platforms.
Regarding Guideline 3, we urge the agencies to review how Alphabet/Google and Meta especially, as a result of prior merger approvals, have been able to determine how the broader online marketplace operates—creating a form of “coordination” problem. The advertising and data techniques developed by the two companies have had an inordinate influence over the development of online practices generally, in essence “dictating” formats, affordances, and market structures. By allowing Alphabet and Meta to grow unchecked, antitrust regulators have allowed the dog to wag the “long tail” of the digital marketplace.
We also want to raise the issue of partnerships, since they are a very significant feature of the online market today. In addition to consolidation through acquisitions, companies have assembled a range of data and marketing partners who provide significant resources to these entities. This leveraging of the market through affiliates undermines competition, as well as compounding related issues involving privacy and consumer protection.
The steady stream of acquisitions in rapidly evolving markets, such as “over-the-top” streaming video, further entrenches dominant players and creates new hurdles for potential competitors; it raises the issue addressed in Guideline 8. In digitally connected markets, such as media, acquisitions that clearly further consolidation occur almost daily. Today they go unchecked, something we hope will be reversed under the proposed paradigm.
Each proposed guideline is essential, in our view, to ensure that relevant information gathering and analysis are conducted for each proposed transaction. We are at a critical period of transition for markets, as data, digital media and technological progress (AI especially) continue to challenge traditional perspectives on dominance and competition. Broader network effects regarding privacy, consumer protection and the impact on democratic institutions should also be addressed by regulators moving forward. The proposed DoJ and FTC merger guidelines will provide critical guidance for the antitrust work to come.
-
In comments to the Federal Trade Commission, EPIC, the Center for Digital Democracy, and Fairplay urged the FTC to center privacy and data security risks as it evaluates Yoti Inc.’s proposed face-scanning tool for obtaining verifiable parental consent under the Children’s Online Privacy Protection Act (COPPA).
In a supplementary filing, CDD urges the Federal Trade Commission (FTC) to reject the parent-consent method proposed by the applicants, the Entertainment Software Rating Board (ESRB) and Epic Games’ SuperAwesome division. Before any decision, the FTC must engage in due diligence and investigate the contemporary issues involving the role and use of facial coding technology and its potential impact on children’s privacy. The commission must have a robust understanding of the data flows and insight generation produced by facial coding technologies, including the debate over their role as a key source of “attention” metrics, which are a core advertising measurement modality. Since this proposal is designed to deliver a significant expansion of children’s data collection—given the constellation of brands, advertisers and publishers involved with the applicants and their child-directed market focus—a digital “cautionary” principle on this consent method is especially required here. Moreover, one of the applicants and several key affiliates of the ESRB—Epic Games, Amazon, and Microsoft—have recently been sanctioned for violating COPPA, and any approval in the absence of thorough fact-finding would be premature.
-
CDD tells FTC to apply strong data privacy and security rules for health data
Filing also focuses on role commercial surveillance marketers play targeting physicians and patients
The Center for Digital Democracy (CDD) endorses the Federal Trade Commission’s (FTC) proposal to better protect health consumer and patient information in the digital era. CDD warned the commission in 2010, as well as in its 2022 commercial surveillance comments, that health data—including information regarding serious medical conditions—are routinely (and cynically) gathered and used for online marketing. This has placed Americans at risk—for loss of their privacy, health-decision autonomy, and personal financial security. The commercial surveillance health data digital marketing system also triggers major strains on the fiscal well-being of federal and private health insurance systems, creating demand for products and services that can be unnecessary and costly.
The commission should “turn off the tap” of data flooding the commercial surveillance marketplace, including both direct and inferred health information. The commission can systemically address the multiple data flows—including those on Electronic Health Record (EHR) systems—that require a series of controls. EHR and personal health record systems have served as a digital “Achilles’ heel” of patient privacy, with numerous commercial entities seizing on them to influence physicians and other prescribers, as well as to gain insights used for ongoing tracking.
The commercialization of health-connected data is ubiquitous, harvested from mobile “apps,” online accounts, loyalty programs, social media posts, data brokers, marketing clouds and elsewhere. Given the commercial data analytic affordances operational today, information gathered for other purposes can be used to generate health-related data. Health information can be combined with numerous other datasets that can reveal ethnicity, location, media use, etc., to create a robust target marketing profile. As the programmatic advertising trade publication AdExchanger recently noted, “sensitive health data can be collected or revealed through dozens of noncovered entities, from location data providers to retail media companies. And these companies aren’t prevented from sharing data, unless the data was sourced from a covered entity.”
The FTC’s Health Breach Notification Rule (HBNR) proposal comes at an especially crucial time for health privacy in the U.S. A recent report on “The State of Patient Privacy,” as noted by Insider Intelligence/eMarketer in July 2023, shows that a majority of Americans “distrust” the role that “Big Tech Companies” play with their health data. A majority of patients surveyed explained that “they are worried about security and privacy protections offered by vendors that handle their health data.” Ninety-five percent of the patients in the survey “expressed concern about the possibility of data breaches affecting their medical records.” These concerns, we suggest, reflect consumer unease regarding their reliance on online media to obtain health information. For example, “half of US consumers use at least one health monitoring tool,” and “healthcare journeys often start online,” according to the “Digital Healthcare Consumer 2023” report.
There is also a generational shift underway in the U.S., where at least half of young adults (so-called Generation Z) now “turn to social media platforms for health-related purposes either all the time or often…via searches, hashtags, QR codes…[and] have the highest rate of mobile health app usage.” The COVID-19 pandemic triggered greater use of health-related apps by consumers. So-called “telehealth” services generate additional data as well, including for online “lead generation.” The growing use of “digital pharmacies” is being attributed to the rising costs of medications—another point where consumer health data is gathered.
The FTC should ensure the health data privacy of Americans who may be especially vulnerable—such as those confronting financial constraints or pre-existing or at-risk conditions, or those who have long been subjected to predatory and discriminatory marketing practices—and who are especially in need of stronger protections. These should include addressing the health-data-related operations of the growing phalanx of retail, grocery, “dollar,” and drug store chains that are expanding their commercial surveillance marketing operations (so-called “retail media”) while providing direct-to-consumer health services.
Electronic Health Record systems are a key part of the health and commercial surveillance infrastructure: EHRs have long served as “prime real estate for marketers…[via] data collection, which makes advanced targeting a built-in benefit of EHR marketing.” EHRs are used to influence doctors and other prescribers through what is euphemistically called point-of-care marketing. Marketing services for pharmaceutical and other life science companies can be “contextually integrated into the EHR workflow [delivered] to the right provider at the right time within their EHR [using] awareness messaging targeted on de-identified real-time data specific to the patient encounter.” Such applications are claimed to operate as “ONC-certified and HIPAA-compliant” (ONC is the Office of the National Coordinator for Health Information Technology at HHS).
The various, largely unaccountable, methods used to target and influence how physicians treat their patients by utilizing EHRs raise numerous privacy and consumer protection issues. For example, “EHR ads can appear in several places at all the stages along the point-of-care journey,” one company explained. Through an “E-Prescribing Screen,” pharma companies are able to offer “co-pay coupons, patient savings offers and relevant condition brand messaging.” Data used to target physicians through EHR systems, including prescription information derived from a consumer, help trigger the collection of still more information from and about that health consumer (consider the subsequent role of drug stores, search engines and social media use, the gathering of data for coupons, etc.). This “non-virtuous” circle of health surveillance should be subjected to meaningful health data breach and security safeguards. Patient records on EHRs must be safeguarded, and the methods used to influence healthcare professionals require major privacy reforms.
Contemporary health data systems reflect the structures that comprise the overall commercial surveillance apparatus, including data brokers, marketing clouds and AI. The use of digital marketing to target U.S. health consumers has long been a key “vertical” for advertisers. For example, there are numerous health-focused subsidiaries run by the leading global advertising agencies, all of which have extensive data-gathering and targeting capabilities.
These include Publicis Health: “Our proprietary data and analytics community, paired with the unsurpassed strengths of Sapient and Epsilon allow us to deliver unmatched deterministic, behavioral, and transactional data, powered by AI.” IPG Health uses “a proprietary…media, tech and data engine [to] deliver personalized omnichannel experiences across touchpoints.” Its “comprehensive data stack [is] powered by Acxiom.” Ogilvy Health recently identified some of the key social media strategies used by pharmaceutical firms to generate consumer engagement with their brands—helping generate invaluable data. They include, for example, a “mobile-first creative and design approach,” including the use of “stickers, reels, filters, and subtitles” on Instagram, as well as “A/B testing” on Facebook and the use of “influencers.”
A broad range of consumer-data-collecting partners also operates in this market, providing information and marketing facilitation. Google, Meta, Salesforce, IQVIA, and Adobe are just a few of the companies integrated into health marketing services designed to “activate customer journeys (healthcare professionals and patients) across physical and digital channels [using] real-time, unified data.” Machine learning and AI are increasingly embedded in the health data surveillance market, helping to “transform sales and marketing outcomes,” for example. The use of social media, AI and machine learning, including for personalization, raises concerns that consent alone is insufficient for the release of patient and consumer health information.
The commission should adopt its proposed rule, but also address the system-wide affordances of commercial surveillance to ensure health data is truly protected in terms of privacy and security. The commission should endorse a definition of patient health record information that reflects not only the range and type of data collected, but also the processes used to gather or generate it. The prompting and inducement of physicians, for example, to prescribe specific medications or treatments to a patient, based on the real-time “point-of-care” information transmitted through EHRs, ultimately generate identifiable information. So any interaction and iterative process used to do so should be covered under the rule, reflecting all the elements involved in that decision-making and treatment-determination process.
By ensuring that all the entities involved in this system—including health care services or suppliers—must comply with data privacy and security rules, the commission will critically advance data protection in the health marketplace. This should include health apps, which increasingly play a key role in the commercial data-driven marketing complex. All partnering organizations involved in the sharing, delivery, creation and facilitation of health record information should also be held accountable.
We applaud the FTC’s work in the health data privacy area, including its important GoodRx case and its highlighting of the role that “dark patterns” play in “manipulating or deceiving consumers.” Far too much of the U.S. health data landscape operates as such a “dark pattern.” The commission’s proposed HBNR rules will illuminate this sector and, in the process, help secure greater privacy and protection for Americans. -
CFPB Data Broker Filing
U.S. Public Interest Research Group (PIRG) and Center for Digital Democracy (CDD)
In response to the Request for Information Regarding Data Brokers and Other Business Practices Involving the Collection and Sale of Consumer Information, Docket No. CFPB-2023-0020
-
Consumer financial safeguards for online payments needed, says U.S. PIRG & CDD
Big Tech Payment Platforms
Supplemental Comments of USPIRG and the Center for Digital Democracy
CFPB-2021-0017
December 7, 2022
United States Public Interest Research Group (USPIRG) and the Center for Digital Democracy (CDD) submit these additional comments to further inform the Bureau’s inquiry. They amplify the comments USPIRG and CDD submitted last year.[1] We believe that “Big Tech”-operated digital payment platforms have evolved significantly since we filed our original comment, underscoring the need for the Bureau to institute much needed consumer protection safeguards.
We had described how online platform-based payment services seamlessly incorporate the key elements of “commerce” today—including content, promotion, marketing, sales and payment. We explained how these elements are part of the data-driven “surveillance” and personalized marketing system that operates as the central nervous system for nearly all U.S. online operations. We raised the growing role that “social media commerce” plays in contemporary payment platforms, supporting the Bureau’s examination of Big Tech platforms and consumer financial payment services. For example, U.S. retail social media commerce sales will generate $53 billion in 2022, rising to $107 billion by 2025, according to a recent report by Insider Intelligence/eMarketer. Younger Americans, so-called “Generation Z,” are helping drive this new market—an indicator of how changing consumer financial behaviors are being shaped by the business model and affordances of the Big Tech platforms, including TikTok, Meta and Google.[2]
In order to meaningfully respond to the additional questions raised by the Bureau in its re-opening of the comment period, in particular regarding how the payment platforms handle “complaints, disputes and errors” and whether they are “sufficiently staffed…to address consumer protection and provide responsible customer service,” USPIRG and CDD offer below some further analysis regarding the structural problems of contemporary platform payment systems.[3]
First, payment services such as those operated by Google, Meta, TikTok and others have inherent conflicts of interest. They are, as the Bureau knows, primarily advertising systems that are designed to capture the “engagement” of individuals and groups using a largely stealth array of online marketing applications (including, for example, extensive testing to identify ways to engage in subconscious “implicit” persuasion).[4] Our prior comment and those of other consumer groups have already documented the extensive use of data profiling, machine learning, cross-platform predictive analysis and “identity” capture, which are just a few of the current platform monetization tactics. The continually evolving set of tools available for digital platforms to target consumers has no limits—and raises critical questions when it comes to the financial security of US consumers.
The build-out of Big Tech payment platforms leveraging their unique capabilities to seamlessly combine social media, entertainment and commerce with sophisticated data-driven surveillance has transformed traditional financial services concepts. Today’s social media giants are also global consumer financial banking and retail institutions. For example, J.P. Morgan has “built a real-time payments infrastructure” for TikTok’s parent company ByteDance “that can be connected to local clearing systems.
This allows users, content producers, and influencers to be paid instantaneously and directly into their bank accounts at any day or time. ByteDance has enabled this capability in the U.S. and Europe, meaning it covers approximately one-fifth of TikTok’s 1 billion active users worldwide.”[5] J.P. Morgan also assisted ByteDance to replace its “host-to-host connectivity with banks, replacing it with application programming interfaces (API) connectivity that allows real-time exchange of data” between ByteDance and Morgan. This allows ByteDance to “track and trace the end-to-end status through the SWIFT network, see and monitor payments, and allow users to check for payments via their TikTok or other ByteDance apps in real time.” Morgan also has “elevated and further future-proofed ByteDance’s cash management through a centralized account structure covering all 15 businesses” through a “virtual account management and liquidity tool.”[6]
Google’s Pay operations also illustrate how distinct digital payment platforms are from previous forms of financial services. Google explains to merchants that by integrating “with Google Wallet [they can] engage with users through location-based notifications, real-time updates” and offers, including encouraging consumers to “add offers from your webpage or app directly to Google wallet.” Google promotes the use of “geofenced notifications to drive engagement” with its Pay and Wallet services as well. Google’s ability to leverage its geolocation and other granular tracking, and to make that information available to merchants through a package of surveillance and engagement tools to drive financial transactions in real time, is beyond the ability of a consumer to effectively address.
A further issue is the growing use of “personalization” technologies to make the financial services offering even more compelling. Google has already launched its “Spot” service to deliver “payment enabled experiences” for users, including “fully customized experiences” in Google Pay. Although currently available only in India and Singapore, Google’s Spot platform, which allows consumers with “a few simple taps…to search, review, choose and pay” for a product, is an example of how online payment services are continually advanced—and require independent review by consumer financial regulators. It also reflects another problem regarding protecting the financial well-being of US consumers. What are the impacts to financial security when there is no distance—no time to reflect—as seamless, machine- and socially-driven marketing and payment operations are at work?[7]
A good example of the lack of meaningful protections for online financial consumers is Google Pay’s use of what’s known as “discovery,” a popular digital marketing concept meaning to give enhanced prominence to a product or service. Here’s how Google describes how that concept works in its Spot-enabled Pay application: “We understand that discovery is where it starts, but building deep connections is what matters the most - a connection that doesn’t just end with a payment, but extends to effective post sale engagement. The Spot Platform helps merchants own this relationship by providing a conversational framework, so that order updates, offers, and recommendations can easily be surfaced to the customer.
This is powered by our Order API which is specialised to surface updates and relevant actions for users' purchases, and the Messaging API which can surface relevant messages post checkout to the user.”[8]
Meta (Facebook), along with ad giant WPP, also relies on the growing use of “discovery” applications to promote sales. In a recent report, they explain that “digital loyalty is driven by seamless shopping experiences, convenience, easy discovery, consistent availability, positive community endorsement and personal connections.”[9] Since Google and other payment platforms have relationships with dozens of financial institutions, and also have an array of different requirements for vendors and developers, USPIRG and CDD are concerned that consumers are placed at a serious disadvantage when it comes to protecting their interests and seeking redress for complaints. The chain of digital payment services relationships, including with partners that conduct their own powerful data-driven marketing systems, requires Bureau review. For example, PayPal is a partner with Google Pay, while the PayPal Commerce Platform has Salesforce as one of many partners.[10]
See also PIRG’s recent comments to the FTC for an extensive discussion of retail media networks and data clean rooms:[11] “Clean rooms are data platforms that allow companies to share first party data with one another without giving the other party full access to the underlying, user-level data. This ability to set controls on who has access to granular information about consumers is the primary reason that data clean rooms are able to subvert current privacy regulations.”
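To make the mechanism PIRG describes concrete, the following is a minimal, purely hypothetical sketch of the aggregate-only query model behind a data clean room; the class, function names and audience-size threshold are invented for illustration and do not reflect any vendor's actual implementation. Each party contributes records keyed by hashed identifiers, and the only output either side can obtain is an aggregate count that is suppressed when the matched audience is too small, so row-level data never changes hands.

# Hypothetical illustration only: a toy "clean room" where two parties share
# first-party data keyed by hashed identifiers and can retrieve nothing but
# aggregate counts above a minimum audience size.
import hashlib
from collections import defaultdict

MIN_AUDIENCE_SIZE = 50  # assumed suppression threshold; real systems vary


def hashed_id(email: str) -> str:
    """Normalize and hash an identifier so raw emails never enter the room."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()


class CleanRoom:
    def __init__(self) -> None:
        # hashed identifier -> {party name: that party's attributes}
        self._records = defaultdict(dict)

    def contribute(self, party: str, rows: list) -> None:
        """A party uploads its own first-party rows, keyed by hashed ID."""
        for row in rows:
            self._records[hashed_id(row["email"])][party] = row["attributes"]

    def overlap_count(self, party_a: str, party_b: str, predicate):
        """Return only an aggregate: how many users are known to both parties
        and match the predicate. Small audiences are suppressed (None)."""
        count = sum(
            1
            for parties in self._records.values()
            if party_a in parties and party_b in parties
            and predicate(parties[party_a], parties[party_b])
        )
        return count if count >= MIN_AUDIENCE_SIZE else None


# Example: a retailer and a payment platform measure shared customers who both
# bought groceries and used the wallet, without exchanging customer lists.
room = CleanRoom()
room.contribute("retailer", [{"email": "a@example.com", "attributes": {"bought_grocery": True}}])
room.contribute("payments", [{"email": "a@example.com", "attributes": {"used_wallet": True}}])
print(room.overlap_count("retailer", "payments",
                         lambda r, p: r.get("bought_grocery") and p.get("used_wallet")))
# Prints None: the matched audience is below the threshold, so nothing is revealed.

Even in this simplified form, the design choice is visible: the privacy guarantee depends entirely on the thresholds and query controls the operator chooses to set, which is consistent with PIRG’s point that clean rooms can be used to route around existing privacy rules rather than strengthen them.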
Another important issue for the Bureau is the ability of the Big Tech payment platforms to collect and analyze data in ways that allow them to identify unique means of influencing consumer spending behaviors. In a recent report, Chinese ecommerce platform Alibaba explained how such a system operates: “The strength of Alibaba’s platforms allows a birds-eye view of consumer preferences, which is combined with an ecosystem of tactical solutions, to enable merchants to engage directly and co-create with consumers and source suppliers to test, adapt, develop, and launch cutting-edge products…helps merchants identify new channels and strategies to tap into the Chinese market by using precise market analysis, real-time consumer insights, and product concept testing.”[12]
Such financial insights are part of what digital payment and platform services provide. PayPal, for example, gathers data on consumers as part of their “shopping journey.” In one case study for travel, PayPal explained that its campaign for Expedia involved pulling “together data-driven destination insights, creative messaging and strategic placements throughout the travel shoppers’ journey.” This included a “social media integration that drove users to a campaign landing page” powered by “data to win.” This reflects the growing use of what’s euphemistically called “first-party data” from consumers, where there has allegedly been permission to use it to target an individual.
Few consumers will ever review—or have the ability to influence—the PayPal engine that is designed for merchants to “shape [their] customer journey from acquisition to retention.” This includes applications that add “flexible payment options…right on product pages or through emails;” present a “relevant Pay Later offer to customers with dynamic messaging;” provide the ability to “increase average order value” through “proprietary payment methods;” or “propose rewards as a payment option to help inspire loyalty.”[13]
The impact of data-driven social commerce on promoting the use of consumer payments should be assessed. For example, Shopify’s “in-app shopping experience on TikTok” claims that the placement of its “shopping tabs” by vendors on posts, profiles and product catalogs unleashes “organic discovery.” This creates “a mini-storefront that links directly to their online store for checkout.” A TikTok executive explains how the use of today’s digital payment services is distinct—“rooted in discovery, connection, and entertainment, creating unparalleled opportunities for brands to capture consumers’ attention…that drives [them] directly to the digital point of purchase.”[14] TikTok has also partnered with Stripe, helping it “become much more integrated with the world of payments and fintech.”[15]
TikTok’s Square integration enables “sellers to send fans directly from TikTok videos, ads, and shopping tabs on their profiles to products available in their existing Square Online store, providing a streamlined shopping experience that retains the look and feel of their personal brand.”[16] The Square/TikTok payment alliance illustrates the role that data-driven commercial surveillance marketing plays in payment operations, such as the use of the “TikTok pixel” and “advanced matching.”[17] In China, ByteDance’s payment services reflect its growing ability to leverage its mass customer data capture for social media-driven marketing and financial services.[18]
We urge the Bureau to examine TikTok’s data and marketing practices as it transfers U.S. user information to servers in the U.S., the so-called “Project Texas,” to identify how “sensitive” data may be part of its financial services offerings.[19]
Apple’s payment services deserve further scrutiny as it reintroduces its role as a digital advertising network, leveraging its dominant position in the mobile and app markets.[20] PayPal recently announced that it will be “working with Apple to enhance offerings for PayPal and Venmo merchants and consumers.” Apple is also making its payment service available through additional vendors, including stores of the giant Kroger grocery chain in California.[21]
Amazon announced in October 2022 that Venmo was now an official payment service, where users could, during checkout, “select ‘Select a payment method’ and then ‘Add a Venmo account.’ This will redirect them to the Venmo app, where they can complete the authentication. Users can also choose Venmo to be their default payment method for Amazon purchases on that screen.”[22] Amazon’s AWS partners with fintech provider Plaid, another example of far-reaching partnerships restructuring the consumer financial services market.[23]
Conclusion
USPIRG and CDD hope that both our original comments and these additional comments help the Bureau to understand the impact of rapid changes in Big Tech’s payments network relationships and partnerships.
We believe urgent CFPB action is needed to protect consumers from the threat of Big Tech’s continued efforts to breach the important wall separating banking and commerce, and to ensure that all players in the financial marketplace follow all the rules. Please contact us with additional questions.
Sincerely yours,
Jeff Chester, Executive Director, Center for Digital Democracy
Edmund Mierzwinski, Senior Director, Federal Consumer Program, U.S. PIRG
[1] /comment/CFPB-2021-0017-0079
[2] /what-s-behind-social-commerce-surge-5-charts
[3] We also believe that the Bureau’s request for comments concerning potential abuse of terms of service and use of penalties merits discussion. We look forward to additional comments from others.
[4] /business/en-US/blog/mediascience-study-brands-memorable-tiktok; see Google, Meta, TikTok as well: https://www.neuronsinc.com/cases
[5] /content/dam/jpm/treasury-services/documents/case-study-bytedance.pdf
[6] /content/dam/jpm/treasury-services/documents/case-study-bytedance.pdf
[7] /about/business/checkout/; /pay/spot; /about/business/passes-and-rewards/
[8] /pay/spot
[9] /news/meta-publishes-new-report-on-the-importance-of-building-brand-loyalty-in-on/625603/
[10] See, for example, the numerous bank partners of Google in the US alone: /wallet/answer/12168634?hl=en. Also: /payments/apis-secure/u/0/get_legal_document?ldo=0&ldt=buyertos&ldr=us; /wallet/retail; /wallet/retail/offers/resources/terms-of-service; /us/webapps/mpp/google-pay-paypal; /products/commerce-cloud/overview/?cc=dwdcmain
[11] /wp-content/uploads/2022/11/PIRG-FTC-data-comment-no-petitions-Nov-2022.pdf
[12] /article/how-merchants-can-use-consumer-insights-from-alibaba-to-power-product-development/482374
[13] /us/brc/article/enterprise-solutions-expedia-case-study; /us/brc/article/enterprise-solutions-acquire-and-retain-customers
[14] /scaling-social-commerce-shopify-introduces-new-in-app-shopping-experiences-on-tiktok#
[15] /financial-services-finserv/tiktok-partners-fintech-firm-stripe-tips-payments
[16] /us/en/press/square-x-tiktok
[17] /help/us/en/article/7653-connect-square-online-with-tiktok; /help/article/data-sharing-tiktok-pixel-partners
[18] /video/douyin-chinas-version-tiktok-charge-093000931.html; /2021/01/19/tiktok-owner-bytedance-launches-mobile-payments-in-china-.html
[19] /a/202211/16/WS6374c81ea31049175432a1d8.html
[20] /news/newsletters/2022-08-14/apple-aapl-set-to-expand-advertising-bringing-ads-to-maps-tv-and-books-apps-l6tdqqmg?sref=QDmhoVl8
[21] /231198771/files/doc_financials/2022/q3/PYPL-Q3-22-Earnings-Release.pdf; /2022/11/08/ralphs-begins-accepting-apple-pay/
[22] /2022/10/25/amazon-now-allows-customers-to-make-payments-through-venmo/
[23] /blogs/apn/how-to-build-a-fintech-app-on-aws-using-the-plaid-api/
pirg_cdd_cfpb_comments_7dec2022.pdf
Jeff Chester
-
Coalition of child advocacy, health, safety, privacy and consumer organizations document how data-driven marketing undermines privacy and welfare of young people
Children and teenagers are subjected to widespread commercial surveillance practices that collect data used to target them with marketing. Targeted and personalized advertising remains the dominant business model for digital media, with the marketing and advertising industry identifying children and teens as a prime target. Minors are relentlessly pursued while, simultaneously, they are spending more time online than ever before. Children’s lives are filled with surveillance, involving the collection of vast amounts of personal data about online users. This surveillance, informed by behavioral science and maximized by evolving technologies, allows platforms and marketers to profile and manipulate children.
The prevalence of surveillance advertising and targeted marketing aimed at minors is unfair in violation of Section 5 of the FTC Act. Specifically, data-driven marketing and targeted advertising cause substantial harm to children and teens by:
violating their privacy;
manipulating them into being interested in harmful products;
undermining their autonomy; and
perpetuating discrimination and bias.
Additionally, the design choices tech companies use to optimize engagement and data collection in order to target marketing to minors further harm children and teens. These harms include undermining their physical and mental wellbeing and increasing the risk of problematic internet use. These harms cannot reasonably be avoided by minors or their families, and there are no countervailing benefits to consumers or competition that outweigh them.
Surveillance advertising is also deceptive to children, as defined by the Federal Trade Commission. The representations made about surveillance advertising by adtech companies, social media companies, apps, and games are likely to mislead minors and their parents and guardians. These misrepresentations and omissions are material. Many companies also mislead minors and their guardians by omission because they fail to disclose important information about their practices. These practices impact the choices of minors and their families every day as they use websites, apps, and services without an understanding of the complex system of data collection, retention, and sharing that is used to influence them online. We therefore urge the Commission to promulgate a rule that prohibits targeted marketing to children and teenagers.
Groups filing the comment included the Center for Digital Democracy, Fairplay, #HalfTheStory, the American Academy of Pediatrics, the Becca Schmill Foundation, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Federation of America, Consumer Federation of California, CUNY Urban Food Policy Institute, Eating Disorders Coalition for Research, Policy & Action, Enough is Enough, LookUp.live, Lynn’s Warriors, National Eating Disorders Association, Parents Television and Media Council, ParentsTogether, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Public Citizen, and the UConn Rudd Center for Food Policy & Health.
Fairplay’s executive director Josh Golin said: "Big Tech's commercial surveillance business model undermines young people's wellbeing and development. It causes kids and teens to spend excessive time online, and exposes them to harmful content and advertising targeted to their vulnerabilities.
The FTC must adopt a series of safeguards to allow vulnerable youth to play, learn, and socialize online without being manipulated or harmed. Most importantly, the Commission should prohibit data-driven advertising and marketing to children and teens, and make clear that Silicon Valley profits cannot come at the expense of young people's wellbeing.”
CDD’s Jeff Chester underscored this, saying: "Children and teens are key commercial targets of today’s data-driven surveillance complex. Their lives are tethered to a far-reaching system that is specifically designed to influence how they spend their time and money online, and uses artificial intelligence, virtual reality, geo-tracking, neuromarketing and more to do so. In addition to the loss of privacy, surveillance marketing threatens their well-being, health and safety. It’s time for the Federal Trade Commission to enact safeguards that protect young people."
[full filing attached]
-
FTC Commercial Surveillance Filing from CDD focuses on how pharma & other health marketers target consumers, patients, prescribers
“Acute Myeloid Lymphoma,” “ADHD,” “Brain Cancer,” “High Cholesterol,” “Lung Cancer,” “Overweight,” “Pregnancy,” “Rheumatoid Arthritis,” “Stroke,” and “Thyroid Cancer.” These are just a handful of the digitally targetable medical condition “audience segments” available to surveillance advertisers. While health and medical condition marketers—including pharmaceutical companies and drug store chains—may claim that such commercial data-driven marketing is “privacy-compliant,” in truth it reveals how vulnerable U.S. consumers are to having some of their most personal and sensitive data gathered, analyzed, and used for targeted digital advertising. It also shows how the latest tactics leveraging data to track and target the public—including “identity graphs,” artificial intelligence, surveilling connected or smart TV devices, and a focus on so-called permission-based “first-party data”—are now broadly deployed by advertisers, including pharma and medical marketers. Behind the use of these serious medical condition “segments” is a far-reaching commercial surveillance complex including giant platforms, retailers, “Adtech” firms, data brokers, marketing and “experience” clouds, device manufacturers (e.g., streaming), neuromarketing and consumer research testing entities, “identity” curation specialists and advertisers...
We submit the treatment of medical condition and health data as representative of today’s commercial surveillance complex. It incorporates many of the features that can help answer the questions the commission has posed. There is widespread data gathering on individuals and communities, across their devices and applications; techniques to solicit information are intrusive, non-transparent, and beyond meaningful consumer control; and these methods come at a cost to a person’s privacy and pocketbook, and potentially have significant consequences for their welfare. There are also societal impacts here, for the country’s public health infrastructure as well as for the expenditures the government must make to cover the costs of prescription drugs and other medical services...
Health and pharma marketers have adopted the latest data-driven surveillance-marketing tactics—including targeting on all of a consumer’s devices (which today also includes streaming video delivered by Smart TVs); the integration of actual consumer purchase data for more robust targeting profiles; leveraging programmatic ad platforms; working with a myriad of data marketing partners; using machine learning to generate insights for granular consumer targeting; conducting robust measurement to help refine subsequent re-targeting; and taking advantage of new ways to identify and reach individuals—such as “Identity Graphs”—across devices.
[complete filing for the FTC's Commercial Surveillance rulemaking attached]
cddsurveillancehealthftc112122.pdf
Jeff Chester