
The Committee for Justice
3033 Wilson Blvd., Ste. 729
Arlington, VA 22201
(202) 270-7748
committeeforjustice.org
@CmteForJustice

FEDERAL TRADE COMMISSION
OFFICE OF THE SECRETARY
600 PENNSYLVANIA AVENUE, NW
SUITE CC-5610 (ANNEX B)
WASHINGTON, DC 20580

Re: Trade Regulation Rule on Commercial Surveillance and Data Security ANPR, R111004
DOCKET ID: FTC-2022-17752
SUBMITTED: NOVEMBER 20, 2022

ASHLEY BAKER
DIRECTOR OF PUBLIC POLICY
THE COMMITTEE FOR JUSTICE

Chair Khan and Commissioners:

The Committee for Justice (CFJ) submits this comment in response to the Federal Trade
Commission’s (FTC) Advance Notice of Proposed Rulemaking (ANPR), “Trade Regulation Rule
on Commercial Surveillance and Data Security” (R111004).

INTRODUCTION & INTEREST STATEMENT


Founded in 2002, the Committee for Justice is a nonprofit legal and policy organization that
promotes and educates the public and policymakers about the rule of law and the benefits of
constitutionally limited government. Consistent with this mission, CFJ advocates in Congress, the
courts, and the news media about a variety of law and technology issues, encompassing
administrative law and regulatory reform, free speech, data privacy, and antitrust law.

CFJ has actively advocated for digital privacy protections in Congress, the federal courts, and the
Supreme Court.1 Today, our focus is on proper administrative process and regulatory impacts on

1 See, e.g., amicus briefs filed in Carpenter v. United States (August 2017),
https://www.scribd.com/document/356288790/Amicus-Brief-Filed-in-Carpenter-v-United-States and
United States v. Kolsuz (March 2017), https://www.scribd.com/document/355249553/United-States-
vKolsuz-Amucis-Brief; Letter to Congress in support of the CLOUD Act (March 2018),
https://www.scribd.com/document/371541902/ClarifyingLawful-Overseas-Use-of-Data-CLOUD-Act-
of2018; Levey, C. and Baker, A. (Sept. 2018). Letter to the Senate Judiciary Committee on the
Nomination of Brett Kavanaugh. Retrieved from bit.ly/CFJ-Letter; Baker, A. (March 2019). Justice
Gorsuch, Carpenter, and the Fourth Amendment. The Federalist Society. [Video file]. Retrieved from

innovation and economic growth. We believe that the 95 questions posed by the Commission in
this ANPRM raise numerous major questions that should be left to Congress to address.

The questions presented for comment also indicate that the Commission intends to pursue
regulations that would threaten American innovation and the online ecosystem that has
transformed our daily lives in recent decades. While enforcement against violations of consumer
data privacy and data security is necessary, a world with less information collection, sharing,
aggregation, and use “is not only a world of greater information scarcity, but of less consumer
welfare.”2

RESPONSES TO QUESTIONS
Question 12: How, if at all, should potential new trade regulation rules address harms
to different consumers across different sectors? . . . To what extent, if any, is a
comprehensive regulatory approach better than a sectoral one for any given harm?

The Commission should avoid a broad, one-size-fits-all rule for all firms that handle data and
should also avoid making rules that apply by sector.3 The Commission should consider
that smaller companies, especially start-ups, will be disproportionately affected by any regulations
because they do not have the same resources to devote to the inevitable compliance costs.4

Question 24: The Commission invites comment on the relative costs and benefits of
any current practice, as well as those for any responsive regulation. How should the
Commission engage in this balancing in the context of commercial surveillance and
data security? Which variables or outcomes should it consider in such an accounting?
Which variables or outcomes are salient but hard to quantify as a material cost or
benefit? How should the Commission ensure adequate weight is given to costs and
benefits that are hard to quantify?

We would like to offer the following general observations regarding privacy policy and the
economics of data collection rules:

http://bit.ly/2IpDNJf; Baker, A. (2019, March). New Technology, Same Principles: The Supreme Court and
Tech. American Action Forum. Retrieved from http://bit.ly/SCOTUStech.
2 Ashley Baker, RE: Hearings on Competition and Consumer Protection in the 21st Century:
Consumer Privacy Pre-Hearing Comments, COMMITTEE FOR JUSTICE (Dec. 21, 2018),
https://www.scribd.com/document/396164780/Committee-for-Justice-Comments-to-the-FTC-on-
Competition-and-Consumer-Protection-in-the-21st-Century.
3 See, e.g., Ashley Baker, Privacy Policy and the Economics of Data Collection Rules, COMMITTEE
FOR JUSTICE (Jan. 7, 2019), https://www.committeeforjustice.org/single-post/economics-of-privacy-policy
(noting that “[b]road, one-size-fits-all privacy rules would have negative consequences for every sector
that makes use of data and the ripple effect would be felt across the entire economy”).
4 See id. (“Unlike their resource-lean startup counterparts, large companies are far better situated

to devote labor costs and time to addressing the increased compliance costs necessitated by broad data
protection mandates.”).



I. Restrictions on data-driven marketing would harm consumers by causing the demise of
many of the online resources they rely on. In recent decades, consumers’ personal and
professional lives have been transformed for the better by a vast collection of data-driven
resources that are subsidized by advertising and made available at no cost. Policies must
strike a balance between realistic consumer privacy preferences and access to
information.5

Exhibit A: The data-driven marketing economy (DDME) is a significant growth sector. The ability
to collect and share data with third parties has allowed businesses to grow. Restrictions on the
use of consumer data would stifle the economic growth created by data-driven marketing.

Source: Data-Driven Marketing Institute

II. Broad, one-size-fits-all privacy rules would have negative consequences for every sector
that makes use of data and the ripple effect would be felt across the entire economy.6
These impacts are already being felt in Europe as a result of the EU’s implementation of
the General Data Protection Regulation (GDPR) in May 2018.7 An earlier report commissioned
by the U.S. Chamber of Commerce argues that the negative impact on the EU GDP could

5 These resources are an engine of economic growth, even when other sectors experience

difficult economic times. Data-driven marketing is estimated to have added more than $200 billion to the
U.S. economy in 2014, a 35% increase over just two years earlier. (John Deighton and Peter Johnson,
“The Value of Data 2015: Consequences for Insight, Innovation and Efficiency in the U.S. Economy.”
Data & Marketing Association. Dec. 2015, https://thedma.org/wp-content/uploads/Value-of-Data-
Summary.pdf.)
6 Data minimization and purpose-limitation mandates make it far more difficult to transmit

information between firms, industries, and national borders. (See, e.g., Sarah Wheaton, "5 BIG Reasons
Europe Sucks at Curing Cancer," Politico, 12 Oct. 2018, https://www.politico.eu/article/cancer-5-big-
reasons-europe-sucks-at-curing/.). The GDPR, for example, would have made it impossible for the Danish
Cancer Society to conduct the study that helped dispel the myth of a correlation between mobile cellular
phone use and cancer. (See Patrizia Frei et al., "Use of Mobile Phones and Risk of Brain Tumours: Update
of Danish Cohort Study,” BMJ, 20 Oct. 2011,
https://www.cancer.dk/dyn/resources/File/file/9/1859/1385432841/1_bmj_2011_pdf.pdf.)
7 Writing at the American Enterprise Institute months after GDPR’s implementation, Daniel Lyons

notes the effects such a rule would be likely to have on diminished consumer welfare: "The chilling effect
on digital products available to European consumers could be significant. Even if companies are not
actively marketing to European residents, they may have European visitors interacting with their
webpage, taking advantage of marketing offers, or subscribing to newsletters. If these interactions result
in retention of personally identifiable information, the company is subject to the GDPR. The ease with
which a company may find itself bound, coupled with the cost of compliance and potentially draconian
penalties for violation, creates strong incentives for companies to withdraw -- aggressively -- from
European markets." (Daniel Lyons, “GDPR: Privacy as Europe’s tariff by other means?,” American
Enterprise Institute, 3 July 2018, https://www.aei.org/publication/gdpr-privacy-as-europes-tariff-by-other-
means/.)



reach -0.8% to -1.3%. The end result would be a direct negative welfare effect on four-
person households of about $1,353 per year.8

Exhibit B: Cross-border data flows have a multiplier effect on productivity and growth. To
maintain interoperability in light of new privacy regulations, the EU has identified
countries with "adequate" privacy protections. Combined, these "adequate" countries
represent less than 6% of global services trade.

Source: IMF Study cited by the U.S. Chamber of Commerce

III. Data protection policies such as “opt-in rules” could deter venture capital investments
and strangle U.S.-based tech start-ups. These effects are not merely hypothetical. For
example, following the implementation of the opt-in model mandated in the EU’s Privacy
and Electronic Communications Directive (2002/58/EC), online ads became 65 percent
less effective.9 This is also one of the reasons for the absence of tech startups in

8 The report continues, "EU services exports to the United States drop by -6.7% due to loss of
competitiveness. As goods exports are highly dependent on efficient provision of services (up to 30% of
manufacturing input values come from services), EU manufacturing exports to the United States could
decrease by up to -11%, depending on the industry. In such case, the export benefits produced by the
EU-U.S. FTA are eradicated by a good margin." (Matthias Bauer et al., “The Economic Importance of
Getting Data Protection Right: Protecting Privacy, Transmitting Data, Moving Commerce,” European
Centre for International Political Economy, report commissioned by the U.S. Chamber of Commerce, Mar.
2013, p. 3,
https://www.uschamber.com/sites/default/files/documents/files/020508_EconomicImportance_Final_Revis
ed_lr.pdf.)
9 Alan McQuinn. "The Economics of 'Opt-Out' Versus 'Opt-In' Privacy Rules." Information

Technology and Innovation Foundation. 6 Oct. 2017, https://itif.org/publications/2017/10/06/economics-


opt-out-versus-opt-in-privacy-rules.



Europe.10 The inability to generate online revenue and to develop new products forms a
roadblock for venture capital investments. Opt-in policies are also illogical since the
knowledge that privacy settings can be changed acts as a form of affirmative consent.

IV. When faced with compliance and financial burdens, new technology companies—and the
tax revenue and job creation they produce—tend to move to favorable regulatory
environments. Since technology, by nature, cannot be confined within state borders, these
companies are more likely to choose to operate outside of the United States. Policymakers
should pay particular attention to proposed state regulations that threaten to strangle new
businesses and technologies with contradictory laws and enforcement.

V. Public debate is disproportionately focused on large companies, but the vast majority of
Internet companies are smaller firms, including the very companies that might
otherwise grow to compete with and even supplant the tech giants of today. Sweeping ex
ante regulatory approaches like the GDPR, and the recently-passed California Consumer
Privacy Act (CCPA), are likely to create an artificial imbalance in the competitive
ecosystem in which many firms operate.11 Unlike their resource-lean startup counterparts,
large companies are far better situated to devote labor costs and time to addressing the
increased compliance costs necessitated by broad data protection mandates such as the
GDPR. This imbalance is likely to result in anticompetitive lock-in effects for incumbent
firms.

Exhibit C: Companies expected to face significant challenges with investments, compliance,
vendor relations, reporting, and budgeting after the implementation of the GDPR. Notably, 9%
identified "change or close operations in Europe" as an area requiring significant effort.

10 Mark Scott. "For Tech Start-Ups in Europe, an Oceanic Divide in Funding." The New York
Times. 19 January 2018. https://www.nytimes.com/2015/02/14/technology/for-tech-start-ups-in-europe-
an-oceanic-divide-in-funding.html.
11 A recent report from Columbia University explained, "As of 2017, Google and Facebook claim
seventy-seven cents of every dollar spent on digital advertising in the United States, with no other single
company claiming even as much as three percent of the total market share. While the GDPR may hinder
some of these companies’ data collection and/or sharing activities, the regulation may well squeeze smaller
advertising networks even more, potentially magnifying the dominance of this duopoly in online advertising.
These smaller ad networks, for example, typically lack the direct consumer relationships needed to secure
consent from users on their own behalf, but may also find that media publishers and other website hosts
are reluctant to ask for user consent for the broad range and volume of data that these advertisers can
presently access without hindrance. Without access to the data on which they currently rely, smaller
advertising networks may be simply cut out of the online market altogether unless they can find a way to
gain some advantage over the platforms in compliance, user-friendliness, or rates. In this environment,
platform companies and website hosts—such as media companies—that have a brand-name relationship
to their users are likely to have more success in persuading individuals to give up their information, and
therefore may have increased power in the advertising market under the GDPR." (Susan E. McGregor and
Hugo Zylberberg, “Understanding the General Data Protection Regulation: A Primer for Global Publishers,”
Tow Center for Digital Journalism at Columbia University (New York, NY: Mar. 2018), pp. 37-38,
https://doi.org/10.7916/D8K08GVB.)



Source: McDermott Will & Emery LLP and Ponemon Institute LLC

VI. Public opinion polls showing support for stronger data protections are misleading because
they rarely confront consumers with the monetary and other costs of their choices.12 A
2016 study found that, despite most participants’ unease with an email provider using
automated content analysis to provide more targeted advertisements, 65 percent of them
were unwilling to pay providers any amount for a privacy-protecting alternative.13 Such
studies remind us that most consumers do not value data privacy enough to pay anything
for it.

12 Privacy is just one component of a more complex bundle of values; consumer valuations are
contextual, made in the moment, and subject to change on a whim; and there is a general lack of
consensus regarding what constitutes the “boundaries” of ownership over individual data. As summarized
by Alec Stapp and Ryan Hagemann in comments to the NTIA, "Most research and behavioral studies
conclude that privacy is highly context-dependent. Privacy valuations are subject to cognitive biases,
including social desirability bias (e.g., people are less likely to share embarrassing information) and the
endowment effect. Most people care a great deal about privacy harms that result in material and financial
costs, such as identity theft, or the revelation of sensitive personal information to their close social circles.
They tend to care far less about data collected about their purchasing patterns and website browsing activity
by companies storing that information on distant, largely-inaccessible data server farms. This is especially
true when consumers receive what they judge to be considerable benefits at a functional cost to them of
zero dollars and zero cents." (Alec Stapp and Ryan Hagemann, Comments submitted to the National
Telecommunications and Information Administration in the Matter of: Request for Comments on Developing
the Administration’s Approach to Consumer Privacy, Docket Number 180821780-8780-01, submitted 8
Nov. 2018, p. 10, https://www.ntia.doc.gov/files/ntia/publications/niskanen_center.pdf.)

13 Lior Jacob Strahilevitz and Matthew B. Kugler. “Is Privacy Policy Language Irrelevant to Consumers?"
The Journal of Legal Studies 45, no. S2. Sept. 9, 2016.
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2838449.



Exhibit D: For example, although American consumers have increasingly expressed
concerns over data collection, a majority of financial customers would give up data for
more banking benefits, while only a tiny percentage would pay for services designed to
protect their privacy. Price consequences and other incentives outweigh privacy when
factored into consumer decision-making.

Source: American Banker

VII. The Internet has proven useful and valuable in ways that were difficult to imagine over a
decade and a half ago, and it has created privacy challenges that were equally difficult to
predict. Legislative initiatives in the mid-1990s to heavily regulate the Internet in the name
of privacy would have impeded its growth while also failing to address the complex privacy
issues that arose years later.14 As Congress and regulatory agencies continue to
consider the issue of consumer privacy in the digital age, they would do well to embrace a
policy of restraint and forbearance.

VIII. In short, many of the recent privacy proposals wouldn’t necessarily protect consumers,
but would make America more like Europe. The United States’ economic growth and
status as a global leader in innovation will depend on a thorough evaluation of risks when
crafting our nation’s approach to consumer privacy.15 As calls for data privacy in the United
States echo those heard in Europe, it is important to remember the fate of the European
Union’s digital economy at the hands of a strict regulatory regime.16 We should learn from
their mistakes.

14 In a 2014 Foreign Affairs essay, Craig Mundie considers what might have become of the digital
economy “if, in 1995, comprehensive legislation to protect Internet privacy had been enacted.” Such
policies, Mundie concludes, would have utterly failed to anticipate the complexities that arose after the turn
of the century with the growth of social networking and location-based wireless services. (Craig Mundie,
“Privacy Pragmatism,” Foreign Affairs, Vol. 93, No. 2 (March/April 2014), p. 517,
http://www.foreignaffairs.com/articles/140741/craig-mundie/privacy-pragmatism.)
15 Technological advancement and global leadership in innovation is largely driven by startups.

Restrictive regulations are the primary reason for the dearth of tech startups in Europe, as the inability to
generate online revenue and to develop new products forms a roadblock for venture capital investments.
(Mark Scott. "For Tech Start-Ups in Europe, an Oceanic Divide in Funding." The New York Times. 19
January 2018. https://www.nytimes.com/2015/02/14/technology/for-tech-start-ups-in-europe-an-oceanic-
divide-in-funding.html.)
16 For example, following the implementation of the opt-in model mandated in the EU’s Privacy and

Electronic Communications Directive (2002/58/EC), online ads became 65 percent less effective. (Alan



Question 26: To what extent would any given new trade regulation rule on data
security or commercial surveillance impede or enhance innovation? To what extent
would such rules enhance or impede the development of certain kinds of products,
services, and applications over others?

See response to Question 24.

Question 27: Would any given new trade regulation rule on data security or commercial
surveillance impede or enhance competition? Would any given rule entrench the
potential dominance of one company or set of companies in ways that impede
competition? If so, how and to what extent?

Strict data protection rules could diminish consumer welfare and serve to entrench large
technology companies. Writing at the American Enterprise Institute several months after GDPR’s
implementation, Daniel Lyons notes the effects such a rule is likely to have:

“The chilling effect on digital products available to European consumers could be
significant. Even if companies are not actively marketing to European residents, they may
have European visitors interacting with their webpage, taking advantage of marketing
offers, or subscribing to newsletters. If these interactions result in retention of personally
identifiable information, the company is subject to the GDPR. The ease with which a
company may find itself bound, coupled with the cost of compliance and potentially
draconian penalties for violation, creates strong incentives for companies to withdraw --
aggressively -- from European markets.”17

A 2013 report commissioned by the U.S. Chamber of Commerce notes similar impacts that would
be felt by consumers as a result of potential trade disruptions to cross-border data flows stemming
from the GDPR. The report argues that the negative impact on EU gross domestic product (GDP)
could reach -0.8% to -1.3%.18 It continues:

“EU services exports to the United States drop by -6.7% due to loss of competitiveness.
As goods exports are highly dependent on efficient provision of services (up to 30% of
manufacturing input values come from services), EU manufacturing exports to the United

McQuinn. "The Economics of 'Opt-Out' Versus 'Opt-In' Privacy Rules." Information Technology and
Innovation Foundation. 6 Oct. 2017, https://itif.org/publications/2017/10/06/economics-opt-out-versus-opt-
in-privacy-rules.)
17 Daniel Lyons, “GDPR: Privacy as Europe’s tariff by other means?,” American Enterprise

Institute, 3 July 2018, https://www.aei.org/publication/gdpr-privacy-as-europes-tariff-by-other-means/.


18 Matthias Bauer et al., “The Economic Importance of Getting Data Protection Right: Protecting

Privacy, Transmitting Data, Moving Commerce,” European Centre for International Political Economy,
report commissioned by the U.S. Chamber of Commerce, Mar. 2013, p. 3,
https://www.uschamber.com/sites/default/files/documents/files/020508_EconomicImportance_Final_Revis
ed_lr.pdf.



States could decrease by up to -11%, depending on the industry. In such case, the export
benefits produced by the EU-U.S. FTA are eradicated by a good margin.”19

The end result would be a direct negative welfare effect on four-person households of about
$1,353 per year.20

Large companies can typically survive these decreases in revenue and increased compliance
costs, while smaller companies may no longer be able to operate. Public debate is
disproportionately focused on large companies, but the vast majority of Internet companies fall in
the latter category and include the very companies that might otherwise grow to compete with
and even supplant the tech giants of today.

Accordingly, sweeping ex ante regulatory approaches like the GDPR and the California Consumer
Privacy Act are also likely to create an artificial imbalance in the competitive ecosystem in which
many firms operate. This imbalance is likely to result in anticompetitive lock-in effects for
incumbent firms. Unlike their resource-lean startup counterparts, large companies are far better
situated to devote labor costs and time to address the increased compliance costs necessitated
by broad data protection mandates like the GDPR.

In other words, as privacy-based compliance costs in a given digital market increase, the level of
new entrants to, and competition among firms within, that market is likely to decline:

“Smaller ad networks typically lack the direct consumer relationships needed to secure
consent from users on their own behalf, but may also find that media publishers and other
website hosts are reluctant to ask for user consent for the broad range and volume of data
that these advertisers can presently access without hindrance. Without access to the data
on which they currently rely, smaller advertising networks may be simply cut out of the
online market altogether unless they can find a way to gain some advantage over the
platforms in compliance, user-friendliness, or rates.”21

Arguably, data privacy regulations are more likely to create a duopoly than are anticompetitive
practices or the mere size of the companies we have today. The ad tech industry is still out of
compliance with the GDPR. Importing Europe’s data protection laws would pose a formidable threat
to the entire industry.

Question 29: What are the benefits or costs of refraining from promulgating new rules
on commercial surveillance or data security?

See response to Question 24.

19 Id.
20 Id.
21 Susan E. McGregor and Hugo Zylberberg, “Understanding the General Data Protection

Regulation: A Primer for Global Publishers,” Tow Center for Digital Journalism at Columbia University
(New York, NY: Mar. 2018), pp. 37-38, https://doi.org/10.7916/D8K08GVB.



Question 31: Should the Commission commence a Section 18 rulemaking on data
security? The Commission specifically seeks comment on how potential new trade
regulation rules could require or help incentivize reasonable data security.

First and foremost, any and all contemplated regulatory action by the Federal Trade Commission
(“FTC” or “Commission”) in the data privacy space should be put on hold until Congress has a
chance to fully consider the American Data Privacy and Protection Act.22 FTC regulations will not
have the same impact as a national privacy law, which all of the Commissioners recognize.23
Congressional action is the preferable way to handle the topic of data privacy because only a
federal statute can properly address preempting state laws and creating private rights of action.

This attempt to kick-start the process of addressing data privacy and surveillance concerns through the FTC’s
unfair or deceptive acts or practices (“UDAP”) authority could negatively impact the willingness
of members of Congress to pass legislation. With a bipartisan privacy bill currently under
consideration, the Commission would be well advised to discontinue any attempts at rulemaking
and stick to its primary role as an enforcer of laws already in place. There is always time after a
failed bill to begin a rulemaking process.

If the ADPPA does not pass, then the most appropriate way for the FTC to proceed in this field is
to commence a Section 18 rulemaking. However, to craft an effective rule, the Commission must
pare down the areas that it intends to address if it hopes to have any meaningful feedback from
the public. The first step in this rulemaking process is getting input from the public that will help
the Commission prove the three requirements for the statement of basis and purpose that it must
include in the NPRM–namely, whether the specific act or practice is prevalent, how that act or
practice is unfair or deceptive, and what the economic effect of the rule would be.24

Prevalence requires that the Commission find a particular act is widespread in the business
community. Once an act or practice is found to be prevalent, the Commission can then properly
target its questions to the public to create a meaningful rulemaking record to determine whether
the act is unfair or deceptive. That meaningful feedback will shape an eventual notice of proposed
rulemaking, where the Commission will need to show the economic effects of the proposed rules.

Process Concerns

22 H.R. 8152.
23 87 Fed. Reg. 51286 (August 11, 2022) (Statement of Chair Lina M. Khan) (noting that she
hopes “Congress passes strong federal privacy legislation”); id. at 51288 (Statement of Comm’r Rebecca
Kelly Slaughter) (“I prefer Congressional action to strengthen our authority.”); id. at 51292 (Statement of
Comm’r Alvaro M. Bedoya) (“[The ADPPA] is the strongest privacy bill that has ever been this close to
passing. I hope it does pass. I hope it passes soon.”); id. at 51293 (Dissenting Statement of Comm’r
Noah Joshua Phillips) (“Congress—not [the FTC]—is where national privacy law should be enacted.”); id.
at 51298 (Dissenting Statement of Comm’r Christine S. Wilson) (“Federal privacy legislation would
provide transparency to consumers . . . would give businesses much-needed clarity and certainty
regarding the rules of the road in this important area . . . [and] help curb violations of our civil liberties.”).
24 See, e.g., J. Howard Beales & Timothy J. Muris, Back to the Future: How Not to Write a Regulation,

AMERICAN ENTERPRISE INSTITUTE (May 26, 2022).



The three Democrat Commissioners state that an ANPRM is only the beginning of the process
and indicate that there will be “much more process” to keep the rulemaking in line.25 This,
however, is not true. The next publication by the Commission on this matter will be an NPRM,
which is supposed to consider these requested public comments as well as include the actual
proposed text of the rulemaking. Once the Commission publishes the NPRM, parties will have
some recourse to the recently streamlined26 informal hearings process27 to address concerns with
the FTC’s proposed rules.

So, if the next part of the process on the Commission’s side is to publish the proposed language
of a rule, where exactly is there room for more process? The FTC has inundated the public with
95 questions addressing a range of critical issues—none of which actually addresses data privacy,
despite the lack of federal privacy legislation being cited as a motivating factor for the
ANPRM—and this allows the FTC to propose any rules that are the “logical outgrowth” of those
questions.28

As Commissioner Phillips pointed out in his dissent, “the ANPR does not identify the full scope of
approaches it could undertake, does not delineate a boundary on issues on which the public can
comment, and in no way constrains the actions it might take in an NPRM or final rule.” With such
a large set of open-ended questions, anything the Commission publishes in its NPRM will meet
the weak requirements of being a “logical outgrowth” of the ANPRM. This is intentional and done
to cut off one method of challenging the lack of process involved in the rulemaking.

If the Commission wishes to pursue a rulemaking on commercial surveillance practices, it need
only look to its solid enforcement record29 under its consumer protection authority to craft an
effective rule. Most of the Commission’s successful cases involve the unauthorized use of
sensitive information by companies without the consumers’ knowledge or consent.30 As
evidenced by the recent suit brought against Kochava,31 the FTC argues that the ease of access

25 See supra n.23 at 51292 (Statement of Comm’r Alvaro Bedoya).


26 FTC Votes to Update Rulemaking Procedures, Sets Stage for Stronger Deterrence of
Corporate Misconduct, Fed. Trade Comm’n (July 1, 2021) https://www.ftc.gov/news-events/news/press-
releases/2021/07/ftc-votes-update-rulemaking-procedures-sets-stage-stronger-deterrence-corporate-
misconduct.
27 See, e.g., 16 C.F.R. § 0.8 (establishing that the Chair now designates the Chief Presiding

Officer); 16 C.F.R. § 1.11(d) (allowing the Commission to set the time for written comments and rebuttal
periods at its discretion rather than by a standard length of time).
28 See, e.g., Long Island Care at Home, Ltd. v. Coke, 551 U.S. 158, 174 (2007) (discussing the

logical outgrowth test for notice and comment rulemaking and noting that “[t]he object, in short, is one of
fair notice”).
29 See Privacy and Security Enforcement, FED. TRADE COMM’N https://www.ftc.gov/news-

events/topics/protecting-consumer-privacy-security/privacy-security-enforcement (compiling the FTC’s


enforcement record and advisory opinions).
30 See, e.g., FTC Report to Congress on Privacy and Security, FED. TRADE COMM’N (Sept. 13,

2021), https://www.ftc.gov/system/files/documents/reports/ftc-report-congress-privacy-
security/report_to_congress_on_privacy_and_data_security_2021.pdf (“In the last year, much of the
FTC’s privacy and data security work has dealt with themes that the pandemic has brought to the
forefront, such as increased use of health apps, accuracy of data used for housing, employment, and
credit, and videoconferencing and ed tech.”).
31 See FTC Sues Kochava for Selling Data that Tracks People at Reproductive Health Clinics,

Places of Worship, and Other Sensitive Locations, FED. TRADE COMM’N (Aug. 29, 2022)



to compiled geolocation data–even if not individually considered sensitive–can be considered a
consumer harm. This case-by-case approach to establishing proper data security practices
through enforcement actions is preferable to rules that may thwart innovation.

Question 35. Should the Commission take into account other laws at the state and
federal level (e.g., COPPA) that already include data security requirements? If so, how?
Should the Commission take into account other governments’ requirements as to data
security (e.g., GDPR)? If so, how?

The Commission should absolutely take into account state and federal laws when considering a
rule. In fact, the best place to start for this rulemaking is to rely on the federal statutes that the
Commission already administers, like the Children’s Online Privacy Protection Act (“COPPA”).32
A consistent theme in the recent public forum33 was the negative impact on students, particularly
minors, as a result of the required sharing of personal information to use educational technology
and the subsequent profiling of students that results from the use of school-mandated software.
Implementing data security protections in that area could be accomplished through relying on
COPPA and FERPA in addition to UDAP. And with a recent win for college student privacy in a
district court in the Sixth Circuit,34 such a rule would likely withstand judicial review.

On the other hand, the FTC should not give much weight to how other countries approach data
security requirements when crafting a proposed data security rule, and indeed should take care
not to follow a path similar to Europe’s.35 The rules in places like the European
Union are predicated on an entirely different model of regulation, which treats dominant
companies as public utilities and actively engages in picking and choosing winners in markets.
The United States’ system does not envision the government making such decisions about who
wins in competition and thus should not implement regulations that, through high regulatory
compliance costs, effectively penalize parties for operating in the United States. Considering the negative

https://www.ftc.gov/news-events/news/press-releases/2022/08/ftc-sues-kochava-selling-data-tracks-
people-reproductive-health-clinics-places-worship-other.
32 15 U.S.C. §§ 6501–6505 (establishing the authority of the Commission to set regulations for safe
use of the internet for children under age 13).


33 See Commercial Surveillance and Data Security Public Forum (Sept. 8, 2022)

https://kvgo.com/ftc/commercial-surveillance-sep-8.
34 See Emma Bowman, Scanning students' rooms during remote tests is unconstitutional, judge
rules, NPR (Aug. 26, 2022, 3:11 PM) https://www.npr.org/2022/08/25/1119337956/test-proctoring-room-
scans-unconstitutional-cleveland-state-university (discussing Ogletree v. Cleveland State Univ., 2022
U.S. Dist. LEXIS 150513; 2022 WL 3581569 (Aug. 22, 2022)).
35 See, e.g., Ashley Baker, CFJ Letter to Senate Judiciary: When regulating data privacy, don’t

make America more like Europe, COMMITTEE FOR JUSTICE (Mar. 18, 2019),
https://www.committeeforjustice.org/single-post/letter-to-senate-judiciary-on-gdpr-ccpa-opt-ins-consumer-
control-and-the-impact-on-competition-and-i (noting the negative impact of the GDPR on European
business and pointing out that “following the implementation of the opt-in model mandated in the EU’s
Privacy and Electronic Communications Directive (2002/58/EC), online ads became 65 percent less
effective. This is also one of the reasons for the absence of tech startups in Europe. The inability to
generate online revenue and to develop new products forms a roadblock for venture capital
investments”).



effects on the European economy from the GDPR,36 the Commission should avoid repeating such
mistakes.

Additionally, the Commission must consider how any rules about data security can be exploited
by other countries, particularly those with state-backed companies, to gain an advantage over
American companies.37

Question 76: To what extent should new trade regulation rules prohibit certain specific
commercial surveillance practices, irrespective of whether consumers consent to
them?

An attempt to ban certain commercial surveillance practices regardless of consent comes awfully
close to embodying a wayfaring fool38 standard that got the FTC labeled a “national nanny” in the
past. A better approach for the Commission is to consider the effect of requiring companies to
implement clear requests for affirmative consent for certain tracking activities and disclosure of
how the collected data will be used. It would be a needless burden on companies to ban practices
regardless of consent simply because some companies employ a practice in inappropriate ways,
especially because the methods of tracking identified as allegedly unfair or deceptive may
quickly become obsolete.

Question 79: Should the Commission require different consent standards for different
consumer groups . . . ?

If the Commission intends to create different consent standards for different consumer groups, it
will need to rely on other statutory authority for making such distinctions. For example, the
Commission could rely on COPPA39 to set different consent standards for children under age 13,
or it could rely on the Gramm-Leach-Bliley Act40 to treat consumers’ financial information
differently.

36 See, e.g., Ashley Baker, Privacy Policy and the Economics of Data Collection Rules,
COMMITTEE FOR JUSTICE (Jan. 7, 2019) (“An earlier report commissioned by the U.S. Chamber of
Commerce argues that the negative impact on the EU GDP could reach -0.8% to -1.3%. The end result
would be a direct negative welfare effect on four-person households of about $1,353 per year.”).
37 See, e.g., Robert C. O’Brien, Breaking Up Big Tech is a Gift to China, WALL STREET JOURNAL

(Dec. 26, 2021) https://www.wsj.com/articles/congress-breaking-up-silicon-valley-tech-is-a-gift-to-china-


tencent-baidu-bytedance-quantum-11640525284 (“President Xi Jinping has stated his intention to spend
$1.4 trillion by 2025 to surpass the U.S. in key technology areas, and the Chinese government
aggressively subsidizes national champion firms. . . . Beijing has made clear that it won’t stop until it
dominates technologies such as quantum computing, artificial intelligence, autonomous systems and
more. Last month the National Counterintelligence and Security Center warned that these are
technologies ‘where the stakes are potentially greatest for U.S. economic and national security.’”).
38 See, e.g., J. Howard Beales III, What State Regulators Should Learn from FTC Experience in

Regulating Advertising, 10 J. PUB. POL’Y & MARKETING 101, 103 (1991) (describing that consumer welfare
is negatively impacted when “[a]dvertising that conveys truthful and useful information to the
overwhelming majority of consumers…is prohibited because ‘wayfaring men, though fools’ might
misinterpret it in a way that is misleading[.]”).
39 15 U.S.C. §§ 6501–6505.
40 Pub. L. 106–102 (Nov. 12, 1999).



Beyond relying on specific statutes in which Congress authorized the FTC to treat certain groups
differently, creating different regulatory burdens for businesses depending upon which consumer
groups they cater to would be inadvisable. Such a course of action would disproportionately
burden companies based on their audience and could lead to lower-quality content for consumers
in areas where stricter regulations are in place.41 Additionally, online companies’ audiences can
change over time, which would subject companies to confusing regulatory burdens that force
them to adopt the most restrictive regulatory standards to avoid liability.

Question 94: How should the FTC’s authority to implement remedies under the Act
determine the form or substance of any potential new trade regulation rules on
commercial surveillance? Should new rules enumerate specific forms of relief or
damages that are not explicit in the FTC Act but that are within the Commission’s
authority? . . . Is there a limit to the Commission’s authority to implement remedies by
regulation?

The Commission is certainly limited in its authority to implement remedies by regulation. Agencies
only have as much authority as is conveyed by their organic statutes and other statutes that
Congress tasks them with administering.42 All authority must be positively identified–there is
nothing implied or residual for such unaccountable bodies in our constitutional system.

When operating solely under its UDAP authority (i.e., in the absence of additional statutory
authority), the Commission has very limited remedial authority. In fact, the Commission’s authority
is solely to issue cease-and-desist orders and to punish parties for subsequently violating those
orders. This limited remedial authority compensates for the broad discretionary authority to make
an initial determination of what acts fall under UDAP.

Our current system allows the FTC to determine that an act or practice is unfair or deceptive, then
bring an administrative proceeding against a party that performed that act, which is reviewable by
the same individuals who initiated the proceeding. This is already broad authority. Imagine a
scenario where that same agency also had the authority to structure the types of remedies (e.g.,
disgorgement, restitution, or even punitive damages) that should be applied to address the
violation. There is an excellent separation-of-powers reason why the Commission cannot, and
should not, structure remedies and why Congress must set these standards.

41 See, e.g., Ashley Baker, Comments Regarding the FTC’s Implementation of COPPA,

COMMITTEE FOR JUSTICE (Dec. 31, 2019) (“[B]ecause sites dedicated to children’s programming are
subject to a higher regulatory burden than sites developed for the general public,” “COPPA tends to
restrict the variety and availability of child-oriented content.”).
42 See, e.g., La. Pub. Svc. Comm’n v. FCC, 476 U.S. 355, 374 (1986) (“[A]n agency literally has

no power to act, let alone preempt the validly enacted legislation of a sovereign State, unless and until
Congress confers power upon it.”).



APPENDIX A: RELEVANT COMMENTS FILED WITH THE FEDERAL
TRADE COMMISSION IN 2018

RE: Hearings on Competition and Consumer Protection in the 21st Century:
Consumer Privacy Pre-Hearing Comments
DOCKET ID: FTC-2018-0098
SUBMITTED: DECEMBER 21, 2018

ASHLEY BAKER
DIRECTOR OF PUBLIC POLICY
THE COMMITTEE FOR JUSTICE

INTRODUCTION
Founded in 2002, the Committee for Justice (CFJ) is a nonprofit legal and policy organization that
promotes and educates the public and policymakers about the rule of law and the benefits of
constitutionally limited government. Consistent with this mission, CFJ advocates in Congress, the courts,
and the news media about a variety of law and technology issues, encompassing administrative law and
regulatory reform, online free speech, antitrust law, and data privacy.

Additionally, CFJ has a long history of leadership on the issue of federal judicial nominations and the
confirmation process in the Senate. We have focused our attention on issues at the intersection of law
and technology by highlighting how those issues are affected by judicial nominations. For example, CFJ
submitted a letter to the Senate Judiciary Committee explaining why the confirmation of Supreme Court
Justice Brett Kavanaugh would be good for technological innovation and the economic growth it spurs.43

In recent years, CFJ has actively advocated for digital privacy protections in Congress, the federal courts,
and the Supreme Court.44 Today, our focus is on innovation, free speech, and economic growth. We
believe that restrictive new requirements or penalties for data collection and use are not only unwarranted
but would also threaten the online ecosystem that has transformed our daily lives in the last few decades.

Last month, CFJ responded to the National Telecommunications and Information Administration’s (NTIA)
Request for Comments on Developing the Administration’s Approach to Consumer Privacy (available as

43 Curt Levey and Ashley Baker, Letter to the Senate Judiciary Committee on the Nomination of Brett Kavanaugh to
the Supreme Court. 4 Sept. 2018, https://www.committeeforjustice.org/single-post/Letter-for-the-Record-on-the-
Nomination-of-Brett-Kavanaugh-to-the-Supreme-Court.
44 See, e.g., amicus briefs filed in Carpenter v. United States. 11 Aug. 2017,
https://www.scribd.com/document/356288790/Amicus-Brief-Filed-in-Carpenter-v-United-States and United States v.
Kolsuz. 20 March 2017, https://www.scribd.com/document/355249553/United-States-v-Kolsuz-Amucis-Brief; The
Committee for Justice, Letter to Congress in Support of the Clarifying Lawful Use of Overseas Data (CLOUD) Act. 13
Feb. 2018, https://www.committeeforjustice.org/single-post/support-clarifying-lawful-use-data.



an appendix).45 Similarly, these comments emphasize the need to prioritize economic prosperity and
preserve the United States' role as leader in technological innovation by learning from the disastrous
results of privacy regulations abroad.46

RESPONSES TO QUESTIONS
What are the actual and potential benefits for consumers and to competition of information
collection, sharing, aggregation, and use? To what extent do consumers today, or are consumers
likely to, realize these benefits?

A world with less information collection, sharing, aggregation, and use is not only a world of greater
information scarcity, but of less consumer welfare. These impacts are already
being felt in Europe as a result of the European Union’s (EU) implementation of the General Data
Protection Regulation (GDPR) this past May. Writing at the American Enterprise Institute some months
after GDPR’s implementation, Daniel Lyons notes the effects such a rule would be likely to have on
diminished consumer welfare:

The chilling effect on digital products available to European consumers could be significant. Even
if companies are not actively marketing to European residents, they may have European visitors
interacting with their webpage, taking advantage of marketing offers, or subscribing to
newsletters. If these interactions result in retention of personally identifiable information, the
company is subject to the GDPR. The ease with which a company may find itself bound, coupled
with the cost of compliance and potentially draconian penalties for violation, creates strong
incentives for companies to withdraw -- aggressively -- from European markets.47

A 2013 report commissioned by the U.S. Chamber of Commerce notes similar impacts that would be felt
by consumers as a result of potential trade disruptions to cross-border data flows stemming from the
GDPR. The report argues that the negative impact on EU gross domestic product (GDP) could reach
-0.8% to -1.3%.48 It continues:

EU services exports to the United States drop by -6.7% due to loss of competitiveness. As goods
exports are highly dependent on efficient provision of services (up to 30% of manufacturing input

45Ashley Baker, Comments submitted to the National Telecommunications and Information Administration in the
Matter of: Request for Comments on Developing the Administration’s Approach to Consumer Privacy, Docket
Number 180821780-8780-01. 9 Nov. 2018. https://www.scribd.com/document/393092584/Committee-for-Justice-
Comments-to-the-NTIA-on-Developing-the-Administration-s-Approach-to-Consumer-Privacy#from_embed. (“Many of
the recent privacy proposals wouldn’t protect consumers, but would make America more like Europe. The United
States’ economic growth and status as a global leader in innovation will depend on a thorough evaluation of risks
when crafting our nation’s approach to consumer privacy. As calls for data privacy in the United States echo those
heard in Europe, it is important to remember the fate of the European Union’s digital economy at the hands of a strict
regulatory regime. We should learn from their mistakes.”)
46Ashley Baker, “CFJ Files Comments with NTIA on Developing the Administration’s Approach to Consumer
Privacy,” 13 Nov. 2018. https://www.committeeforjustice.org/single-post/Developing-the-
Administration%E2%80%99s-Approach-to-Consumer-Privacy.
47 Daniel Lyons, “GDPR: Privacy as Europe’s tariff by other means?,” American Enterprise Institute, 3 July 2018,
https://www.aei.org/publication/gdpr-privacy-as-europes-tariff-by-other-means/.
48 Matthias Bauer et al., “The Economic Importance of Getting Data Protection Right: Protecting Privacy, Transmitting
Data, Moving Commerce,” European Centre for International Political Economy, report commissioned by the U.S.
Chamber of Commerce, Mar. 2013, p. 3,
https://www.uschamber.com/sites/default/files/documents/files/020508_EconomicImportance_Final_Revised_lr.pdf.



values come from services), EU manufacturing exports to the United States could decrease by up
to -11%, depending on the industry. In such case, the export benefits produced by the EU-U.S.
FTA are eradicated by a good margin.49

The end result would be a direct negative welfare effect on four-person households of about $1,353 per
year.50 As these examples show, government-imposed restrictions on data collection would undercut
economic growth, the vibrancy of the online ecosystem, and consumer satisfaction.51

In recent decades, consumers’ personal and professional lives have been transformed for the better by a
vast collection of data-driven online resources that are subsidized by advertising and made available at
no cost. These resources are an engine of economic growth, even when other sectors experience difficult
economic times. Data-driven marketing is estimated to have added more than $200 billion to the U.S.
economy in 2014, a 35% increase over just two years earlier.52

Restrictions on such marketing would slow or reverse this economic growth, while hurting consumers by
causing the demise of many of the data-driven online resources they rely on. Policies must strike a
balance between realistic consumer privacy preferences and access to information.

Should privacy protection depend on, or allow for, consumer variation in privacy preferences?
Why or why not? What are the appropriate tradeoffs to consider? If desired, how should this
flexibility be implemented?

Consumer variation in privacy preferences is no different from the variance in individual preferences for
any other type of good or service. The strength and vibrancy of the American economy is predicated on a
choice-based market architecture that optimizes the distribution of scarce resources to their highest
leveraged uses.

What are the effects, if any, on competition and innovation from privacy interventions, including
from policies such as data minimization, privacy by design, and other principles that the
Commission has recommended?

Anticipatory regulatory frameworks that attempt to address privacy concerns by relying on broad one-
size-fits-all rules will inevitably come at the expense of both innovators and consumers.

Data minimization and purpose-limitation mandates make it far more difficult to transmit information
between firms, industries, and national borders.53 Such rules would have negative

49 Id.
50 Id.
51 See, Curt Levey and Ashley Baker, Letter for the Record to Members of the House Committee on Energy and
Commerce on Facebook, Transparency, and Use of Consumer Data. 10 April 2018,
https://www.committeeforjustice.org/single-post/Letter-for-the-Record-to-Members-of-the-House-Committee-on-
Energy-and-Commerce-on-Facebook-Transparency-and-Use-of-Consumer-Data.
52 John Deighton and Peter Johnson, “The Value of Data 2015: Consequences for Insight, Innovation and Efficiency

in the U.S. Economy.” Data & Marketing Association. Dec. 2015, http://thedma.org/advocacy/data-driven-marketing-
institute/value-of-data/.
53 See, Sarah Wheaton, "5 BIG Reasons Europe Sucks at Curing Cancer," Politico, 12 Oct. 2018,

https://www.politico.eu/article/cancer-5-big-reasons-europe-sucks-at-curing/.



consequences for every sector of the economy that makes use of data and the ripple effect would be felt
across the entire economy.54

Similarly, mandating opt-in default architectures imposes very real economic costs on firms and
researchers. When platforms must obtain affirmative consent, advertising becomes less effective and
revenue falls, leaving companies with less money to invest in research and development.

Furthermore, such requirements favor incumbent firms already known to users, regardless of the data
protection frameworks actually implemented in practice.55

Opt-in can lead to less information sharing not because people who genuinely value privacy are
no longer allowing their personal data to be traded, but rather because companies may find it too
expensive to administer an opt-in program and because, due to inertia, people simply accept the
opt-in no-sharing default regardless of their privacy preferences. An opt-in rule would therefore be
inefficient because it could discourage too many individuals from participating. 56

Large companies can typically survive these decreases in revenue and increased compliance costs, while
smaller companies may no longer be able to operate. Public debate is disproportionately focused on large
companies, but the vast majority of Internet companies fall in the latter category and include the very
companies that might otherwise grow to compete with and even supplant the tech giants of today.

These effects are not merely hypothetical. Indeed, one need only look at the devastating impact of
restrictive regulations in the EU. For example, following the implementation of the opt-in model mandated
in the EU’s Privacy and Electronic Communications Directive (2002/58/EC), online ads became 65
percent less effective.57 This is also one of the reasons for the absence of tech startups in Europe.58 The
inability to generate online revenue and to develop new products forms a roadblock for venture capital
investments.

Accordingly, sweeping ex ante regulatory approaches like the GDPR and the recently passed California
Consumer Privacy Act are also likely to create an artificial imbalance in the competitive ecosystem in
which many firms operate. This imbalance is likely to result in anticompetitive lock-in effects for incumbent
firms. Unlike their resource-lean startup counterparts, large companies are far better situated to devote
labor costs and time to addressing the increased compliance costs necessitated by broad data protection
mandates like the GDPR. In other words, as privacy-based compliance costs in a given digital market

54 The GDPR, for example, would have made it impossible for the Danish Cancer Society to conduct the study that
helped dispel the myth of a correlation between mobile cellular phone use and cancer. See Patrizia Frei et al., "Use of
Mobile Phones and Risk of Brain Tumours: Update of Danish Cohort Study,” BMJ, 20 Oct. 2011,
https://www.cancer.dk/dyn/resources/File/file/9/1859/1385432841/1_bmj_2011_pdf.pdf.
55 Alec Stapp and Ryan Hagemann, Comments submitted to the National Telecommunications and Information
Administration in the Matter of: Request for Comments on Developing the Administration’s Approach to Consumer
Privacy, Docket Number 180821780-8780-01, submitted 8 Nov. 2018, p. 4,
https://www.ntia.doc.gov/files/ntia/publications/niskanen_center.pdf. (“Opt-in choice architecture is inferior to opt-out
because it is biased toward incumbents. Users might be more willing to affirmatively give consent to businesses they
already know, even if a newer company with less brand recognition has the same or better data security practices.
Any data accountability frameworks, regulations, or principles should expressly disavow a mandatory default opt-in
regime for data collection.”)
56 Robert W. Hahn and Anne Layne-Farrar, “The Benefits and Costs of Online Privacy Legislation,” AEI-Brookings
Joint Center for Regulatory Studies, Working Paper 01-14. Oct. 2001, p. 58,
http://papers.ssrn.com/abstract_id=292649.
57 Alan McQuinn. "The Economics of 'Opt-Out' Versus 'Opt-In' Privacy Rules." Information Technology and Innovation
Foundation. 6 Oct. 2017, https://itif.org/publications/2017/10/06/economics-opt-out-versus-opt-in-privacy-rules.
58 Mark Scott. "For Tech Start-Ups in Europe, an Oceanic Divide in Funding." The New York Times. 19 January 2018.
https://www.nytimes.com/2015/02/14/technology/for-tech-start-ups-in-europe-an-oceanic-divide-in-funding.html.



increase, the number of new entrants to, and the level of competition within, that market are likely to
decline.59

Prioritizing opt-out regimes could be a more efficacious means of balancing privacy protections for
consumers against the need for flexibility in new business models whose data collection practices may
prove far more appealing to the broader public.

Some academic studies have highlighted differences between consumers’ stated preferences on
privacy and their “revealed” preferences, as demonstrated by specific behaviors. What are the
explanations for the differences?

As the FTC accurately noted in its own background information for this hearing, “consumers have
expressed concern about the growing collection and use of their data, and businesses have enhanced
their ability to link consumers’ behavior across devices and platforms.” And yet an "expression of
concern" does not necessarily translate into subsequent action by consumers to address those
concerns.

Indeed, the wealth of academic literature and behavioral experiments examining the gap between stated
and revealed consumer preferences vis-a-vis privacy clearly shows that consumers, on the whole, do not
value their online privacy more than access to zero-cost online services. Why the disconnect?

These differences could be the result of privacy being just one component of a more complex bundle of
values, the contextual nature of consumer valuations that occur in the moment and are subject to change
on a whim, and a general lack of consensus regarding what constitutes the “boundaries” of ownership
over individual data.60 In comments to the NTIA, Alec Stapp and Ryan Hagemann summarized these
effects:

Most research and behavioral studies conclude that privacy is highly context-dependent. Privacy
valuations are subject to cognitive biases, including social desirability bias (e.g., people are less
likely to share embarrassing information) and the endowment effect. Most people care a great
deal about privacy harms that result in material and financial costs, such as identity theft, or the
revelation of sensitive personal information to their close social circles. They tend to care far less
about data collected about their purchasing patterns and website browsing activity by companies

59 Susan E. McGregor and Hugo Zylberberg, “Understanding the General Data Protection Regulation: A Primer for
Global Publishers,” Tow Center for Digital Journalism at Columbia University (New York, NY: Mar. 2018), pp. 37-38,
https://doi.org/10.7916/D8K08GVB. (“As of 2017, Google and Facebook claim seventy-seven cents of every dollar
spent on digital advertising in the United States, with no other single company claiming even as much as three
percent of the total market share. While the GDPR may hinder some of these companies’ data collection and/or
sharing activities, the regulation may well squeeze smaller advertising networks even more, potentially magnifying the
dominance of this duopoly in online advertising. These smaller ad networks, for example, typically lack the direct
consumer relationships needed to secure consent from users on their own behalf, but may also find that media
publishers and other website hosts are reluctant to ask for user consent for the broad range and volume of data that
these advertisers can presently access without hindrance. Without access to the data on which they currently rely,
smaller advertising networks may be simply cut out of the online market altogether unless they can find a way to gain
some advantage over the platforms in compliance, user-friendliness, or rates. In this environment, platform
companies and website hosts—such as media companies—that have a brand-name relationship to their users are
likely to have more success in persuading individuals to give up their information, and therefore may have increased
power in the advertising market under the GDPR.”)
60 Alec Stapp and Ryan Hagemann, Comments submitted to the Federal Trade Commission in the Matter of: Hearing
on Competition and Consumer Protection in the 21st Century: The Intersection Between Privacy, Big Data, and
Competition, Docket Number FTC-2018-0051, Project Number P181201, submitted 20 Aug. 2018,
https://niskanencenter.org/wp-content/uploads/2018/08/Comments-Privacy-Big-Data-and-CompetitionFTC.pdf.



storing that information on distant, largely-inaccessible data server farms. This is especially true
when consumers receive what they judge to be considerable benefits at a functional cost to them
of zero dollars and zero cents.61

Public opinion polls showing support for stronger data protections are misleading because they rarely
confront consumers with the monetary and other costs of their choices.62 A 2016 study found that,
despite most participants’ unease with an email provider using automated content analysis to provide
more targeted advertisements, 65 percent of them were unwilling to pay providers any amount for a
privacy-protecting alternative.63

However, in the real world, consumers will lose free email and social media if government-imposed
privacy regulations cut into providers' advertising revenue. Moreover, such studies remind us that most
consumers do not value data privacy enough to pay anything for it.

That should not be too surprising considering that today's thriving but largely unregulated social media
ecosystem is not something that was thrust upon consumers or arose from factors beyond their control.
Instead, it arose through the collective choices and values tradeoffs of billions of consumers.

CONCLUSION

In a 2014 Foreign Affairs essay, Craig Mundie considers what might have become of the digital economy
“if, in 1995, comprehensive legislation to protect Internet privacy had been enacted.”

Such policies, Mundie concludes, would have utterly failed to anticipate the complexities that arose after
the turn of the century with the growth of social networking and location-based wireless services. The
Internet has proven useful and valuable in ways that were difficult to imagine over a decade and a half
ago, and it has created privacy challenges that were equally difficult to imagine. Legislative initiatives in
the mid-1990s to heavily regulate the Internet in the name of privacy would likely have impeded its growth
while also failing to address the more complex privacy issues that arose years later.64

As the FTC continues to consider the issue of consumer privacy in the digital age, it would do well to
embrace a policy of restraint and forbearance.

61 Alec Stapp and Ryan Hagemann, Comments submitted to the National Telecommunications and Information
Administration in the Matter of: Request for Comments on Developing the Administration’s Approach to Consumer
Privacy, Docket Number 180821780-8780-01, submitted 8 Nov. 2018, p. 10,
https://www.ntia.doc.gov/files/ntia/publications/niskanen_center.pdf.
62 Alan McQuinn, "The Economics of 'Opt-Out' Versus 'Opt-In' Privacy Rules." Information Technology and Innovation
Foundation. 6 Oct. 2017, https://itif.org/publications/2017/10/06/economics-opt-out-versus-opt-in-privacy-rules.
63 Lior Jacob Strahilevitz and Matthew B. Kugler. “Is Privacy Policy Language Irrelevant to Consumers?" The Journal
of Legal Studies 45, no. S2. Sept. 9, 2016. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2838449.
64 Craig Mundie, “Privacy Pragmatism,” Foreign Affairs, Vol. 93, No. 2 (March/April 2014), p. 517,
http://www.foreignaffairs.com/articles/140741/craig-mundie/privacy-pragmatism.



APPENDIX B: SUMMARY OF COMMENTS FILED WITH THE NATIONAL
TELECOMMUNICATIONS AND INFORMATION ADMINISTRATION

CFJ Files Comments with NTIA on Developing the Administration’s Approach to Consumer Privacy

On Friday, November 9, the Committee for Justice (CFJ) responded to the National Telecommunications
and Information Administration’s (NTIA) Request for Comments on Developing the Administration’s
Approach to Consumer Privacy. CFJ's comments emphasize the need to prioritize economic prosperity
and preserve the United States' role as leader in technological innovation by learning from the disastrous
results of privacy regulations abroad.

Key recommendations include:

• Many of the recent privacy proposals wouldn’t protect consumers, but would make America
more like Europe. The United States’ economic growth and status as a global leader in innovation will
depend on a thorough evaluation of risks when crafting our nation’s approach to consumer privacy. As
calls for data privacy in the United States echo those heard in Europe, it is important to remember the
fate of the European Union’s digital economy at the hands of a strict regulatory regime. We should learn
from their mistakes.

• Data protection policies could deter venture capital investments and strangle U.S.-based tech
start-ups. The EU’s Directive 2002/58/EC, which mandated an opt-in policy to obtain affirmative
consent, is an unfortunate example of this. Additionally, as a result of these measures, small companies
have less money to invest in research and development for new products and services and may even
shut down. Opt-in policies are also illogical since the knowledge that privacy settings can be changed
acts as a form of affirmative consent.

• Data privacy concerns should not be confused with the constitutional right to privacy found in
the Third, Fourth, and Fifth Amendments—which protect us from government intrusions—or
even the common law and statutory protections available when a private actor coercively
violates our privacy. The public debate often conflates the true privacy rights that protect us from
involuntary intrusions by the government and private actors with proposed privacy policies affecting the
data we voluntarily convey to tech platforms. This conflation has been made worse by the European
Union, which has labeled its package of privacy policies as a fundamental right, even though many of
those policies are at odds with the free speech and economic rights prized by Americans (for example,
see the EU’s “Right to Be Forgotten”). This is a very important distinction to maintain.

• The Federal Trade Commission (FTC) already has the appropriate statutory authority to protect
consumer privacy. The FTC should continue its role as the safeguard against unscrupulous data
practices. Rushed attempts to implement a federal privacy policy are unnecessary since the FTC has
proven to be an effective policeman. As for changes with regard to process, it could be helpful for the
FTC to develop guidelines to determine the need to bring an enforcement action, especially as the data
ecosystem expands with the Internet of Things (IoT). However, this should only be done after the careful
evaluation of public input.

• The Administration should pay particular attention to proposed state regulations that threaten to
create a patchwork of regulations that could strangle new businesses and technologies with
contradictory laws and enforcement. When faced with compliance and financial burdens, new



technology companies—and the tax revenue and job creation they produce—tend to move to favorable
regulatory environments. Since technology, by nature, cannot be confined within state borders, these
companies are more likely to choose to operate outside of the United States.

• When crafting a data protection framework, it is especially important that our government has an
understanding of the unique features of emerging technologies in order to avoid ill-suited or
unnecessary regulations that would impede their adoption. For instance, the protection of privacy in
AI systems can be facilitated by the “black box” nature of machine learning combined with careful
handling of the training data sets used. If those data sets are properly disposed of once the learning
phase is complete, the neural network captures the knowledge it needs to perform without preserving
any of the individual data that could compromise privacy.
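
As a purely illustrative sketch of the “train, then dispose of the training data” pattern described in the last bullet above, the short Python example below trains a simple model, persists only its fitted parameters, and then deletes the individual-level training records. It is not part of the comments CFJ filed; the use of scikit-learn, the synthetic data, and the file names are assumptions made here solely for illustration, and disposing of the raw records addresses retention of the data itself rather than every conceivable inference from the trained model.

import os
import numpy as np
from sklearn.linear_model import LogisticRegression
import joblib

# Hypothetical individual-level records; each row stands in for one person's data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # sensitive, individual-level features
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # outcome of interest

# Learning phase: the model distills aggregate patterns from the records.
model = LogisticRegression().fit(X, y)

# Persist only the fitted parameters (a handful of coefficients),
# not the underlying individual records.
joblib.dump(model, "trained_model.joblib")

# Disposal phase: once training is complete, drop the in-memory data set and
# remove any on-disk copy, so the deployed system retains no individual records.
del X, y
if os.path.exists("training_data.csv"):   # hypothetical on-disk copy, if one existed
    os.remove("training_data.csv")

# Later inference uses only the retained coefficients.
reloaded = joblib.load("trained_model.joblib")
print(reloaded.predict(np.zeros((1, 5))))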



APPENDIX C: TEXT OF COMMENTS FILED WITH THE NTIA
RE: Developing the Administration’s Approach to Consumer Privacy
DOCKET ID: 180821780-8780-01
SUBMITTED: NOVEMBER 9, 2018

ASHLEY BAKER
DIRECTOR OF PUBLIC POLICY
THE COMMITTEE FOR JUSTICE

INTRODUCTION
Founded in 2002, the Committee for Justice (CFJ) is a nonprofit legal and policy organization that
promotes and educates the public and policymakers about the rule of law and the benefits of
constitutionally limited government. Consistent with this mission, CFJ advocates in Congress, the courts,
and the news media about a variety of law and technology issues, encompassing administrative law and
regulatory reform, free speech, data privacy, and antitrust law.

CFJ has a long history of leadership on the issue of federal judicial nominations and the confirmation
process in the Senate. Our voice and influence are amplified during confirmation battles for judicial
nominees and the period of close analysis of their rulings that inevitably follows, giving us a unique and
high-profile platform to focus attention on issues at the intersection of law and technology by highlighting
how those issues will be impacted. For example, CFJ recently submitted a letter to the Senate Judiciary
Committee explaining why the confirmation of Supreme Court Justice Brett Kavanaugh would be good for
technological innovation and the economic growth it spurs.65

In the past year, CFJ has actively advocated for digital privacy protections in Congress, the federal
courts, and the Supreme Court.66 Today, our focus is on innovation, free speech, and economic growth.
We believe that restrictive new requirements for data collection and use are not only unwarranted but
would also threaten the online ecosystem that has transformed our daily lives in recent decades.

RECOMMENDATIONS

Are there other outcomes that should be included, or outcomes that should be expanded upon as
separate items? Are the descriptions clear? Beyond clarity, are there any issues raised by how
any of the outcomes are described? Are there any risks that accompany the list of outcomes, or
the general approach taken in the list of outcomes?

65 The Committee for Justice, Letter to the Senate Judiciary Committee in Support of Brett Kavanaugh. 4 Sept. 2018,
https://docs.wixstatic.com/ugd/3bb067_f0fe37f564ac4afb8ff8c688a84faa21.pdf.
66 See, e.g., amicus briefs filed in Carpenter v. United States (August 2017),
https://www.scribd.com/document/356288790/Amicus-Brief-Filed-in-Carpenter-v-United-States and United States v.
Kolsuz (March 2017), https://www.scribd.com/document/355249553/United-States-v-Kolsuz-Amucis-Brief; letter to
Congress in support of the CLOUD Act (March 2018), https://www.committeeforjustice.org/single-post/support-
clarifying-lawful-use-data.



The United States’ economic growth and status as a global leader in innovation will depend on a thorough
evaluation of risks when crafting our nation’s approach to consumer privacy. As calls for data privacy in
the United States echo those heard in Europe, it is important to remember the fate of the European
Union’s digital economy at the hands of a strict regulatory regime.

The European Union's Directive 2002/58/EC 67 is an unfortunate example of this. The rule mandated an
opt-in policy requiring businesses to obtain affirmative consent from consumers before collecting and
processing data about them, on the theory that such a requirement is necessary to ensure that people
have full control of their personal information.

In the recent debate over data privacy in the United States, many proposals have included an opt-in
policy. The decision to include a similar measure would have huge implications for the availability and use
of data in the ad-based revenue model that is the lifeblood of the online ecosystem. When platforms have
to obtain affirmative consent, companies have less money to invest in research and development for new
products and services and may even shut down.

Although a reduction in advertisements and data use may initially sound appealing to the Administration,
the prospect of becoming more like Europe undoubtedly does not. After Europe implemented this opt-in
model, online ads became 65% less effective.68 It is also one of the reasons for the dearth of tech
startups in Europe.69 The inability to generate online revenue and to develop new products forms a
roadblock for venture capital investments.

Although privacy fundamentalists stress the necessity of opt-in notifications, a recent poll indicates that 74
percent of Facebook users are aware of their current privacy settings, and 78 percent said they knew how
to change them.70 Therefore, opt-in policies would not only harm small businesses; they are also based
on the false premise that most American consumers are unwittingly opting for lesser privacy protections.

This decision would have huge implications for the availability and use of data in an online ecosystem
built on the financial model of ads that run on this information. When platforms have to obtain
affirmative consent, companies have less money to invest in new products and services and can even be
forced to shut down. Opt-in policies are also less user-friendly and are designed to meet the demands of
a small group of privacy advocates; in practice, the main difference between opt-in and opt-out regimes is
their economic impact.

Should the Department convene people and organizations to further explore additional
commercial data privacy-related issues? If so, what is the recommended focus and desired
outcomes?

It is especially important that our government has an understanding of the unique features of emerging
technologies in order to avoid ill-suited or unnecessary regulations that would impede their adoption. For
instance, the protection of privacy in AI systems can be facilitated by the “black box” nature of machine
learning combined with careful handling of the training data sets used. If those data sets are properly
disposed of once the learning phase is complete, the neural network captures the knowledge it needs to
perform without preserving any of the individual data that could compromise privacy.

67 OJ L 201, 31.7.2002, p. 37–47, ELI: http://data.europa.eu/eli/dir/2002/58/oj.


68 Alan McQuinn, "The Economics of 'Opt-Out' Versus 'Opt-In' Privacy Rules." Information Technology and Innovation
Foundation. 6 Oct. 2017, https://itif.org/publications/2017/10/06/economics-opt-out-versus-opt-in-privacy-rules.
69 Mark Scott, "For Tech Start-Ups in Europe, an Oceanic Divide in Funding." The New York Times. 19 Jan. 2018,
https://www.nytimes.com/2015/02/14/technology/for-tech-start-ups-in-europe-an-oceanic-divide-in-funding.html.
70 Reuters/Ipsos poll. Three-quarters Facebook users as active or more since privacy scandal. May 2018,
https://www.reuters.com/article/us-facebook-privacy-poll/three-quarters-facebook-users-as-active-or-more-since-
privacy-scandal-reuters-ipsos-poll-idUSKBN1I7081.



An effective approach would also pay particular attention to proposed state regulations that threaten to
create a patchwork of regulations that could strangle new businesses and technologies with contradictory
laws and enforcement. When faced with compliance and financial burdens, new technology companies—
and the tax revenue and job creation they produce—tend to move to favorable regulatory environments.
Since technology, by nature, cannot be confined within state borders, these companies are more likely to
choose to operate outside of the United States.

Do any terms used in this document require more precise definitions? Are there suggestions on
how to better define these terms? Are there other terms that would benefit from more precise
definitions? What should those definitions be?

While consumer privacy is an important concern of our legislators and regulators, it should not be
confused with the constitutional right to privacy found in the Bill of Rights’ Third, Fourth, and Fifth
Amendments, which protect us from government intrusions, or even the common law and statutory
protections available when a private actor coercively violates our privacy, say by breaking into our
computer. Although there is a clear legal distinction in the United States, the public debate often conflates
the true privacy rights that protect us from involuntary intrusions by the government and private actors
with proposed privacy policies affecting the data we voluntarily convey to tech platforms.

This conflation has been made worse by the European Union, which has labeled its package of privacy
policies as a fundamental right, even though many of those policies are at odds with the free speech and
economic rights prized by Americans (for example, see the EU’s “Right to Be Forgotten”). The
Administration needs to avoid conflating true privacy rights with proposed privacy policies because
failure to do so can (a) lead to legislation or regulations that unnecessarily increase the very intrusion and
excessive executive power that the Bill of Rights’ privacy protections were aimed against, and (b) cut off
the debate and balancing needed to weigh the benefits of those policies against the harm they can do to
American innovation and leadership in the online ecosystem and to the economic growth and consumer
choice that leadership has spurred.

One of the high-level end-state goals is for the FTC to continue as the Federal consumer privacy
enforcement agency, outside of sectoral exceptions beyond the FTC’s jurisdiction. In order to
achieve the goals laid out in this RFC, would changes need to be made with regard to the FTC’s
resources, processes, and/or statutory authority?

No changes to statutory authority are necessary because consumer data is protected by the Federal
Trade Commission's vigorous enforcement of its data privacy and security standards using the prohibition
against “unfair or deceptive” business practices in Section 5 of the Federal Trade Commission Act, 15
U.S.C. § 45(a). The FTC has already proven to be an effective safeguard against unscrupulous data
practices.71 While some would argue that without formal rulemaking authority the FTC cannot adequately
protect consumers, past examples prove the contrary. FTC enforcement protects against identifiably
harmful practices, not potential future harm.

For example, the FTC’s complaint against Sequoia One alleged that the company sold the personal
information of payday loan applicants to non-lender third parties, and that one of these third parties used the
information to withdraw millions of dollars from consumers’ accounts without their authorization. 72 This is

71 See, e.g., Federal Trade Commission. FTC Staff Report: Self-regulatory Principles for Online Behavioral
Advertising. 2009. https://www.ftc.gov/reports/federal-trade-commission-staff-report-self-regulatory-principles-online-
behavioral; Federal Trade Commission. Privacy Online: Fair Information Practices in the Electronic Marketplace.
2000. http://www.ftc.gov/reports/privacy2000/privacy2000.pdf.
72Federal Trade Commission. FTC Puts An End to Data Broker Operation that Helped Scam More Than $7 Million from
Consumers’ Accounts. 2016. https://www.ftc.gov/news-events/press-releases/2016/11/ftc-puts-end-data-broker-operation-
helped-scam-more-7-million.



just one case in which the FTC has shown a willingness to bring enforcement actions against companies
that sell their analytics products to customers if they know or have reason to know that those customers
will use the products for illegal purposes.

While the FTC’s statutory authority is adequate, it is not yet known whether additional resources will be
needed to provide the agency with the technical ability and expertise it requires. This is something the NTIA
could evaluate. As for changes with regard to process, it could be helpful for the FTC to develop a “test”
or set of guidelines that would determine the need to bring an enforcement action. This could be helpful in
providing efficient protection as the data ecosystem expands with the Internet of Things (IoT). However,
this should only be done after the careful evaluation of public input.

CONCLUSION

To fundamentally address the current privacy concerns about the Internet, we really would need to start
over from scratch. That's because the privacy problems have their roots in decisions made and directions
taken decades ago concerning the Internet's technical structure and the business model that supports
most of the enterprises on the world wide web.

When the Internet was conceived and designed 50 years ago, the goal was to make the flow of data easy
and virtually indiscriminate in both directions – that is, sending and receiving. The Internet privacy
problem arises from the successful achievement of that goal. Contrast that with television and radio,
which have a one-way flow, or traditional telephony, in which only a limited amount of information flows
back to the service provider.

In the 1990s, when the world wide web emerged and made the Internet a household word, people
wondered how the exploding number of websites were going to convert their popularity into profitability
and sustainability. The answer turned out to be, for the most part, selling advertising. It was inevitable that
web sites would sell their competitive advantage – that is, access to user data – to advertisers, which
provided the second necessary component for today's privacy problem. With an open Internet
architecture and a business model driven by user data, it was just a matter of time and growth until
today's controversies erupted.

That said, it is not feasible to start over from scratch. The open, two-way architecture of the Internet is
baked in and it is hard to see how any substantial change would be possible. Business models evolve
slowly rather than abruptly, so an end to websites' reliance on user data-driven advertising is not
something we'll see in the next decade, if ever. With the two big enablers of today's privacy concerns here
to stay, if the United States is to continue its role as a leader in technological innovation and enjoy the
economic prosperity that it creates, we are stuck with the technological ecosystem that we currently have. Trying to
reinvent the wheel through data privacy regulations would make the United States less great and more
like Europe. It is best to proceed with caution and learn from the mistakes and failures of others abroad.

