Search, Speech, and Secrecy: Corporate Strategies for Inverting Net Neutrality Debates

Bottlenecks at any layer of the Internet—physical, social, application, or content[1]—create opportunities for the exercise of undue power over the flow of information and ideas online. Corporate forces menace both user privacy and free expression on the Internet. Market concentration lets powerful business leaders develop unprecedented digital dossiers on users. Such concentration also allows leading companies to pervasively shape culture and politics, elevating some voices and silencing others.

The privacy and First Amendment cases for net neutrality are compelling.[2] But net neutrality’s opponents are inverting the debate by asserting their own rights to “free speech” and a form of corporate privacy: trade secrecy. While Christopher Yoo’s theory of “architectural censorship”[3] has not yet vindicated Internet Service Providers’ (ISPs) claims to expansive First Amendment rights, ISPs’ business alliances with search engines and other internet companies that monitor content may make their decisions sufficiently “speech-like” to attract protection from the current Supreme Court, thereby insulating ISP decisions from regulation and scrutiny. Classifying network management decisions as “trade secrets” could also hamper public scrutiny of and regulatory attention to ISPs’ actions. Corporate trade secrecy privileges would make it difficult for consumers to determine if their privacy rights have been violated. Corporate First Amendment protections would likely foil lawsuits (and even some regulation) designed to promote the public interest by ensuring fair, open, and neutral ordering of data flows online.

Legal argument may stop this inversion. First Amendment and trade secrecy doctrines, as applied to ISPs and search engines, are vague, and there may be openings to establish new legal doctrines on this front that would promote the free flow of information online. But given Google’s success in advancing its legal interests,[4] the recent collaboration between Google and Verizon in developing a “legislative framework” for network neutrality,[5] and the present composition of the Roberts Court, consumer advocates who care about individuals’ rights to privacy and free expression should start moving beyond the legal system to develop more transparent and open alternatives to increasingly unregulable networks of dominant online intermediaries.

I. Hidden Maps of the Internet

As British Petroleum’s (BP) massive oil spill dominated cable news in the summer of 2010, Internet companies presented users with many different perspectives on BP. Because search results are personalized, searches for “BP” on Google, for example, led some users to links about investment opportunities in the company, while leading others to fierce denunciations of BP’s environmental track record. Advancing personalization of search results accelerates a shift from “broadcast” to “narrowcast” content. It also foreshadows more troubling trends in the digital public sphere.

Internet intermediaries govern online life. ISPs and search engines are particularly central to online ecology.[6] Users rely on search services to map the web—and the world—and use ISPs to connect to one another. Companies like Google, Verizon, and Comcast frequently portray their services as efficient, scientific, and neutral methods of depicting and transporting communication and information. Yet metaphors of transport and mapping are inadequate and even misleading in the case of online information flows. Though ISPs might describe their services as essentially moving objects from one place to another, delivering bits and ranking websites is in fact far more complex than getting from Point A to Point B. That complexity can often allow Internet intermediaries to conceal their methods and operations from public scrutiny.

Governments are beginning to realize the role they must play in helping the public to understand information flows online. Google’s secrecy about its website-ranking algorithm has provoked investigations in Europe,[7] and the New York Times editorial page recently called for similar scrutiny in the U.S.[8] The Federal Communications Commission (FCC) found that Comcast secretly blocked customers’ access to legal websites.[9] The agency did not discover the problem itself: a dogged engineer and investigative reporters spent weeks sleuthing to uncover it.[10] The average customer is not capable of detecting such manipulation.

It is not just searchers whose queries might be arbitrarily and unknowably limited. The stakes are even higher for those who want to be found online. Search engines are referees in the millions of contests for attention that take place on the web each day. There are dozens of entities that want to be the top result in response to a query like “sneakers,” “best Thai restaurant,” or “florist.” Ranking first rather than twentieth can mean the difference between lucrative business and obscurity. The top and right-hand sides of many search engine pages are open for paid placement, but even there the highest bidder may not get a prime spot, because secret auctions often determine the price and prominence of paid ads. “Search engine optimizers” can sometimes raise a site’s ranking, but if they get too aggressive, they can provoke the storied “Google death penalty” for their clients, which can result in a company’s complete removal from the search engine index.[11] Google gives rough guidance on what constitutes legitimate search engine optimization, but claims it cannot be very specific lest it encourage “gaming” of its algorithms.[12] Many companies claim they have been inexplicably “disappeared” from search results.[13]

Of course, the situation in the United States is not nearly as bad as that in China, where leading search firm Baidu has bluntly told firms to either buy ads or disappear from unpaid results.[14] American consumers would probably punish that kind of blocking: When an Internet connection is dropped, or a search engine fails to produce a result the searcher knows exists somewhere on the web, such failures are obvious. However, most web experiences do not unfold in such a binary, pass-fail manner. An ISP or search engine can slow down the speed or reduce the ranking of a website in ways that are very hard for users to detect. And if a user were to question a suspicious practice, ISPs might claim First Amendment rights to order information in whatever way they deem fit, and invoke trade secrecy protections to block scrutiny of those decisions.[15]

The power of Internet intermediaries implicates several traditional concerns of the American legal system.[16] Businesses have sued both Google and ISPs on the basis of consumer protection and competition law.[17] But in cases involving business tort claims, Google has already won twice, claiming that its rankings are merely an “opinion” about the relevance of websites to a given query.[18] Harvard Law Professor Laurence Tribe has advanced a theory similar to Google’s about the activities of broadband providers. Tribe has characterized the “constitution’s enduring value” as “protecting all private groups, large and small, from government,”[19] and claims broadband providers’ network management practices deserve just as much protection from government interference as citizen speech.[20]

First Amendment law is not yet that solicitous of network operators, permitting the FCC to impose many obligations on them that it could not impose on ordinary speakers.[21] Nevertheless, as network operators increasingly integrate their services with smaller entities—such as search engines, blogs, and other content providers—that could be accorded broader First Amendment rights, Tribe’s reasoning may tempt the courts to overturn net neutrality and other regulations that may be imposed on carriers down the road.

II. The Challenges of Identifying Abuses: Trade Secrecy, State Secrecy, and Online Reputation

Americans make a Faustian bargain online: In return for faster access to information, we lose the ability to understand how that information is ordered and sorted online. This is not only a problem for companies that cannot determine how their search engine rankings are created. Nearly everyone’s digital persona is sliced, diced, and redefined on the basis of Internet data and clandestine sorting systems, and we have little sense of how marketers, law enforcement officials, or other decision makers are using the data endlessly gathered and repackaged online. When we try to protect our privacy and reputation by claiming some right to understand how online profiles are created and shared, we will likely find corporations countering that their own privacy protections—the propertized confidences of trade secrecy—should prevent public scrutiny of the algorithms they use to generate online reputations.

In popular books like Ian Ayres’s Super Crunchers[22] and Stephen Baker’s The Numerati,[23] data-driven decision making is celebrated as a cornerstone of future advances in productivity. However, the individual who is the object of such number-crunching may fear being misclassified, misunderstood, or worse. Internet intermediaries not only monitor and track what users do, but also generate reputations based on that data. While employment and credit-reporting law regulates some entities that generate and rely on background checks,[24] no similar restrictions apply to many sources of online data. Even sensitive health data has been “scraped,” or copied, by research firms when consumers use online forums.[25] Unaccountable data exchanges ensure that information from myriad online sources can be funneled into the databases of marketers and law enforcement officials.

Warrantless wiretapping is just the tip of an iceberg of new domestic intelligence programs that rely on private companies to act as “big brother’s little helpers.” As the national surveillance state grows, entities called “fusion centers” have collapsed traditional distinctions between government and corporate surveillance by inventive procurement schemes and co-location of law enforcement officials with private sector employees. As the American Civil Liberties Union has documented,[26] inadequate oversight has led to significant abuses of civil liberties by a shadowy domestic intelligence apparatus. Yet fusion centers’ practices and priorities remain hidden from traditional agents of accountability.[27] A new and powerful combination of trade secrecy and state secrecy makes it very difficult for ordinary citizens to understand how they are being profiled. If left undetectable, abuses can never be deterred.

III. Toward a Possible Solution: Lessons from Finance and National Security Policy

David Brin’s book The Transparent Society argues that digital storage, now used to erode individual privacy, could be turned on the “watchers” by the “watched.”[28] Brin offers a promising vision for enhancing governmental accountability, but its applicability to the private sector is limited. Trade secrets are treated like property in many jurisdictions. Mandatory disclosure of such secrets can be deemed a taking, requiring fair compensation.

Fortunately, we do not face an impossible choice between full disclosure and absolute secrecy. With the right tools, a private publication—or “privication,” as Jonathan Zittrain calls it[29]—can make data available to a limited number of people. There are already models that can serve as useful starting points. For example, U.S. national security law has promoted “qualified transparency” with the Foreign Intelligence Surveillance Court, which hears petitions from the U.S. Attorney General seeking authorization to conduct certain forms of surveillance.[30] Though the court has been criticized as a rubber stamp, it at least requires an independent actor to understand and approve surveillance actions.

Another model for increased Internet scrutiny might be found in the U.S. Treasury Department, which is now establishing an Office of Financial Research that some have called the “CIA of Finance.” Like intelligence agencies that have broad investigative powers to spot terrorist threats, the Office of Financial Research will collect and analyze details of financial transactions in order to spot “systemic risk” (e.g., debts or bets that would menace the entire financial system if defaults were to occur).[31]

On this model, Congress and regulators should consider establishing similar offices to record and analyze the decisions of dominant Internet intermediaries. I have called for an independent agency to research and issue reports on suspect practices at search engines and carriers.[32] New laws and technologies of accountability are needed to record the snooping done by the powerful[33] and subject their surveillance itself to scrutiny.[34] I look forward to seeing ongoing efforts to promote privacy via design principles[35] and interdisciplinary engagement in this field.[36]


[1]Jonathan Zittrain, The Future of the Internet and How To Stop It 67 (2008) (describing a “‘physical layer,’ the actual wires or airwaves over which data will flow;” an “‘application layer,’ representing the tasks people might want to perform on the network;” a “‘content layer,’ containing actual information exchanged among the network’s users;” and a “‘social layer,’ where new behaviors and interactions among people are enabled by the technologies underneath”).

[2]For a discussion of the variety of policies that come under the heading of net neutrality, see Frank Pasquale, Beyond Innovation and Competition: The Need for Qualified Transparency in Internet Intermediaries, 104 Nw. U. L. Rev. 105 (2010), available at For a discussion of the First Amendment case for network neutrality, see Daniel Solove, Bright Ideas: Nunziato on Virtual Freedom: Net Neutrality and Free Speech in the Internet Age, Concurring Opinions (May 3, 2010, 9:13 AM),… (“[B]roadband providers and wireless carriers should be prohibited from discriminating against speech on the basis of viewpoint or content. Just as telecommunications providers and the postal service have long been regulated as ‘common carriers’ and prohibited from engaging in content discrimination, so too should broadband providers be prohibited from discriminating against content in serving as communications conduits.”). See also Dawn Nunziato, Virtual Freedom: Net Neutrality and Free Speech in the Internet Age 150 (2009) (discussing Google’s and Internet Service Providers’ unfavorable treatment of several groups).

[3]Christopher Yoo, Architectural Censorship and the FCC, 78 S. Cal. L. Rev. 669 (2005).

[4]See Langdon v. Google, Inc., 474 F. Supp. 2d 622, 630 (D. Del. 2007) (holding that injunctive relief sought by plaintiff contravenes defendants’ First Amendment rights), available at; Search King, Inc. v. Google Tech., Inc., No. CIV-02-1457-M, 2003 WL 21464568, at *4 (W.D. Okla. May 27, 2003) (finding that there is no conceivable way to prove that the relative significance assigned to a given web site is false and, accordingly, concluding that Google’s Page Ranks are entitled to full constitutional protection).

[5]See Frank Pasquale, Rethinking Net Neutrality After the Verizon/Google Framework, Concurring Opinions (Aug. 10, 2010, 11:21 AM),… (noting the companies’ CEOs have stated that, in their view, “[a] provider that offers a broadband Internet access service complying with [basic net neutrality] principles” should be able to “offer any other additional or differentiated services” free of net neutrality regulation).

[6]For those interested in background on the similarities and differences between ISP and search engine influences, see Pasquale, supra note 2, at 107-115.

[7]See, e.g., Richard Waters, Unrest Over Google’s Secret Formula, Fin. Times, July 12, 2010, at 22, available at,dwp_uuid=… (“Prompted by three complaints, the European Commission this year began an informal investigation, the first time that regulators have pried into the inner workings of the technology that lies at the heart of Google.”).

[8]Editorial, The Google Algorithm, N.Y. Times, July 15, 2010, at 30, available at

[9]In particular, Comcast slowed access to peer-to-peer protocols (such as BitTorrent) used to share large files. In re Formal Complaint of Free Press & Public Knowledge Against Comcast Corporation for Secretly Degrading Peer-to-Peer Applications, 23 F.C.C. Rcd. 13,028 (2008) [hereinafter FCC Comcast Decision] (memorandum opinion and order). The Court of Appeals for the D.C. Circuit later found that the Commission lacked authority to regulate an ISP’s network management practices. Comcast Corp. v. FCC, 600 F.3d 642, 644 (D.C. Cir. 2010), available at

[10]Daniel Roth, The Dark Lord of Broadband Tries To Fix Comcast’s Image, Wired, Feb. 2009, at 54, available at (“Comcast appeared to be blocking file-sharing applications by creating fake data packets that interfered with trading sessions. The packets were cleverly disguised to look as if they were coming from the user, not the ISP.”).

[11]See Frank Pasquale, The Troubling Trend Toward Trade Secrecy in Rankings and Ratings, in The Law and Theory of Trade Secrecy: A Handbook of Contemporary Research (Rochelle C. Dreyfuss & Katherine J. Strandburg eds.) (forthcoming 2010) (manuscript at 4-10) (on file with author) (describing Google’s practice of disfavoring aggressive search optimizers).

[12]Id. at 6.

[13]Id. at 11-12.

[14]Id. at 4.

[15]See generally id. (summarizing search engines’ trade secrecy arguments).

[16]See, e.g., Ellen P. Goodman, Stealth Marketing and Editorial Integrity, 85 Tex. L. Rev. 83, 89 (2006) (describing the relationship of digital “stealth marketing” to traditional advertising law).

[17]See Complaint, LLC v. Google, Inc., 693 F. Supp. 2d 370 (S.D.N.Y. 2010) (No. 09 Civ. 1400). TradeComet sued Google for alleged violations of the antitrust laws. The suit was dismissed on the basis of a forum selection clause. See LLC v. Google, Inc., 693 F. Supp. 2d 370 (S.D.N.Y. 2010), available at

[18]See Langdon v. Google, Inc., 474 F. Supp. 2d 622, 630 (D. Del. 2007) (finding that injunctive relief sought by plaintiff contravened Google’s First Amendment right of editorial discretion to choose what it did or did not want to say), available at; Search King, Inc. v. Google Tech., Inc., No. CIV-02-1457-M, 2003 WL 21464568, at *4 (W.D. Okla. May 27, 2003).

[19]Laurence Tribe, The Constitution in Cyberspace: Law and Liberty Beyond the Electronic Frontier, Electronic Privacy Information Center (1991), (“[N]othing about any new technology suddenly erases the Constitution’s enduring value of restraining *government* above all else, and of protecting all private groups, large and small, from government.”).

[20]See Frank Pasquale, Larry Tribe’s Lochner?, Concurring Opinions (Aug. 28, 2007, 9:24 PM),; Tribe, supra note 19 (“There are circumstances, of course, when non-governmental bodies like privately owned ‘company towns’ or even huge shopping malls should be subjected to legislative and administrative controls by democratically accountable entities, or even to judicial controls as though they were arms of the state—but … [i]t’s a fallacy to suppose that, just because a computer bulletin board or network or gateway is *something like* a shopping mall, government has as much constitutional duty—or even authority—to guarantee open public access to such a network as it has to guarantee open public access to a privately owned shopping center like the one involved in the U.S. Supreme Court’s famous *PruneYard Shopping Center* decision of 1980 … .”).

[21]See Frank Pasquale, Asterisk Revisited: Debating a Right of Reply on Search Results, 3 J. Bus. & Tech. L. 61 (2008), available at….

[22]Ian Ayres, Super Crunchers: Why Thinking-by-Numbers Is the New Way To Be Smart 10 (2007).

[23]Stephen Baker, The Numerati (2009).

[24]Robert Sprague, Googling Job Applicants: Incorporating Personal Information into Hiring Decisions, 23 Lab. Law. 19, 38 (2007) (explaining how Internet searches allow prospective employers to discover candidate information that would be unavailable through traditional prescreening media); Robert Sprague, Rethinking Information Privacy in an Age of Online Transparency, 25 Hofstra Lab. & Emp. L.J. 395, 399 (2008) (stating that the Internet provides employers with a compelling alternative to traditional prescreening techniques which are restricted by various laws); Thomas F. Holt, Jr. & Mark D. Pomfret, Finding the Right Fit: The Latest Tool for Employers, Metro. Corp. Couns., Nov. 1, 2006, at 29, available at… (discussing the legal implications of using Internet searches as a tool for screening job applicants).

[25]See Frank Pasquale, The Health Privacy Paradigm Shift: From Consent to Reciprocal Transparency, Concurring Opinions (Oct. 26, 2010, 8:24 PM),….

[26]Michael German & Jay Stanley, ACLU, Fusion Center Update (2008),; Michael German & Jay Stanley, ACLU, What’s Wrong With Fusion Centers? (2007),

[27]Danielle Keats Citron & Frank Pasquale, Network Accountability for the Domestic Intelligence Apparatus, 61 Hastings L.J. (forthcoming 2011) (manuscript at 9), available at

[28]David Brin, The Transparent Society: Will Technology Force Us To Choose Between Privacy and Freedom? 333 (1999).

[29]Zittrain, supra note 1.

[30]The Foreign Intelligence Surveillance Court (FISC or FISA Court) is a secret court that consists of eleven district court judges, at least three of whom must live within twenty miles of the District of Columbia. See 50 U.S.C. § 1803(a) (Supp. I 2009); see also Oren Bracha & Frank Pasquale, Federal Search Commission? Access, Fairness, and Accountability in the Law of Search, 93 Cornell L. Rev. 1149, 1204 (2008) (advocating a regulatory institution akin to the FISA Court).

[31]Dodd-Frank Wall Street Reform and Consumer Protection Act, Pub. L. No. 111-203, §§ 111-12, 124 Stat. 1376, 1392-98 (2010), available at (describing the “Financial Stability Oversight Council,” also frequently referred to as the “Systemic Risk Council”). The Office of Financial Research promises one benefit to future historians and risk analysts: permanent archives of financial decision making. Records that could easily be lost during mergers, takeovers, and IT-system updates would be archived for a reasonable period of time.

[32]Frank Pasquale, Beyond Innovation and Competition: The Need for Qualified Transparency in Internet Intermediaries, 104 Nw. U. L. Rev. 105 (2010).

[33]See Jeff Jonas, Immutable Audit Logs, Jeff Jonas (Feb. 9, 2006, 9:25 PM),

[34]See Brin, supra note 28, at 52-53.

[35]See Julie Smith David & Marilyn Prosch, Extending the Value Chain To Incorporate Privacy by Design Principles, Identity In the Information Society (May 18, 2010), available at

[36]See, e.g., Profiling the European Citizen: Cross-Disciplinary Perspectives (Mireille Hildebrandt & Serge Gutwirth eds., 2008), available at….