THE LIMITS IN OPEN CODE: REGULATORY STANDARDS AND THE FUTURE OF THE NET

By Lawrence Lessig

ABSTRACT

This essay considers the effect of the open source software movement on government's ability to regulate the Net. Its claim is that an increase in open source software within the application space of the Internet decreases the government's power to regulate.

This is an essay about standards in the future of the Internet's governance. I begin with a distinction between two types of standards, and then turn to a brief history of how thinking about regulation in cyberspace has evolved. I then draw upon this distinction and this history to raise a question about the future of the Net's regulation. The question concerns the place of open source software in the future of the "application space" of the Internet. My argument is that open source software will make regulating cyberspace more difficult than it otherwise would be.

I. Standards

Distinguish between two sorts of standards: coordinating and regulating. A coordinating standard is a rule that facilitates an activity that otherwise would not exist. A regulating standard restricts behavior within that activity, according to a policy set by the regulators. A coordinating standard can be imposed from the top down, or emerge from the bottom up; a regulating standard is ordinarily imposed only from the top down. Driving on the right side of the road is a coordinating standard. A speed limit is a regulating standard. Coordinating standards limit liberty (drive on the right) to make an activity possible (driving); regulating standards limit liberty within that activity (speeding) to advance a regulatory end (safety or fuel conservation). We understand why an individual would want to deviate from a regulating standard; it is (often) hard to make sense of a desire to deviate from a coordinating standard.

Standards on a computer network are similarly coordinating and regulating. TCP/IP is a coordinating standard -- it is a convention that makes the exchange of information over the Internet possible.1 Space allocation on a network server is a regulating standard -- it limits the storage space assigned to a particular user so that many users can share the same storage resource.
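To make the second half of that distinction concrete, here is a minimal sketch of a regulating standard at work. The quota figure and the storage interface are hypothetical, invented only for illustration; the point is that the rule restricts behavior within an activity that coordinating standards make possible.

```python
# A hypothetical per-user storage quota: a regulating standard.
# The coordinating standards (TCP/IP and the like) make the shared
# server reachable; this rule limits how much of it each user may use.

QUOTA_BYTES = 50 * 1024 * 1024  # illustrative policy choice: 50 MB per user

usage: dict[str, int] = {}  # bytes currently stored, per user

def store(user: str, blob: bytes) -> bool:
    """Accept the write only if it keeps the user within quota."""
    used = usage.get(user, 0)
    if used + len(blob) > QUOTA_BYTES:
        return False  # refused: the regulating standard bites
    usage[user] = used + len(blob)
    return True
```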

Most of the important Internet standards to date have been coordinating standards -- standards such as TCP/IP, FTP, and HTML. The Internet community has demonstrated well its ability to develop and deploy coordinating standards; this is the genius in organizations such as the Internet Engineering Task Force ("IETF").2 But in the future, the most significant debates about standards will be debates about regulating standards -- about standards that allow the government to carry its policy choices into effect, whether or not those choices are the choices of bottom-up organizations like the IETF.

The Net's success with standards in the future, then, depends upon the standards at stake. And its success with coordinating standards will not necessarily entail a similar success with regulatory standards.

II. Regulability

That's the distinction; now the history. It's important that we remark how the debate about the regulation of cyberspace has changed. Three years ago the world was techno-libertarian. Frustrated sorts from our bureaucratic age looked to cyberspace as a place where regulation would not work, and hence as a place where people would be free. "Free" had two senses for these sorts -- first, life in cyberspace was free from any regulation, and second, life there was free from regulation by government. Life in cyberspace, libertarians promised, was unregulated and unregulable. Behavior there was beyond the government's reach.

These were the ideas that defined first-generation thought about cyberspace and law. Law such as copyright was dead, lyricists such as John Perry Barlow sang.3 Law was fundamentally threatened, lawyers such as Post and Johnson warned.4 The Net would be a world where freedom reigned, and in some techno-Marxist way, governments would have no choice but to wither away.

These ideas did not go unchallenged. Rather, there were a few "crazies" around at the time who thought quite differently about regulation on the Net. I met two at a conference at Emory Law School three years ago, where they were busy challenging these then-commonplace ideas about the unregulated life of cyberspace.

One was then an assistant professor from Fordham: Joel Reidenberg. About the claim that life in cyberspace was free -- not regulated at all -- Reidenberg had a very different view. Life in cyberspace, Reidenberg argued, was regulated as any form of life was. This regulation, however, was built into the code.5 This form of regulation he called lex informatica,6 and this lex, he maintained, defines what behavior is possible in cyberspace and what values cyberspace will uphold.7 Whether these are values of anonymity or privacy or free speech or access, it is this law that makes those values possible.

But the lex informatica, he argued, was not a law that was fixed.8 The architectures of cyberspace could be changed. The values that cyberspace embraces could be different. There is no nature to the way that cyberspace is built -- no nature, simply code. This code could be made to be very different from what it currently is. It could be made, that is, to embrace a very different set of values.

The other crazy was Pam Samuelson, then a professor at the University of Pittsburgh. Samuelson challenged the second idea -- that cyberspace could not be regulated by government. For as Samuelson saw it, the law was already threatening an important regulation of life in cyberspace.9 Not directly, of course, but indirectly -- through a series of changes threatened by the Administration's White Paper on Intellectual Property.10 These changes, designed to increase the law's protection for intellectual property, threatened to fundamentally queer the architectures of cyberspace. Laws would have their effect, if only indirectly, by inducing changes in the lex that Reidenberg spoke of.

Time works changes. The views of these two crazies have now become mainstream. Everyone now gets how the architecture of cyberspace is, in effect, a regulator. Everyone now understands that the freedom or control that one knows in cyberspace is a function of its code. Cookies11 mean less privacy; choice about cookies means more privacy. A world without P3P12 is a world with less control over privacy; a world with P3P is a world with more control over privacy. A world with PICS13 is a world where speech is less free; a world without PICS is, well, let's say, nice.14 The differences in these worlds are differences in the code of these worlds. Different code, different regulation, different worlds.

And so too do most now see how government might have a role in this regulation. Smart governments will regulate, but not by directly regulating the behavior of people in cyberspace. Smart governments will instead regulate by regulating the code that regulates the behavior of people in cyberspace. Cyberspace's code will become the target of regulation.15 The future will be littered with examples of government intervening to ensure that cyberspace is architected in a way that protects government's interests. Whether those interests are interests against copyright management circumvention16 or interests in favor of encryption control,17 the government will increasingly see that the most efficient target of regulation is not people but binary code. Enslave the code while telling the world that you are leaving the space free18 -- this is the formula for taming the liberty that cyberspace now provides.

Two important conclusions follow from the arguments of these two crazies. First, if code is a kind of law, then we should focus, as we do with real-space law, on the freedoms and the constraints built into this code, and on how these freedoms and constraints are changing. And second, if governments regulate code, then we should think about the limits that should constrain government's power to regulate. For our constitutional tradition is one which limits governmental power by limiting government's direct legislative action; yet the future of the government's regulation of the Net is a future where government regulates by indirect legislative action. Constitutional values should constrain both indirect and direct regulation; so far it is not clear that they do.19

III. Limits on Regulability

That's the history; now something about the future. I want to focus on a new wrinkle in this debate about regulating cyberspace. We are just beginning to understand it, yet it may become the most important question about the future of cyberspace that we have yet seen.

You might think it follows from the commonplace views of our day -- from those views once held by the crazies only, but now considered mainstream by most -- that government is capable of effectively regulating the Net. If government can regulate the code, then government can require codewriters to build the standards that the government needs into the code. The future of regulatory standards under this view, then, would simply be a future where the government tells codewriters how to architect their code so as to incorporate governmental regulatory standards.

But in fact, the story is interestingly more complicated. This power of government depends upon a feature of the code -- application space code20 -- that has only recently become salient. That feature is its ownership. Whether government can regulate code depends in part upon who controls that code. If the code is closed -- controlled by private for-profit organizations -- then government's power is assured. But if the code is open -- outside the control of any particular private for-profit organization -- then the government's power is threatened. The more application space code is open code, the less government can regulate that code.

The reason is straightforward. Open code is software in plain view. It is software that comes bundled with its source code as well as its object code. Object code is the code that the computer reads; if you display it on your machine, it will appear as gibberish. Source code, by contrast, is code that programmers can read.21 It is this code that allows a programmer to open an open source software project and see what makes it tick. And because anyone can see what makes it tick, open source software makes transparent any control that the code might carry. For example, if the code carries a government-mandated encryption routine, that routine will be apparent to open source coders. And because it is apparent, open source coders can then choose whether or not to adopt that portion of an open code project. For by its nature, and by the promises bundled with it in the form of licenses,22 any open code software project remains available for adopters to modify or improve, however the adopters think best.
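A minimal sketch may make the transparency point concrete. Everything here is hypothetical -- the module, the stand-in "encryption," and the escrow key are invented for illustration -- but in an open source project, this is roughly what an adopter reading the shipped source would see:

```python
# hypothetical_cipher.py -- an invented module, for illustration only.
# In open code the source ships with the program, so a mandated
# "extra" behavior like the escrow step below sits in plain view.

from hashlib import sha256

GOVERNMENT_ESCROW_KEY = b"agency-public-key"  # hypothetical mandate

def encrypt(message: bytes, user_key: bytes) -> bytes:
    # Stand-in "encryption": a keyed digest, not a real cipher.
    ciphertext = sha256(user_key + message).digest()
    # A reader of this source can see that a second, escrowed copy is
    # produced -- exactly the sort of control the text describes.
    escrow_copy = sha256(GOVERNMENT_ESCROW_KEY + message).digest()
    return ciphertext + escrow_copy
```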

Closed code functions differently. It does not come bundled with its source, which means that its code is hidden under a hood that won't open.23 Thus adopters or users of closed code cannot as easily detect what makes closed code tick. They can't as easily see whether it carries within it a given encryption routine or systems for collecting private data or technologies for monitoring and reporting usage. Clever adopters can try to work it out through reverse engineering24 or hacking. But no matter how clever the adopter, closed code will be harder to monitor, and harder to change than open code. An adopter of open source code who doesn't like a module can simply substitute another; an adopter of closed code has no equivalently simple choice.25
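And the "simply substitute another" point can be sketched the same way. Continuing the hypothetical above, an adopter who dislikes the escrow step needs no one's permission; a drop-in replacement with the same interface, and one changed import, is enough:

```python
# drop_in_cipher.py -- a hypothetical replacement module.
# An adopter of open code can swap out the disliked component and
# leave the rest of the program untouched.

from hashlib import sha256

def encrypt(message: bytes, user_key: bytes) -> bytes:
    # Same interface as hypothetical_cipher.encrypt, but no escrow copy.
    return sha256(user_key + message).digest()

# Elsewhere in the program, the only change required:
#     import hypothetical_cipher as cipher   # vendor's version
# becomes
#     import drop_in_cipher as cipher        # adopter's version
```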

This difference is critical to the question of regulability. For if the application space is built with closed code, then the ability of adopters to change that code is less than it would be if the application space were composed of open code. And if it is harder for adopters to change code, then it is easier for governments to regulate through that code. Say the government has a standard it wishes to impose on some aspect of the application space. To the extent the regulatory standard gets imposed on closed code, it is more likely to be adopted by users than the same regulation imposed on open code. If adopters don't like the regulatory standard (which, given the nature of many regulatory standards, is not unlikely), they can more easily swap out the regulated code if they use open code than if they use closed code.

An example offered by Peter Harter at this conference makes the point well. Netscape has turned its code for Netscape Communicator over to a version of the open source software movement. Its code is controlled by an organization called Mozilla, but its source is open. When Mozilla releases a new version, adopters around the world are permitted to download the source code, and adopt it or modify it as they wish.26

The French government didn't get this idea. They wanted Netscape to modify the SSL standard27 to enable decryption of SSL transactions, and so they asked Netscape to implement that change. But as Netscape reportedly told the French, there is really very little that Netscape can do to enable the cracking of SSL, and it is easy to see why.28 Even if Netscape built a French version of SSL, enabling the French government to spy whenever it wished, whether that version got used would depend upon whether it was adopted. And even if Netscape put the French SSL version into the code of Netscape Communicator, there is no reason to expect that adopters of the code wouldn't simply substitute a different version of SSL for the French spy-enabled version. Whether the SSL code is adopted is a decision that rests with the users, not with Netscape.
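The underlying mechanics survive in today's tooling, and a minimal sketch with Python's standard ssl module shows them (the host name is a placeholder): whatever defaults a vendor ships, it is the adopter who configures which protocol versions and cipher suites a connection will accept.

```python
import socket
import ssl

# The adopter, not the vendor, decides what cryptography to accept.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older protocols
ctx.set_ciphers("ECDHE+AESGCM")               # refuse weak cipher suites

with socket.create_connection(("example.org", 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname="example.org") as tls:
        print("negotiated:", tls.version(), tls.cipher()[0])
```

A vendor could ship weaker defaults, but an adopter of open code can rebuild or reconfigure them away -- which is the point of the Netscape story.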

Harter's example is an instance of my more general point: to the extent that code remains open, it is harder for government to regulate; to the extent it is closed, it is easier. Had the French demanded a change in a part of Netscape's code before Netscape had given its code to Mozilla, it would have been much harder for adopters to identify and disable that code. But once the code is in the commons, government's power is less. Thus my point: the regulability of the application space turns in part on whether the application space is open.

That's the claim, but it requires some qualifications.

First, my argument turns upon the nature of the "application space" code. This is not the distinction between operating systems and applications, but rather the distinction between the basic Internet protocols and the applications (or "ends") that depend upon or use those protocols. It is the design philosophy of the Net to keep the protocols simple and general, and to build sophistication and complexity into the ends.29 It is possible to imagine the government trying to regulate the Internet's basic protocols. But because these are coordinating standards that exert very little substantive control over the content of the Net, they are unlikely to be the source of any powerful or significant regulation. Regulation, or regulatory standards, if they are to be effective, would have to be embedded in the application space code.
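This design philosophy -- often called the end-to-end principle -- can be seen in miniature with Python's socket API. The transport below merely ferries opaque bytes; the application-level rule (a trivial uppercasing rule, invented for illustration) lives entirely at the end point, which is where any regulating standard would likewise have to live.

```python
import socket

def serve_once(port: int = 9000) -> None:
    """A minimal "end": the network below it only ferries bytes."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(4096)      # transport: opaque bytes in
            conn.sendall(data.upper())  # the "end" supplies the meaning

# Run serve_once() and connect with any TCP client to exercise it.
```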

Second, my argument is not that a world with open code, or mostly open code, couldn't be regulated. In my view, there could be relatively small shifts in the architecture of the Net -- in the functionality built into the application space -- that would fundamentally enable state regulation, even if that application space were open code.30 If the Internet became "certificate rich" -- meaning that many people carried and used digital certificates31 while "on" the Net -- local government's power to regulate the Net could fundamentally increase, whether or not the basic certificate architectures were open or closed code.
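To see what a certificate makes visible, here is a minimal sketch using Python's standard ssl module (the host name is a placeholder; a client certificate would work symmetrically, binding attributes to a person). The attributes a certificate carries -- identity, organization, jurisdiction -- are exactly the hooks a regulator could key rules to, whether the surrounding code is open or closed.

```python
import socket
import ssl

# Fetch and inspect the certificate a server presents during the
# TLS handshake.
ctx = ssl.create_default_context()
with socket.create_connection(("example.org", 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname="example.org") as tls:
        cert = tls.getpeercert()  # parsed X.509 fields as a dict
        print(cert["subject"])    # who the certificate says this is
        print(cert["issuer"])     # which authority vouches for it
```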

Third, my argument is also not that a world with more closed code is always a world that is more regulable. Some closed code would not affect the Net's regulability. It matters little whether solitaire programs or certain utility programs are open or closed code, for there is little connection between them and any regulation the government might impose. (So long, that is, as they are as they say they are.) Thus the point about regulability is not a point about necessity; it is instead a point about possibility.

And finally, following from the third: my argument is not a criticism of closed code in general. I don't believe that the best possible world is one where all code is open, any more than I believe that the best possible real world is one where all property is public or part of the commons. There is a mix between open and closed spaces in real space, and there should be a similar mix between open and closed spaces in cyberspace. The only enemy is the extremes -- either a world that was perfectly propertized (either completely, or selectively if selected well), or a world that permitted no closed development. Whatever economic model might support projects like the GNU/Linux OS,32 there is no reason to believe the same model would work for every coding project.33

* * * * *

To many in the open code movement, this whole argument about the values in open source software might seem quite odd. To them, the real issue with open source software is its power. Its real virtue is its amazing efficiency -- its robustness and reliability. And no doubt, if these are its virtues, they are valuable indeed.

But my point is not to question any claim about efficiency. My point is simply that there are other issues at stake as well.34 The architecture of cyberspace embeds a set of values, as it embeds or constitutes the possible. But beyond the values built into this architecture, there are values that are implicated by the ownership of code. Its ownership can enable a kind of check on government's power -- a separation of powers that limits how far government can reach. Just as our Constitution embeds the values of the Bill of Rights while also embedding the protections of separation of powers,35 so too should we think about the values that cyberspace embeds, as well as its structure.

However efficient open code may be, arguments about open source must also consider the questions that these values raise. For in my view, it makes as much sense to promote open source on efficiency grounds alone as it does to promote democracy on grounds of economic wealth alone. It may well be that democracies are wealthier than other forms of government, just as it may well be that open source software is more robust than other software. But it is a thin conception of value that would see wealth or efficiency as the only, or the most important, value at stake.