© 1999 Michael Lee, Sean Pak, Tae Kim, David Lee, Aaron Schapiro, and Tamer Francis.

Michael Lee, Sean Pak, Tae Kim, Aaron Schapiro, and Tamer Francis are third-year law students at Harvard. David Lee is an associate with the law firm of Shearman & Sterling.

This comment was the second-place winner of the 1998 Berkeley Technology Law Journal Comment Competition.

1. Chris O'Malley, Information Warriors of the 609th (The Air Force's 609th Information Warfare Squadron), POPULAR SCIENCE, July 1997, at 74.

2. The Mentor, The Mentor's Last Words (visited Apr. 16, 1999) <http://insane.bloodline.com/mentor.html>.

3. For example, the theory of friction-free markets once posited that Bertrand competition would necessitate pure price competition on the Internet, such that Internet markets would have lower prices than real space markets. See Joseph Bailey & Erik Brynjolfsson, In Search of "Friction-Free Markets": An Exploratory Analysis of Prices for Books, CDs and Software Sold on the Internet, at 3-5 (1998) (unpublished manuscript) (on file with authors).

4. Although Internet-based commerce is the most visible form of electronic commerce, the former is clearly a subset of the latter. As used in this paper, the term "electronic commerce" encompasses all commercial transactions involving the exchange of "bits" as opposed to "atoms."

5. See Commerce Department to Measure Online Sales' Impact (visited Feb. 5, 1999) <http://www.internetnews.com/ec-news/article/0,1087,archive_4_65111,00.html>. There exists a wide variation in estimates of online shopping due to differences in terminology and methodology. See Maryann Jones Thompson, Spotlight: Why E-commerce Forecasters Don't Get It "Right," THE INDUSTRY STANDARD, Mar. 1, 1999, available at <http://www.thestandard.com/metrics/display/0,1283,850,00.html>.

6. See Thompson, supra note 5.

7. See Peter J. Denning, Electronic Commerce, in INTERNET BESIEGED: COUNTERING CYBERSPACE SCOFFLAWS 385-86 (Dorothy E. Denning and Peter J. Denning eds., 1997). Denning argues that cyberspace "trust" would be less difficult to establish with reliable authentication technology, i.e., that current cyberspace code precludes a social norm of "trust." According to Denning, "If human coordination, rather than information exchange, had been at the center of attention of protocol designers, it would be exceedingly difficult today to spoof an e-mail or Internet address or to forge a signature on a document." Id. Indeed, building trust online is the focus of many Internet-related companies and consultancies. See Maryann Jones Thompson, E-commerce Spotlight: Building Trust Online, THE INDUSTRY STANDARD, Jan. 25, 1999, available at <http://www.thestandard.com/metrics/display/0,1283,829,00.html>.

8. Critics have argued that electronic commerce on the Internet reduces overall transaction costs (e.g., search, negotiation, and delivery costs) and facilitates connectivity so as to eliminate considerations of real space time and distance. See, e.g., Denning, supra note 7, at 377-78.

9. According to a 1997 Robertson & Co. report, the total number of U.S. Internet users is expected to reach 102 million by the year 2000. See ComputerWorld, Commerce by Numbers (visited Apr. 9, 1999) <http://www.computerworld.com/home/Emmerce.nsf/All/pop>.

10. In a 1999 national survey conducted by NetZero, more than 53 percent of the respondents cited "privacy and security" as their biggest concerns regarding online shopping. See Beth Cox, Security, Privacy Remain Top Consumer Concerns (visited Apr. 9, 1999) <http://www.internetnews.com/ec-news/article/0,1087,4_95031,00.html>.

11. See Greta Mittner, E-commerce Companies Rejoice, RED HERRING, Jan. 4, 1999, available at <http://www.redherring.com/insider/1999/0104/news-shopping.html>.

12. See discussion infra Part I.B.

13. See discussion infra Part I.

14. These critics can be characterized either as optimists or pessimists, depending on how one views the broader implications of the advent of electronic commerce. See discussion infra Part V.C.

15. See, e.g., David G. Post, Anarchy, State and the Internet: An Essay on Law-Making in Cyberspace, 1995 J. ONLINE L. ART. 3 (visited Jan. 20, 1998), available at <http://www.law.cornell.edu/jol/post.html>. See also David R. Johnson & David Post, Law and Borders-The Rise of Law in Cyberspace, 48 STAN. L. REV. 1367 (1996).

16. For example, Jeffrey Schiller, a computer security expert at M.I.T., claims that encryption technology such as PGP ("Pretty Good Privacy") can provide security against most hacking attacks and that "at this early stage, the insecurity of the Internet is primarily a result of human error and lack of user security education initiatives." Catherine Therese Clarke, From CrimINet to Cyber-Perp: Toward an Inclusive Approach to Policing the Evolving Criminal Mens Rea on the Internet, 75 OR. L. REV. 191, 231-32 (1996) (internal quotations omitted).

17. According to Lessig:

We live life subject to the code [in cyberspace], as we live life subject to nature. Just as we do not choose whether to see through a wall or not, we don't choose whether to enter America Online without giving our password. Superman might choose whether to see through a wall; and hackers might be able to choose whether to enter AOL with a password. But we are neither supermen nor hackers (if such a distinction exists). We live life subject to the constraints of the code; however (and by whomever) these constraints have been set.

Lawrence Lessig, The Constitution of Code: Limitations on Choice-Based Critiques of Cyberspace Regulation, 5 COMMLAW CONSPECTUS 181, 184 (1997) [hereinafter Constitution of Code]. In defense of his claim that code-based solutions for regulating cyberspace are effective despite hacking, Lessig has further stated:

But from the fact that 'hackers could break any security system,' it no more follows that security systems are irrelevant than it follows from the fact that 'a locksmith can pick any lock' that locks are irrelevant. Locks, like security systems on computers, will be quite effective, even if there are norm-oblivious sorts who can break them.

Lawrence Lessig, Reading the Constitution in Cyberspace, 45 EMORY L.J. 869, 896 n.80 (1996) [hereinafter Constitution in Cyberspace]. Admittedly, Lessig does not claim that hackers do not pose any threat to electronic commerce. Rather, his discussion of hackers is limited to their effect on the long-term architectural development of the Internet, apart from their role in electronic commerce.

18. See David L. Gripman, The Doors are Locked but the Thieves and Vandals are Still Getting In: A Proposal in Tort to Alleviate Corporate America's Cyber-Crime Problem, 16 J. MARSHALL J. COMPUTER & INFO. L. 167, 169-70 (1997).

19. See Marc D. Goodman, Why the Police Don't Care About Computer Crime, 10 HARV. J.L. & TECH. 465, 472 (1997).

20. 928 F.2d 504, 505-07 (2d Cir. 1991).

21. See Gripman, supra note 18, at 171.

22. See Marc S. Friedman & Kristin Bissinger, Infojacking: Crimes on the Information Superhighway, 9 No. 5 J. PROPRIETARY RTS. 2, 7 (1997).

23. See Goodman, supra note 19, at 472.

24. See Friedman & Bissinger, supra note 22, at 7.

25. Id.

26. See id. at 2.

27. Id.

28. See id. at 7.

29. See id. at 10.

30. Gripman, supra note 18, at 173.

31. Dorothy E. Denning & Peter J. Denning, Preface to INTERNET BESIEGED: COUNTERING CYBERSPACE SCOFFLAWS at vii (Dorothy E. Denning & Peter J. Denning eds., 1997).

32. See id.

33. For example, after cracking a password, a malicious hacker might pose as a legitimate user, browsing through files to gain confidential and financial information. If root access is acquired, the hacker may also leave a destructive logic bomb or alter login records to conceal his tracks. See Dorothy E. Denning, Cyberspace Attacks and Countermeasures, in INTERNET BESIEGED: COUNTERING CYBERSPACE SCOFFLAWS 29, 32 (Dorothy E. Denning & Peter J. Denning eds., 1997).

34. Id.

35. This estimate is likely to be very conservative. See id.

36. Upon installation, a packet sniffer places the /dev/nit interface (a widely installed network utility tool) into "promiscuous mode" and logs the first 128 bytes of all TCP (i.e., Internet) sessions being routed through the compromised host machine. The hacker then periodically accesses the host machine to collect the intercepted information. For a more detailed description of packet sniffers, see E. Eugene Schultz & Thomas A. Longstaff, Internet Sniffer Attacks, PROCEEDINGS OF THE NATIONAL INFORMATION SYSTEMS SECURITY CONFERENCE 534-41 (Oct. 1995).
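
For readers unfamiliar with the mechanics, the following minimal sketch (our illustration, not drawn from the Schultz & Longstaff paper) shows the basic capture loop of a sniffer in Python; the interface name and packet count are illustrative assumptions, and real attack tools additionally place the interface in promiscuous mode so that traffic addressed to other hosts is also visible.

```python
# A rough sketch (not the /dev/nit tool described above): a raw-socket capture
# loop that, like the sniffer in this footnote, keeps only the first 128 bytes
# of each packet it sees. Linux-only; must be run with root privileges.
import socket

ETH_P_ALL = 0x0003      # capture frames of every protocol, not just IP
SNAP_LEN = 128          # bytes retained per packet, as in the footnote

def sniff(interface="eth0", count=10):
    """Capture `count` frames from `interface` and return their first 128 bytes."""
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.htons(ETH_P_ALL))
    s.bind((interface, 0))
    captured = []
    for _ in range(count):
        frame, _addr = s.recvfrom(65535)
        captured.append(frame[:SNAP_LEN])   # log only the leading bytes
    s.close()
    return captured

if __name__ == "__main__":
    for chunk in sniff():
        print(chunk[:16].hex())
```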

37. See id. at 141.

38. As noted earlier, a hacker can combine packet sniffing and snooping attacks to infiltrate a large number of secured sites.

39. In 1996, two hackers were convicted of downloading 1,700 credit card numbers from a Tower Records computer system that they had infiltrated. See Dorothy E. Denning, supra note 33, at 33.

40. See id. at 33-34.

41. Root access enables the hacker to modify system files and programs and to access personal files of every user on the system.

42. See Dorothy E. Denning, supra note 33, at 33-34.

43. See id.

44. Edward W. Felten et al., Web Spoofing: An Internet Con Game (last modified Feb. 1997) <http://www.cs.princeton.edu/sip/pub/spoofing.html>.

45. See Dorothy E. Denning, supra note 33, at 35.

46. See id. at 36.

47. See id. at 37-38.

48. A highly publicized example is the Internet Worm program released by Robert Morris. For a detailed account of the Worm program, see KATIE HAFNER & JOHN MARKOFF, CYBERPUNK: OUTLAWS AND HACKERS ON THE COMPUTER FRONTIER 280-81 (1991).

49. See Michael McCormack, Europe Hit by Cryptoviral Extortion, COMPUTER FRAUD & SECURITY BULLETIN, June 1, 1996, at 3.

50. "Cracking" a password or encryption key (i.e., finding or guessing) should be distinguished from "cracking" a software application (i.e., disabling protection features). See A. Michael Froomkin, The Metaphor is the Key: Cryptography, the Clipper Chip, and the Constitution, 143 U. PA. L. REV. 709 (1995).

51. Real-life examples are the Network File System ("NFS") and sendmail programs for the UNIX operating system, both of which originally contained bugs allowing regular users (and hackers posing as users) to obtain root access. See Dorothy E. Denning, supra note 33, at 38-39.

52. Id. at 38.

53. See discussion infra Part II.F.

54. See Dorothy E. Denning, supra note 33, at 39.

55. See id. at 41.

56. Cryptographic algorithms can be used for two distinct purposes: secrecy and authenticity. The term "encryption" is generally used to refer to cryptographic systems used only for secrecy. See id.

57. Typically, a DES encryption system is implemented by requiring a different session key for each communication and a separate long-term key that is used to authenticate the user and to distribute session keys.
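
A minimal sketch of this session-key pattern follows, using the modern Fernet cipher from the pyca/cryptography package in place of DES (which is obsolete); the key names and message are illustrative assumptions, not part of any cited protocol.

```python
# Illustration of the long-term key / session key split described above,
# with Fernet standing in for DES. The long-term key is shared in advance
# and used only to authenticate the parties and wrap fresh session keys.
from cryptography.fernet import Fernet

long_term_key = Fernet.generate_key()          # distributed out of band, used rarely
long_term = Fernet(long_term_key)

# For each new communication, generate a fresh session key and send it
# wrapped (encrypted) under the long-term key.
session_key = Fernet.generate_key()
wrapped = long_term.encrypt(session_key)

# The receiver unwraps the session key; the bulk of the traffic for this
# session is then protected by the short-lived session key alone.
session = Fernet(long_term.decrypt(wrapped))
ciphertext = session.encrypt(b"order: 12 widgets, card ending in 4242")
assert session.decrypt(ciphertext) == b"order: 12 widgets, card ending in 4242"
```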

58. As noted in the following subsection on authentication, public key systems can be used for authentication as well as encryption purposes.

59. See Froomkin, supra note 50, at 752.

60. In 1996, a 130-digit RSA key was cracked. RSA Laboratories recommends that keys be at least 230 digits (or more than 768 bits). In June 1997, a 56-bit DES key was broken after four months of trial and error. According to cryptography experts, the DES algorithm is nearing the end of its useful lifetime. See Dorothy E. Denning, Encryption Policy and Market Trends, in INTERNET BESIEGED: COUNTERING CYBERSPACE SCOFFLAWS 458 (Dorothy E. Denning & Peter J. Denning eds., 1997).
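
The arithmetic behind these figures is straightforward; the short calculation below (illustrative only, with an assumed guessing rate) shows why a 56-bit key is within reach of brute force while keys of the recommended RSA sizes are not attacked by exhaustive search at all.

```python
import math

# Each added bit doubles the number of candidate keys.
des_keyspace = 2 ** 56
print(f"56-bit DES keyspace: {des_keyspace:.2e} keys")            # ~7.21e+16

# At an assumed rate of one billion trial decryptions per second,
# an exhaustive search finishes in a little over two years.
rate = 1_000_000_000
print(f"worst case: {des_keyspace / rate / 86400:.0f} days")      # ~834 days

# A 768-bit number written in decimal needs about 232 digits, which is why
# "more than 768 bits" corresponds to "at least 230 digits" above.
print(math.ceil(768 * math.log10(2)))                             # 232
```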

61. See id. at 457-60.

62. See Froomkin, supra note 50, at 711.

63. See id. at 711-17.

64. See id. at 717-51.

65. An example of an RSA-based authentication scheme is Pretty Good Privacy ("PGP"), developed by Phil Zimmermann. For a more detailed analysis of public-key encryption systems, see Thomas Y.C. Woo & Simon S. Lam, Authentication for Distributed Systems, in INTERNET BESIEGED: COUNTERING CYBERSPACE SCOFFLAWS 319-56 (Dorothy E. Denning & Peter J. Denning eds., 1997).
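
To make the mechanism concrete, the sketch below (our illustration, using the pyca/cryptography package; the key size, padding choice, and message are assumptions, not PGP's actual format) shows the sign-and-verify cycle at the core of RSA-based authentication.

```python
# A minimal sketch of RSA-based authentication of the kind PGP popularized.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"I, and only I, authored this order."

# The sender signs with the private key...
signature = private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# ...and anyone holding the public key can check authenticity; verify()
# raises InvalidSignature if the message or signature has been altered.
public_key.verify(
    signature,
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
```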

66. The authenticity of the public key can be guaranteed by a trusted third party (e.g., a certification authority or a member of "a web of trust").

67. For a detailed description of monitoring systems, see Dorothy E. Denning, supra note 33, at 45-47.

68. For an in-depth analysis of intrusion detection systems, see Aurobindo Sundaram, An Introduction to Intrusion Detection (visited Apr. 16, 1999) <http://www.cs.purdue.edu/coast/archive/data/author3.html>.

69. For a discussion of virus scanners and disinfectors, see Dorothy E. Denning, supra note 33, at 48-49.

70. See id. at 49.

71. See id. at 49-50.

72. ANDY GROVE, ONLY THE PARANOID SURVIVE (1996).

73. A useful analogy is an underground water reservoir with vertical pipelines. Applying pressure on one of the pipelines will cause water levels to rise in the remaining pipelines. Similarly, applying regulatory pressure on hacking activities may cause incidents of related activities (e.g., cracking, phreaking, and social engineering) to rise.

74. See +ORC ("the old red cracker"), How to Crack, A Tutorial-Lesson 1 (visited Mar. 13, 1999) <http://www.geocities.com/Athens/Agora/1948/Crack/howto1.txt>. The old red cracker, a hacker, has authored one of the many "how-to" manuals on hacking available on the Internet. See infra note 76.

75. See id. (defining cracking as "understanding, broadly individuating, locating exactly and eliminating or suspending or deferring one or more protection schemes inside a software application you do not possess the source code of").

76. "How-to" manuals vary in quality and accessibility. An example of a well-written and widely-read manual is the How to Crack, A Tutorial, supra note , written by "the old red cracker." This manual gives step-by-step instructions on how to crack various types of software applications, including those written for the Windows operating system.

77. See, e.g., Kurupt Technologies, Kurupt Warez (visited Apr. 16, 1999) <http://www2.ipeg.com/>.

78. The provision of cracker utilities and serial numbers that are intended to circumvent the copyright protections in software, when used by a direct infringer, may constitute contributory infringement under copyright law. See Software Publishers Association Policy Statement on Contributory Infringement (visited Feb. 5, 1999) <http://www.spa.org/piracy/contrib.htm>.

79. Jim Christy, Rome Laboratory Attacks: Prepared Testimony of Jim Christy, Air Force Investigator, before the Senate Governmental Affairs Committee, Permanent Investigations Subcommittee, May 22, 1996, in INTERNET BESIEGED: COUNTERING CYBERSPACE SCOFFLAWS 64 (Dorothy E. Denning & Peter J. Denning eds., 1997).

80. For a more detailed listing of various phreaking devices, see Voyager, #hack Frequently Asked Questions (FAQ) (visited Feb. 5, 1999) <ftp://rtfm.mit.edu/pub/usenet-by-group/alt.2600/alt.2600_FAQ>.

81. See Christy, supra note 79, at 57-65.

82. See bernz, The Complete Social Engineering FAQ 1.1 (visited Feb. 4, 1999) <http://members.tripod.com/~bernz/socenfaq.txt>.

83. See id.

84. See id.

85. For a real-life account of social engineering, see The New York Newsday Interview with Ice Man and Maniac: Inside the Underworld of "Hacking," N.Y. NEWSDAY, July 22, 1992, at 83.

86. For instance, even the most sophisticated password system can be circumvented by deceiving one of its users into unwittingly disclosing his or her password.

87. The following discussion on the history of password security systems is based on Robert Morris & Ken Thompson, Password Security: A Case History (visited Jan. 21, 1998) <http://www.securezone.com/Information_Sources/Papers/>.

88. See id.

89. See id.

90. The problem with the original M-209 scheme was that, with a given key, encrypted messages (or "ciphers") were trivial to invert. It was much more difficult to reverse engineer the key given the cleartext input and the encrypted messages. Thus, the UNIX designers decided to use the password not as the text to be encrypted, but as the key to encrypt a predetermined constant. The encrypted result was then stored in the password file.

91. Some "profitable" entries to include as trial passwords are: (1) the dictionary with the words spelled backwards; (2) a list of first names, last names, and street names; (3) all valid license plate numbers; (4) social security and telephone numbers.

92. Morris & Thompson, supra note 87, at 5.

93. See discussion supra Part II.C.1.

94. For a detailed discussion of the Java programming language and executable content in general, see Joseph A. Bank, Java Security (Dec. 8, 1995) <http://swissnet.ai.mit.edu/~jbank/javapaper/javapaper.html>.

95. See id.

96. See id.

97. See id.

98. See id.

99. See, e.g., id.; Gary McGraw & Edward Felten, Understanding the Keys to Java Security-the Sandbox and Authentication, JAVA WORLD, May 1997, available at <http://www.javaworld.com/javaworld/jw-05-1997/jw-05-security.html>; Drew Dean et al., Java Security: Web Browsers and Beyond, in INTERNET BESIEGED: COUNTERING CYBERSPACE SCOFFLAWS 241-71 (Dorothy E. Denning & Peter J. Denning eds., 1997).

100. See Bank, supra note 94.

101. See id.

102. See id.

103. See id.

104. See Felten et al., supra note 44.

105. See Bank, supra note 94, at 10.

106. See Felten et al., supra note 44.

107. For a detailed description of Web spoofing attacks, see id.

108. See id.

109. See id.

110. Although the hackers had employed the newly discovered attacks to hack their way through firewalls in January of 1997, they had decided to give Netscape and Microsoft ample time to address the problem before they publicly disclosed their methods. See Gary McGraw, Is Your Browser a Blabbermouth? Are Your Ports Being Scanned?, JAVA WORLD, Mar. 1997, available at <http://www.javaworld.com/javaworld/jw-03-1997/jw-03-securityholes.html>.

111. See id.

112. See id.

113. See id.

114. Dorothy E. Denning & Peter J. Denning, Preface, supra note 31, at x-xi (emphasis added).

115. See generally Dorothy E. Denning, Concerning Hackers Who Break into Computer Systems, at 13 (visited Jan. 23, 1998) <http://www.cpsr.org>.

116. Id. at 1.

117. Dorothy Denning, in her 1990 survey of the hacking community, stated that, according to all of the hackers she spoke with, malicious hacking was considered morally wrong. They also said that most hackers were not intentionally malicious, and that they were concerned about causing accidental damage. See id. at 10.

118. In A Novice's Guide to Hacking, the "Mentor," one of the members of the Legion of Doom hacking group, presents the following set of guidelines for beginning hackers:

Do not intentionally damage any system.

Do not alter any system files other than ones needed to ensure your escape from detection and your future access.

Do not leave your (or anyone else's) real name, real handle, or real phone number on any system that you access illegally.

Be careful who you share information with.

Do not leave your real phone number to anyone you don't know.

Do not hack government computers.

Don't use codes unless there is no way around it.

Don't be afraid to be paranoid.

Watch what you post on boards.

Don't be afraid to ask questions.

Finally, you have to actually hack.

A Novice's Guide to Hacking-1989 Edition (visited Apr. 16, 1999) <http://insane.bloodline.com/mentor.html>.

119. Id. at 5.

120. Id. at 5.

121. Id. at 10.

122. See id. at 10-11.

123. See id. at 11. But see Eugene H. Spafford, Some Musings on Ethics and Computer Break-ins (visited Jan. 19, 1998) <http://www.cs.purdue.edu>.

124. See The New York Newsday Interview with Ice Man and Maniac: Inside the Underworld of "Hacking," supra note 85. In an interview with Newsday reporter Joshua Quittner, a well-known hacker who goes by the pseudonym "Maniac" stated: "[Hacking] is an organized hobby. You do these things for us and you get a little recognition for it." Id.

125. See discussion supra Part II.D.2.

126. See Benjamin J. Fox, Hackers and the U.S. Secret Service (visited Jan. 20, 1998) <http://www.gse.ucla.edu/iclp/bfox.html>.

127. Id.

128. See discussion supra Part I.

129. See Fox, supra note 126.

130. 18 U.S.C. § 1030(a)(4) (1998).

131. 18 U.S.C. § 1030(e)(2) (1998).

132. See id.

133. See 18 U.S.C. § 1030(a)(1) (1998). To be prosecuted under § 1030(a)(1), the actor must have reason to believe that such information will be used to the injury of the United States or to the advantage of any foreign nation. Further, the section is violated regardless of whether the actor communicates the information to another person or simply retains it. This crime is treated as a felony.

134. 18 U.S.C. § 1030(a)(2) (1998). A "financial record" is defined as "information derived from any record held by a financial institution pertaining to a customer's relationship with the financial institution." 18 U.S.C. § 1030(e)(5) (1998). Under this section, obtaining information of minimal value ($5,000 or less) results in a misdemeanor, whereas obtaining valuable (more than $5,000) information or misusing information for financial or commercial gain or to commit a criminal or tortious act constitutes a felony.

135. 18 U.S.C. § 1030(a)(3) (1998). Section 1030(a)(3) criminalizes electronic trespasses on Federal Government computers. If the computer is not exclusively used by the Government, a violation is found if the trespasser's conduct affects the use of the computer by the Government.

136. 18 U.S.C. § 1030(a)(4) (1998). This section contains a "computer use" exception where the intent to defraud consists only in making use of the computer.

137. 18 U.S.C. § 1030(a)(5) (1998). Section 1030(a)(5) contains three provisions covering both outside hackers and insiders who cause intentional, reckless, or negligent damage. Violating either of the first two provisions is a felony; violating the third is a misdemeanor, with penalties based on the intent and authority of the actor.

The first provision prohibits knowingly transmitting any program, information, code, or command that intentionally causes damage to a protected computer without authorization, and covers both insiders and outsiders. The second provision prohibits unauthorized, intentional access to a protected computer where such trespass recklessly causes damage, and covers only outside hackers. The third provision prohibits the same unauthorized access where the trespass causes damage, even if only negligently, and likewise covers only outside hackers. See S. Rep. No. 104-357, at 7-8 (1996).

Thus, insiders with authorized access to a protected computer face criminal liability only for causing intentional damage, whereas outside hackers who break into a computer can be held liable for intentional, reckless, or negligent damage. This distinction between outsiders and insiders stems from the doctrine of trespass:

To provide otherwise is to openly invite hackers to break into computer systems, safe in the knowledge that no matter how much damage they cause, it is no crime unless that damage was either intentional or reckless. Rather than send such a dangerous message (and deny victims any relief), it is better to ensure that 1030(a)(5) criminalizes all computer trespass, as well as intentional damage by insiders, albeit at different levels of severity.

Id.

The term "damage" is broadly defined to include any impairment to the integrity or availability of data, a program, a system, or information that (A) causes loss aggregating at least $5,000 in any one-year period to one or more individuals; (B) either modifies or impairs, or potentially modifies or impairs, the medical examination, diagnosis, treatment, or care of one or more individuals; (C) causes physical injury to any person; or (D) threatens public health or safety. See S. Rep. No. 104-357, at 8 (1996).

However, it is unclear whether there is a loss if, for example, a virus does not destroy files, but simply overloads the network, thus slowing down processing speed or using up some of a system's underutilized capacity. What is clear is that this section was added to address the threat posed by hackers. See S. Rep. No. 104-357, at 9 (1996) (describing § 1030(a)(5) as a measure that protects computers from hackers).

138. 18 U.S.C. § 1030(a)(6) (1998).

139. 18 U.S.C. § 1030(a)(7) (1998).

140. See S. Rep. No. 104-357, at 10-11 (1996) (discussing changes in mens rea level).

141. See S. Rep. No. 104-357, at 9-12 (1996) (discussing effect of different mens rea requirements and intended effect from using different mens rea).

142. See id. at 10 (indicating Congress's desire to punish hackers who unintentionally cause damage to computer systems).

143. 928 F.2d 504, 506 (2d Cir. 1991).

144. Harold L. Burstyn, Computer Whiz Guilty, 76 A.B.A. J. 20, 20 (1990).

145. 92 F.3d 865, 865 (9th Cir. 1996).

146. For an insightful critique of current law enforcement along these lines, see Catherine Therese Clarke, From CrimINet to Cyber-Perp: Toward an Inclusive Approach to Policing the Evolving Criminal Mens Rea on the Internet, 75 OR. L. REV. 191 (1996).

147. See discussion infra Part IV.B. Moreover, the CFAA does not provide an incentive for anyone to adopt adequate anti-hacking security measures. In fact, network security remains at a shockingly low level and is virtually nonexistent in many companies despite the severity of the hacking threat. A 1996 survey revealed that 58 percent of companies do not have a written policy on how to deal with network intrusions. See Gripman, supra note 18, at 174 n.21. This lack of security obviously facilitates Internet hacking. According to security expert Clifford Stoll, "The security weaknesses of both systems and networks, particularly the needless vulnerability due to sloppy systems management and administration, result in a surprising success rate for unsophisticated attacks." Id. at 177. This is not to say, of course, that allocative inefficiency or cost externalization is in and of itself sufficient justification for cyberspace regulation. See discussion infra Part V.C.

148. See Lessig, Constitution of Code, supra note 17, for a detailed discussion of Lessig's theory of indirect regulation through code as the most effective means of regulation in cyberspace.

149. See id. at 184.

150. See Llewellyn Joseph Gibbons, No Regulation, Government Regulation, or Self-Regulation: Social Enforcement or Social Contracting for Governance in Cyberspace, 6 CORNELL J.L. & PUB. POL'Y 475, 489 (1997).

151. Some proposals have suggested piecemeal reforms to existing legislation. See, e.g., Clarke, supra note 146. Catherine Clarke has proposed a scheme for law enforcement on the Internet that employs the technical expertise of hackers to improve Internet security while promoting self-regulation of the Internet through code solutions such as PGP. Although Clarke recognizes the importance of tailoring law enforcement techniques to match more closely available demographic data on the different subsets of the hacking community, implementing the proposals she sets forth is difficult to envision under the current legal regime. For instance, there is no reason to believe that convicted ex-hackers will serve as effective community educators as she suggests, particularly since the social divide between hackers and the rest of the Internet community is imposed by the law itself, irrespective of how such law is enforced. Moreover, as Clarke concedes, "cultural barriers exist between young hackers ... and police officers. Law enforcement officers may be hesitant to seek out the advice of persons who could be their teenage children. The Generation-X young men ... may also be unenthusiastic about assisting law enforcement agencies." Clarke, supra note 146, at 233. Clarke must ultimately reduce her claim to the proposition that "existing institutional and procedural measures may force some level of cooperation." Id. Re-examination of the laws creating these harmful social norms (i.e., the social divide) suggests that existing institutional and procedural measures should be jettisoned altogether.

152. See, e.g., Gripman, supra note 18.

153. Gibbons, supra note 150, at 509.

154. Gripman, supra note 18, at 170 n.14.

155. See id. at 175. Gripman suggests imposing tort liability on corporations for injuries incurred by third parties as a result of hackers' using the corporations' networks to hack into third parties' computers. As explained below, however, such an approach would raise the cost of online participation for corporations, thereby deterring many companies, particularly small ones, from going online.

156. See id. at 176.

157. See id. Given the difficulties associated with identifying the perpetrators of tortious hacking, the primary goal of the model of tort law proposed here is not the deterrence of socially undesirable activity (i.e., hacking), which tort law is traditionally concerned with, but rather the growth of the Internet as facilitated by greater network security.

158. See id.

159. See id. at 176 (quoting John W. Wade, et al., PROSSER, WADE AND SCHWARTZ'S TORTS 1 (9th ed. 1994)).

160. Corporations take only individual, not social, costs and benefits into account when they make business decisions. However, online participation has strong positive externalities due to such phenomena as network effects that augment the utility of other users. Thus, the social benefit of an individual corporation's online participation exceeds its private benefit, and such participation should therefore be encouraged. Hacking imposes a cost on online companies; compensation via tort liability reduces this cost, thereby raising the expected net benefit (benefit less cost) of going online. Thus, the tort system can raise online participation to the socially optimal level by transferring a portion of the expected cost of going online (i.e., costs imposed by hackers) from corporations to ISPs.

161. See Victoria A. Cundiff, Trade Secrets and the Internet: A Practical Perspective, COMPUTER LAW., Aug. 1997, at 6, 14 ("Internet tortfeasors and infringers are likely to include a high percentage of students and others who may not have the resources to satisfy large judgments.").

162. ISPs may also be judgment proof in some instances. This problem could be solved by requiring ISPs to maintain a minimum level of assets.

163. See Ian C. Ballon, The Law of the Internet: Developing a Framework for Making New Law, 482 PLI/PAT 9, 20-21 (1997).

164. One might argue that a corporation's knowing placement of confidential information in a database accessible on the Internet constitutes an effective assumption of risk that would vitiate third party tort liability. However, unlike other risky activities (e.g., skiing), online activities have positive externalities and should be encouraged, given the network effects of online participation and the efficiency of electronic commerce. Tort liability imposed on ISPs largely removes such risk from corporations' net benefit calculus and therefore increases their expected net benefit from online participation, thereby increasing total expected online participation.

165. Strict liability is another regime that could be used to address the hacking problem. Applying strict liability to ISPs for all damages incurred as a result of hacking has its advantages, given that (1) ISPs are the party in the best position to detect and eliminate defects in security, (2) ISPs are best able to absorb and spread the risk or cost of injuries through insurance or price increases, and (3) the strict liability rule avoids costly and burdensome requirements of proof. The problem with such an approach, however, is that it limits online corporations' incentives to establish security systems of their own that exceed the security levels imposed on ISPs by a due care standard, since corporations would be compensated for all losses whether or not the ISP maintained the level of due care. Under the negligence rule, this would not be a problem. If a corporation felt that the level of due care was too low for its purposes (say, because it had unusually sensitive and valuable information exposed), it would have an incentive to erect higher security levels than those required under due care, since the corporation would not be compensated for losses if the ISP maintained the level of security mandated under the due care standard.

166. See Gripman, supra note 18, at 179.

167. See id. at 171-77.

168. See id. at 184-91.

169. William A. Hodkowski, The Future of Internet Security: How New Technologies Will Shape the Internet and Affect the Law, 13 SANTA CLARA COMPUTER & HIGH TECH. L.J. 217, 220 (1997).

170. See generally Lawrence Lessig, Constitution of Code, supra note 17.

171. Id. at 184.

172. See discussion supra Part II. The proposition that hackers will evade architectural constraints and therefore pose a threat to electronic commerce is distinct from the claim that hackers point to the general inefficacy of code-based solutions in cyberspace, an argument which is not made here.

173. See Lessig, Constitution in Cyberspace, supra note 17, at 869.

174. Lawrence Lessig, The Zones of Cyberspace, 48 STAN. L. REV. 1403, 1411 (1996) [hereinafter Zones].

175. See Lawrence Lessig, Constitution of Code, supra note 17. Although Lessig makes explicit reference only to code, law, and social norms, he does not claim "that there are no other constraints. Psychology or the market, for example, are constraints which are related to these three primary constraints in complex ways." Id. at 181 n.1. Explicit mention of market forces above is consistent with Lessig's inclusion of the market as a primary constraint in his more recent lectures in his course The High Tech Entrepreneur.

176. Id. at 183-84 (footnotes omitted).

177. See id. at 184.

[G]overnment will shift to a different regulatory technique. Rather than regulating behavior directly, government will regulate indirectly. Rather than making rules that apply to constrain individuals directly, government will make rules that require a change in code, so that code regulates differently. Code will become the government's tool. Law will regulate code, so that code constrains as government wants.

Id.

178. Lessig, Zones, supra note 174, at 1408 n.18. Lessig's contention that indirect regulation through code is the most effective regulator in cyberspace in no way competes with the contention that such code is a poor means of regulating hackers. Lessig's fear is that cyberspace code will develop in undesirable ways despite the existence of hackers, not as a consequence of eliminating hackers: "I don't think one need believe hacking impossible to believe it will become less and less significant. People escaped from concentration camps, but that hardly undermines the significance of the evil in concentration camps." Id.

179. Lessig, Constitution of Code, supra note 17, at 1411.

180. Initially, these ethics reflected the values of Internet architects. This is certainly not the case today. See discussion supra Part IV.B.

181. Lessig, Constitution of Code, supra note 17, at 184.

182. See Lessig, Constitution in Cyberspace, supra note 17, at 901.

183. Lessig, Zones, supra note 174, at 1410.

184. Id. at 1408.

185. Lessig, Constitution of Code, supra note 17, at 184.

186. See Lessig, Zones, supra note 174, at 1410.

187. Lessig, Constitution in Cyberspace, supra note 17, at 909.

188. See discussion supra Part IV.A.

189. Hackers felt that system managers treated them like enemies and criminals, rather than as potential helpers in their task of making their systems secure. See Dorothy E. Denning, Concerning Hackers Who Break into Computer Systems (visited Apr. 24, 1999) <http://www.cpsr.org/cpsr/privacy/crime/denning.hackers.html>.

190. "Frank Drake," an editor of the now defunct cyberpunk W.O.R.M., suggested in 1990 that making a legal distinction between malicious and non-malicious hacking would lead to a "kinder, gentler" relationship between hackers and computer security people. See id. at 16.

191. According to Dorothy Denning in her 1990 survey, several hackers said that they would like to be able to pursue their activities legally and for income: "Hackers say they want to help system managers make their systems more secure. They would like managers to recognize and use their knowledge about design flaws and the outsider threat problem." Also, the hackers felt that it would help if system managers and the operators of phone companies and switches could cooperate in tracing a hacker without bringing in law enforcement authorities. See id. at 15.

192. As the following footnotes will illustrate, some companies are turning to market-based initiatives already. With the decriminalization of non-malicious hackers, more and more companies will feel comfortable with trusting hackers and relying on them for their expertise.

193. For example, consider Crypto-Logic. This company has developed a new type of encryption software for sending secure e-mail messages. It is currently staging a contest in which it challenges hackers to decode an encrypted message sitting on its Web site. See Ultimate Privacy (visited Feb. 8, 1999) <http://www.ultimateprivacy.com>.

194. For instance, in the famous case of the Rome Laboratory attacks, the Government was able to identify one of the hackers through an intelligence network of informants after failed attempts to trace the origin of the attack using phone taps and packet tracing tools. See Christy, supra note 79, at 59-60.

195. Although the information security community is in principle reluctant to hire hackers to work for them, some will admit to hiring, or at least consulting with, ex-hackers. Among them are the National Computer Crime Information Center, part of the Federal Bureau of Investigation, and the operator of the system that is hacking's Holy Grail: the National Security Agency ("NSA"). A highly regarded Information Services security consultant confirmed that both institutions, along with several major defense contractors, have occasionally used hackers at least as informants in the past.

In another instance, Price Waterhouse's elite group of computer experts, the Tiger Team, spends its waking hours breaking into its clients' security systems. The team, part of the firm's Enterprise Security Solutions Practice, simulates "enemy" break-ins to help clients defend themselves against computer hackers.

196. See discussion supra Part I.

197. In fact, many hackers are members of consumer advocacy and civil liberties organizations such as the Electronic Frontier Foundation ("EFF"), the League for Programming Freedom ("LPF"), and SotMesc.