By Eric Goldman, Forbes
It’s only two hours between Sacramento, California’s state capital, and Silicon Valley, the world’s technology capital, but when it comes to regulating the Internet, the two are philosophically worlds apart. Those worlds collided in 2013 when Sacramento enacted a fleet of new Internet privacy laws telling Silicon Valley Internet companies how to conduct their affairs (see my posts 1, 2, 3, 4). The breathtaking sweep of Sacramento’s regulatory efforts made it clear that the two communities have a lot to talk about. To facilitate that conversation, in December, Santa Clara University hosted an informational hearing of three California Assembly committees. The ensuing discussion raised a host of questions about Sacramento’s role in regulating the Internet for Californians—and the rest of the world.
[Note: These notes aren’t a comprehensive recap of the day, and I mostly summarized speakers’ remarks rather than quoting them. As a result, please don’t attribute these remarks to the speakers without checking the video recordings from the day.]
Prof. Paul Schwartz from UC Berkeley started the day by touting the “California Effect,” California’s ability to adopt policy innovations that propagate through the country and the world. He cited the (notorious, in my opinion) example of data breach notification laws, which started in California in 2002 and have propagated into 45 other states, federal health law and European law. He said that historically federal lawmakers would respond to California’s policy innovations through a productive dialogue (what he called “flight to DC” and “defensive preemption”). [My comment: although Paul and some other Californians extol this propagation of regulatory ideas originating in California, it contributes to the perception among many non-Californians that California is crazy and Congress must clean up California’s policy messes].
Paul explained that the state-federal dialogue has broken down because the current Congress is gridlocked. Meanwhile, California continues enacting “a tidal wave of California privacy laws.” He cautioned against waiting for a “federal Godot,” i.e., expecting Congress to reengage productively on privacy regulation.
What should California do in light of this broken dialogue between federal and state legislatures? He thinks California should keep regulating because California can make a significant contribution to the international dialogue. However, California should be extra cautious given the likely lack of a federal response.
Thus, Paul advocated that the California legislature consolidate its efforts and amend existing statutes. Fall 2013 was a busy time for the legislature, so it should take stock of what it’s done and learn lessons from its successes and failures. He called attention to two laws that deserve legislative attention: (1) the Song-Beverly Credit Card Act, which inhibits retailers from taking reasonable steps to prevent fraud and identity theft (see my recent post criticizing Song-Beverly), and (2) the Confidentiality of Medical Information Act, which has been eclipsed by federal legislative developments.
He also wants to make it easier for people to electronically access materials about California’s privacy laws, such as legislative history. This would help interested parties across the globe better understand what’s going on in California.
[Prof. Schwartz posted a version of his testimony.]
…
Prof. Deirdre Mulligan of UC Berkeley, like Prof. Schwartz, praised California’s long reputation for privacy leadership. She said regulators outside California look to the state as a laboratory of experimentation, and those experiments have ripple effects across the globe. Critics of California’s efforts should look hard at the history: not all initiatives succeed, either in business or in policy, but California has been respectful in its regulatory experiments. California is often an early policy leader, which makes getting it right especially complicated, but California’s policy experiments can pay off for people across the globe.
She cited three emerging privacy challenges:
1) The Internet of Things.
2) Fewer distinctions between public and private actors, such as NSA using Google’s cookies to track people. All government actors want more information to do their job, but we need to be sensitive to how government actors will leverage the private sector’s infrastructure. We need to avoid creating the perfect surveillance state.
3) Big Data. Big data may be reifying impermissible discrimination. We need to sensitize computer scientists to the ethics of their coding choices.
…
Chris Hoofnagle of UC Berkeley said notice-and-choice is based on rational choice theory, but consumers don’t always act rationally. He’s frustrated that defenders of notice-and-choice constantly move the goalposts about how to measure notice-and-choice’s success. First, the goal was that all consumers would read privacy policies; then it became that experts would read them; then it became that the FTC would read them. Consumers misunderstand privacy, and this undermines markets for privacy. Almost everyone fails an online privacy quiz he administers; digital natives do the worst on it.
He believes consumers see the words “privacy policy” as a seal, i.e., a certification of minimum protections. He favors correcting this by establishing minimum substantive legal standards for anyone who uses the term “privacy policy.”
[My comment: If consumers are systematically misunderstanding the phrase “privacy policy,” that is a good argument to regulate the term so that it complies with consumer expectations. But California law requires most online companies to have a “privacy policy” and call it a “privacy policy.” I could imagine a different regulatory solution that would let companies decide what to call their privacy disclosures, subject to minimum protections if the companies want to use the term “privacy policy.” I imagine most companies would choose alternative nomenclature (if the law let them) rather than deal with the hassle of satisfying regulatorily required minimum standards. Also, if consumers consistently misinterpret the term “privacy policy” as a seal, it raises interesting questions about why the California Attorney General’s office is trying to make more online companies comply with California’s requirement to display “privacy policies.”]
Jules Polonetsky said he prefers the term “data use policy” over “privacy policy.” For example, many people are trying to manage their personal brands, and they may not think about that in “privacy” terms.
Aleecia McDonald said that it’s hard to get people to read privacy policies when they mistakenly think they are already protected. Companies rarely lie outright, but often there are omissions or ambiguous disclosures. Companies have trained people not to read privacy policies because consumers think privacy policies are useless.
…