
The Perils and Promise of State Internet Policy

As state and local policymakers grapple with new digital trends, from Uber to Big Data, they’re increasingly coming into conflict with key federal limitations on their ability to regulate the Internet. In general, we should be skeptical about government’s ability to regulate the Internet smartly. But if any state legislation is actually going to survive a court challenge, lawmakers must thread two needles in particular: the Dormant Commerce Clause and Section 230.

Dormant Commerce Clause

Under the Articles of Confederation, the federal government was powerless to remove barriers to trade between states. So the Constitution’s Commerce Clause empowered Congress to regulate commerce “among the several States.” Courts have long read that affirmative grant to imply a negative corollary, the “Dormant Commerce Clause” (DCC): states may not burden interstate commerce unless they can show that the local benefits of their laws outweigh the burdens those laws impose on interstate commerce.

Courts have struck down a number of state Internet laws on DCC grounds because those laws effectively govern how people outside the enacting state’s borders use websites and services. In a widely cited decision, American Libraries Association v. Pataki, a federal district court struck down a New York law criminalizing online distribution of obscene content to minors. While states have a strong interest in protecting youth, the court found it would not be “technologically or economically feasible” to limit the effect of New York’s law to users in New York, because websites could not accurately ascertain a user’s age and location.

States like New York, and cities like New York City, are trying to extend antiquated taxi regulations to Uber and hotel regulations to Airbnb. Incumbents are using regulators to block new competition; users are fighting back against regulatory capture. But it isn’t a DCC problem: Uber knows exactly where its customers are. Because laws that govern how web companies deliver offline services can generally be applied on a state-by-state basis, they won’t violate the Dormant Commerce Clause.

But that’s not true for most state laws affecting purely online activity. When a state law has avoided, or survived, a DCC challenge, it’s generally because the law requires only transparency. For example, a 2003 California law effectively requires websites to post privacy policies. Unlike Uber, websites generally can’t tell which users are in California, so the law effectively applies to all websites. Yet no one has ever challenged it, primarily because the burden it imposes is relatively low. A more specific requirement about the content of a notice, or how to present it, probably would be challenged. And since multiple states could enact conflicting requirements, even state-level transparency requirements that seem sensible could be struck down on DCC grounds.

We may soon see where courts draw the line if there’s a challenge to California’s recent amendment to its 2002 data breach notification law, which has long since been copied by nearly every state. Despite slight variations that make compliance tedious and costly, these laws haven’t been seriously challenged on DCC grounds. The key reason is that the current laws apply only when a narrow category of personal information is breached, so sites can generally determine which state’s requirements apply to which users.

The new amendment requires sites to post public notifications when log-in information alone is breached. That’s a good idea: it empowers users to change their passwords and thus protect themselves from the serious risk of losing other information. But that doesn’t mean it’s constitutional. As the Supreme Court has said, “such requirements, if imposed at all, must be through the action of Congress, which can establish a uniform rule.” Because log-in information isn’t tied to a location, California’s new rule will essentially apply to the entire Internet. That doesn’t mean anyone will bother with the expense (and negative PR) of suing, but if someone does, we may finally see just how far the courts will let states go in imposing idiosyncratic, web-wide disclosure requirements.

Section 230

In the mid-1990s, several court cases made websites liable for defamatory content published by their users. While policing such content might work at the scale of newspapers and letters to the editor, Congress astutely realized that imposing that responsibility online would significantly deter the kind of interactivity that has defined “Web 2.0.” So in 1996, Congress enacted Section 230, which bars holding the publishers of websites, services, and apps liable for content created by their users, except under federal intellectual property, criminal, or privacy law.

State attorneys general have repeatedly tried to poke holes in this immunity in court, with little success. Generally, unless a website joins in creating illegal content, it won’t be responsible for it. The AGs have responded on two fronts.

First, they’ve resorted to extra-legal pressure to coerce companies into changing their practices in ways the AGs couldn’t legally require. Most notably, in 2008, state AGs browbeat MySpace into a “voluntary” agreement to perform an unprecedented degree of content monitoring. Some have speculated that the sheer personnel resources devoted to monitoring and compliance distracted MySpace from innovating even as Facebook was on the rise.

In 2009, South Carolina’s Attorney General threatened criminal charges against Craigslist’s management unless the company shut down its “adult services” category. Craigslist asked a federal court to block such charges. The court said the request was premature, but legal experts agreed that Section 230 barred any state charges. South Carolina’s AG gave up. Yet, under enormous pressure from other states, Craigslist eventually caved anyway.

Second, state AGs have demanded the power to directly enforce federal criminal laws, such as those concerning prostitution, against online intermediaries like Craigslist, instead of focusing on enforcing their existing laws against actual child predators. Earlier this year, all but three state AGs signed a letter demanding that Congress amend Section 230 to allow them not only to enforce federal and state prostitution laws against websites, but to hold websites liable under any state law. This would mean that any of America’s 27,000 state and local prosecutors could threaten to shut down any website because one of its users violated any of the thousands of idiosyncratic state laws on the books, including odd misdemeanors like selling spray paint to minors.

In September, ALEC firmly opposed the AGs’ sweeping demands. It’s unlikely Congress will ever take up the idea, which would prompt intense Internet opposition. But the fight is far from over.

The Positive Agenda

What else should state legislators do? When it comes to new laws, they should keep in mind some simple rules:

  1. To respect federalism, states shouldn’t try to regulate the Internet in ways that can’t clearly be limited to users within that state.
  2. To respect Section 230, state legislatures will have to steer clear of any law that makes websites responsible for what their users do—and keep an eye on efforts by their attorneys general to circumvent Section 230.

Two specific reforms should top their positive agenda. First is ensuring that state laws protect us all from groundless, unrestrained snooping by prosecutors and even private lawyers acting as officers of the court in civil matters like divorce. Congress is working on some of these issues, but only very slowly, and other issues, like seizures of electronic devices incident to arrest, are matters for each state to address.

Second, instead of trying to gut Section 230, the law that has made user-generated sites from eBay to Airbnb possible, states should enact the obvious corollary: just as the threat of liability under state law shouldn’t be used to shut down lawful websites, it shouldn’t be used to silence individual users who say truthful, but negative, things online. Some states have already enacted protections against what are generally called “Strategic Lawsuits Against Public Participation” (SLAPPs), but most haven’t yet passed laws that protect not only journalists but also those who post comments or reviews online.

This isn’t just a symbolic parallel: truthful, negative reviews are essential to the reputation markets that protect users on sites like Uber and Airbnb. They reward good service and punish bad service.

That’s the future of consumer protection: more transparency and, yes, more data.

Berin Szoka (@BerinSzoka) is President of TechFreedom, a tech policy think tank based in Washington, D.C. He is an Internet lawyer, has testified before Congress three times on consumer privacy, and is a member of ALEC’s Task Force on Communications and Technology.

