
[IP] This is a long lawyerly comment and counter-comment on NN





Begin forwarded message:

From: Chris Savage <chris.savage@xxxxxxxxxx>
Date: June 22, 2006 12:05:37 PM EDT
To: dave@xxxxxxxxxx
Subject: RE: [IP] net neutrality, continued ...

-----Original Message-----
From: David Farber [mailto:dave@xxxxxxxxxx]
Sent: Thursday, June 22, 2006 11:41 AM
To: ip@xxxxxxxxxxxxxx
Subject: [IP] net neutrality, continued ...


Begin forwarded message:

From: "Yoo, Christopher" <christopher.yoo@xxxxxxxxxxxxxxxxxx>
Date: June 22, 2006 10:36:29 AM EDT
Subject: RE: net neutrality, continued ...

What is most interesting to me is the extent to which the network that
most people regard today as the Internet is already nonneutral.
Network owners are caching popular content locally, which gives that
content speed and cost advantages.  Overlay networks, like Akamai
(which reportedly serves 15% of the world's web traffic, including
Google), are taking this to a wider scale by maintaining a distributed
network of servers and using it to deliver content more cheaply and
more quickly.
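
To make the caching point concrete for non-networking readers, here is
a toy Python sketch of why locally cached content gets a speed
advantage.  This is illustration only, not Akamai's or any real CDN's
system; the paths and latency figures are made-up assumptions.

origin_store = {
    "/popular-video": b"video bytes",
    "/rare-page": b"page bytes",
}
edge_cache = {}  # content replicated to a nearby edge server

ORIGIN_LATENCY_S = 0.120  # assumed round trip to a distant origin server
EDGE_LATENCY_S = 0.010    # assumed round trip to the local edge cache

def fetch(path):
    """Serve from the edge cache when possible; else go to the origin."""
    if path in edge_cache:
        return edge_cache[path], EDGE_LATENCY_S
    body = origin_store[path]
    edge_cache[path] = body  # replicate so the next request stays local
    return body, ORIGIN_LATENCY_S

_, t1 = fetch("/popular-video")  # miss: pays the origin round trip
_, t2 = fetch("/popular-video")  # hit: served locally, ~12x faster here
print(f"first request {t1 * 1000:.0f} ms, repeat {t2 * 1000:.0f} ms")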

<snip>

Professor Yoo is of course correct that networks are not now operated in
any sort of purely "neutral" fashion.

To shift to law-mode for a moment, I think a lot of confusion in the Net
Neutrality debate has to do with the hoary distinction in jurisprudence
between "rules" and "principles."

A first approximation for the non-lawyers here: the tax code is full of
RULES: Take this number, divide it by that number, place the result on
line 17 if it's greater than $57,206 and on line 19 if it's less.  Etc.
RULES are intended to direct or forbid very specific behaviors.

PRINCIPLES, on the other hand, are more general.  When driving you are
required to use "reasonable care."  If you don't, then you are negligent
and can be held liable, in a tort case, for the damages you cause.  And
though there are plenty of rules about driving, tort liability is based
on the PRINCIPLE of reasonable care, and is assessed on a case-by-case
basis.

"Net Neutrality" is a principle, not a rule.  Without getting into
endless and mind-numbing discussion of how the FCC might or might not
classify this or that IP-enabled service, what Net Neutrality is
basically about is the principle of non-discrimination.  The principle
of non-discrimination doesn't say that you cannot make any distinctions
at all as between customers, services, what you charge, etc.  It just
says that whatever distinctions you do make have to be reasonable.

So, Professor Yoo's discussion of particular ways that network operators
today treat traffic differently in different circumstances is kinda
beside the point.  It just shows that there are reasonable distinctions
that can be made.  E.g., sure, give live video packets priority over
email attachment packets.  That's reasonable.  Net Neutrality says,
though, that normally you shouldn't give YOUR video packets priority
over a COMPETITOR's video packets.
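
To make that distinction concrete, here is a toy Python sketch of a
scheduler that prioritizes by traffic class but stays blind to the
sender.  The class names, priorities, and domains are hypothetical,
purely for illustration; the point is that the queue never looks at who
sent the packet.

import heapq
import itertools

# Lower number = served sooner.  Ranking by CLASS (video before bulk
# email) is the kind of distinction the principle treats as reasonable.
CLASS_PRIORITY = {"live-video": 0, "voip": 0, "web": 1, "email-bulk": 2}

_tiebreak = itertools.count()  # keeps FIFO order within a class
queue = []

def enqueue(packet):
    # Neutral scheduling: rank on the traffic class only, never on
    # packet["src"].  A source-based rule favoring the carrier's own
    # video over a competitor's is what the principle forbids.
    heapq.heappush(queue, (CLASS_PRIORITY[packet["cls"]],
                           next(_tiebreak), packet))

def dequeue():
    return heapq.heappop(queue)[2]

for pkt in [
    {"src": "competitor.example", "cls": "live-video"},
    {"src": "anyone.example", "cls": "email-bulk"},
    {"src": "carrier-owned.example", "cls": "live-video"},
]:
    enqueue(pkt)

while queue:
    print(dequeue())  # both video flows drain first, whoever sent them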

And, in the nature of principles versus rules, the specifics have to be
worked out on a case-by-case basis.  Then, after adjudicators accumulate
enough experience, perhaps more specific rules can be formulated.

But the inability to formulate iron-clad, water-tight, specific rules
now does not remotely imply that there's anything wrong with the
principle.

Chris S.



Begin forwarded message:
From: "Yoo, Christopher" <christopher.yoo@xxxxxxxxxxxxxxxxxx>
Date: June 22, 2006 1:22:17 PM EDT
To: David Farber <dave@xxxxxxxxxx>
Subject: RE: comment for ip?

I'm not sure that I agree with the characterization of the proposed
Internet-labeling statute as a generalized "principle" (or, as the legal
literature refers to it, a "standard").  Classic standards are
usually very short, very general, and subject to
fact-specific/case-by-case interpretation in light of the totality of
the circumstances with no one consideration being controlling, e.g.,
unreasonable restraints of trade, reasonable care, etc.  The business
end of the labeling proposal includes a 200+ word definition of the
"Internet" that would not meet the conventional understanding of what
would constitute a standard.

But even accepting for the sake of argument that the proposed labeling
statute does advance a standard, I'm not sure that adopting a standard
would be a good idea.  The standard criticism of standards is that they
are so open-textured that it is impossible to say for certain whether
liability will exist in any particular case.  What that means in
practical terms is that there is no safe harbor for people who wish to
avoid liability, and it is difficult, if not impossible, for an
adjudicating body to resolve cases brought under the standard at a
fairly early stage of the proceedings.

This has been a particular concern in antitrust law, in which an
open-textured rule of reason essentially allows cases to get to juries
even on the most speculative of factual foundations.  Antitrust has
responded in many cases by making the rule of reason standard more
"rule"-like by adding on/off filters, such as antitrust injury, proof of
market power, and proof of "dangerous probability of success."  These
filters allow courts to dispose of clearly nonmeritorious cases early on
through motions to dismiss and motions for summary judgment.  The
clarity also provides some benefit to plaintiffs with meritorious
cases, since it gives them an early indication of whether they are
likely to prevail.

This is why the DOJ has issued its Merger Guidelines.  It is also the
source of criticism of the FCC's failure to provide advance guidance
about its merger clearance process.  At best, firms will have to wait
until several cases have been decided to find out what the real
governing principles are.  At worst, the multifactor balancing approach
will provide cover for the agency to justify whatever result it deems
politically expedient.

In short, reliance on standards has the potential to cast ambiguity over
all conduct and deprive network owners of a safe harbor in which they
can safely act without fear of incurring liability.  At a minimum, it
would require some sort of filter that limits complaints to situations
that could plausibly harm competition (such as whether the last-mile
provider offers a product that competes with the content or application
that is being restricted, since if they don't they have no plausible
incentive to discriminate).  Such a limitation would be a far cry from
the type of broad-brush approaches under discussion now, which would ban
all discrimination against any application or content without regard to
whether the last-mile provider offers a competing product.

As I argue in an article recently published in the Harvard Journal of
Law & Technology, I also am less optimistic about the FCC's ability to
police nondiscrimination mandates than some are.  To paraphrase the
Supreme Court's Trinko decision, the complexity of the interface between
network provider and customer gives the network provider a nearly
endless number of ways to intentionally or unintentionally degrade the
quality of service it provides.  As a result, enforcing
nondiscrimination requires extremely close and intrusive supervision of
the business relationship.  As Gerry Faulhaber has argued in his
excellent paper on "Policy-Induced Competition," that has proven
insuperable unless the interface is simple and the information
requirements are low.  The increasing variety of QoS demands that
content and application providers are placing on networks suggests that
nondiscrimination will be very difficult to police.


