[IP] more on High-def could choke Internet, ISPs fear
Begin forwarded message:
From: Dewayne Hendricks <dewayne@xxxxxxxxxxxxx>
Date: May 16, 2006 12:43:47 AM EDT
To: Dewayne-Net Technology List <dewayne-net@xxxxxxxxxxxxx>
Subject: [Dewayne-Net] re: High-def could choke Internet, ISPs fear
Reply-To: dewayne@xxxxxxxxxxxxx
[Note: This comment comes from reader Thomas Leavitt. DLH]
From: Thomas Leavitt <thomas@xxxxxxxxxxxxxxxxx>
Date: May 15, 2006 9:07:47 PM PDT
To: dewayne@xxxxxxxxxxxxx
Subject: Re: [Dewayne-Net] High-def could choke Internet, ISPs fear
Dewayne,
Setting aside the question of whether today's networks, as currently
designed and deployed (or with reasonable upgrades), are actually
incapable of handling a significant increase in the proportion of
streaming video (of whatever quality) flowing in from other networks...
and I believe there is a lot to dispute there (for instance: exactly
when do these companies predict that more than 1 in 30 of their
customers will be spending at least two hours a day streaming
broadcast-quality video into their homes?).
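To put rough numbers on what that 1-in-30 figure would imply (every
parameter below is my own assumption for illustration, not anything the
carriers have published), here is a quick back-of-envelope sketch in
Python:

    # Back-of-envelope estimate of the load the "1 in 30" scenario
    # implies. All parameters are assumed for illustration, not ISP data.
    subscribers = 1_000_000        # assumed subscriber base
    streaming_fraction = 1 / 30    # the "1 in 30" figure questioned above
    stream_rate_mbps = 8           # assumed broadcast-quality bitrate
    hours_per_day = 2              # two hours a day, per the scenario
    busy_hour_overlap = 0.5        # assumed share of streamers active at peak

    concurrent = subscribers * streaming_fraction * busy_hour_overlap
    peak_gbps = concurrent * stream_rate_mbps / 1000
    daily_tb = (subscribers * streaming_fraction * hours_per_day
                * 3600 * stream_rate_mbps / 8 / 1e6)

    print(f"Concurrent streams at peak: {concurrent:,.0f}")        # ~16,667
    print(f"Aggregate peak demand:      {peak_gbps:,.0f} Gbit/s")  # ~133
    print(f"Daily transfer:             {daily_tb:,.0f} TB")       # ~240

Even granting those assumed rates, the interesting question is how much
of that traffic is actually unique, which is where the rest of this
note goes.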
Setting that aside: there is a solution. One that has been widely
deployed in analogous situations before. One that many endpoint sites
already use to avoid being swamped themselves, and to bypass network
congestion or avoid inducing it in the first place.
Caching.
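By way of illustration, here is a minimal sketch (with hypothetical
names; a real deployment would use a proper HTTP proxy or CDN edge node
rather than an in-memory dictionary):

    # Minimal sketch of an edge cache for video objects. Hypothetical
    # and illustrative only; it counts how often a title actually has
    # to be fetched across the foreign network.
    class EdgeCache:
        def __init__(self):
            self.store = {}            # content_id -> video bytes
            self.upstream_fetches = 0

        def _fetch_from_origin(self, content_id):
            # Stand-in for an expensive retrieval over the foreign network.
            self.upstream_fetches += 1
            return b"video bytes for " + content_id.encode()

        def get(self, content_id):
            # Serve locally when possible; only the first request for a
            # given title crosses the network boundary.
            if content_id not in self.store:
                self.store[content_id] = self._fetch_from_origin(content_id)
            return self.store[content_id]

    cache = EdgeCache()
    for _ in range(50_000):            # 50,000 viewers, one popular title
        cache.get("some-popular-episode")
    print(cache.upstream_fetches)      # -> 1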
Granting, for the sake of argument, that there *is* an issue, I'm going
to make this supposition: the number of sources from which two solid
hours of streaming video are likely to be drawn is going to be
limited... it is unlikely in the extreme that even the most talented
non-traditional video sources, in the aggregate, are going to draw even
a fraction of the audience that, say, ABC.com is likely to attract (at
least for the immediate future).
What does this mean? Caching is feasible... as it would not be if, for
instance, 300,000 viewers were each pulling in one of 250,000 different
streams of video... in all likelihood, the vast majority of those
streams will carry redundant data. 50,000 people watching the same
episode of "Lost", using a distributed content caching mechanism,
translates into ONE retrieval over the foreign network... and only a few
more over the various segments of the internal network. Even if those
50,000 people are watching various and sundry different episodes of
"Lost", it makes very little difference: over the course of a week or a
year, this would balance out.
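The arithmetic behind that claim, with an assumed file size (the 4 GB
per episode is my guess at a broadcast-quality encode, not a published
figure):

    # Rough savings for the 50,000-viewer example; the 4 GB episode
    # size is an assumption, not a published figure.
    viewers = 50_000
    episode_gb = 4

    without_cache = viewers * episode_gb  # every viewer pulls from origin
    with_cache = 1 * episode_gb           # one retrieval over the foreign net

    print(f"Without caching: {without_cache:,} GB across the peering link")
    print(f"With caching:    {with_cache} GB, plus local redistribution")

That is 200,000 GB versus 4 GB over the foreign network. The local
redistribution still has to happen, but it happens on segments the ISP
controls and can provision for.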
.... of course, the companies supplying the video would need to get over
their DRM fetish for this to happen (or at least cooperate on the
implementation of some industry-standard distributed DRM authentication
solution).
.... but really, at least for the immediate future, we're not talking
about anything even vaguely resembling rocket science. Honestly, do
these networks really expect to be repeatedly transporting the same
multi-gigabyte video files from their servers to each individual end
user? No, of course not: they'll find it vastly cheaper to implement
some kind of caching mechanism.
Even if, somehow, the integrated advertisements, etc. are
"personalized", the bulk of each stream will still be identical from
viewer to viewer, so personalization still leaves quite a bit of
redundancy to exploit.
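Again with assumed numbers: suppose each viewer's stream is a shared
4 GB episode plus, say, 50 MB of personalized ad segments (both figures
are illustrative guesses). The shared portion dominates:

    # Assumed sizes; both figures are illustrative guesses.
    viewers = 50_000
    shared_gb = 4          # the episode everyone watches
    personal_gb = 0.05     # personalized ad segments per viewer

    fully_personalized = viewers * (shared_gb + personal_gb)
    cached_shared = shared_gb + viewers * personal_gb

    print(f"Treating every stream as unique: {fully_personalized:,.0f} GB")
    print(f"Caching the shared episode:      {cached_shared:,.0f} GB")
    print(f"Shared fraction of each stream:  "
          f"{shared_gb / (shared_gb + personal_gb):.0%}")

Roughly 99% of the bytes in each "personalized" stream are identical to
everyone else's.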
This suggests a host of business opportunities in addition to the
distributed caching technology platform itself; two examples would be
the distributed authentication mechanism mentioned above, and a
distributed content personalization system (for customized advertising
and promos). This also raises the point that innovation in these areas
would be seriously stifled by bandwidth charges: the incentive to adopt
such technologies would be reduced considerably, or even contraindicated
as revenue-decreasing (and thus we would, instead, get huge and
unnecessary investments in greater network bandwidth).
Regards,
Thomas Leavitt
Weblog at: <http://weblog.warpspeed.com>