The speculation that Comcast is going to impose download quotas on their Internet services has got me seeing red and has blood shooting out of my nose. Granted, a 250 GByte cap per month seems reasonable, and I really, really do understand their concerns with peer-to-peer, but there is more here that should be addressed.
Now, as it happens, I have access to certain information about the network utilization at a large State University. The campus, having recently completed an expensive DWDM fiber ring to a major city, now has access to commodity rates for Internet connectivity and in aggregate has around 3 Gbit/s available. The campus housing system, which consists of dorms and Greek houses, is attached to the Internet on a rate-limited basis. When the limit was 100 Mbit/s, the housing network was using, strangely enough, 100 Mbit/s. When the rate limit was raised to 200 Mbit/s, utilization hit 200 Mbit/s in roughly the time it took to go from the final keystroke to the refreshed statistics display. Doubling the limit again had the same effect, from which I conclude that the housing network will consume exactly as much of the aggregate bandwidth as is made available.
Now I find it hard to believe that any group of students browsing the web, IM-ing their fellow students, or emailing their friends and family could generate precisely that utilization curve. Clearly this is the result of around 30,000 peer-to-peer programs running on their machines. So, as I said, I understand Comcast’s attempted strategy for maintaining some semblance of control over its bandwidth utilization.
What has me seeing red is the fact that I have no control over my own utilization. When browsing the web, in the new ‘World according to Google’, my downlink is jammed up with Flash code, streaming video advertisements, and an enormous amount of crap I really don’t want to see. It is annoying enough to check the weather at Intellicast for the local Doppler radar and have to wait for all the connections to Google Analytics, DoubleClick, and the ad servers to complete before the JavaScript will even run and let me mouse over the drop-down menu to get to the listed radar link.
It used to be that a responsible company, when designing a site on the web, would load the important stuff first in the top third of the screen and run the appropriate scripts immediately so that you could click through fast to get where you needed to be. None of this waiting around for Flash to load and paint pretty, extremely annoying, and totally worthless graphics all over the index page (just to show how clever and artistic their designers are). Damn! If I wanted to see that crap I would go to the local art museum, which by the way has a great section on computer art and generated graphics. And as if Flash weren’t bad enough, there are the video streams for micro-windows of auto advertising, where the designers set the buffering level to 100% before playback even starts. Egad! The same university I spoke of recently redesigned its index page away from a relatively useful portal to a completely annoying top-third Flash graphic, all in the name of “Branding” the university.
If I want to read something, like a news article, at most I need 300-500 bytes in the body section. Instead, I have to wait through a header that loads all the JavaScript functions, then the Flash, then the video buffers, then the fancy scrolling marquees, and finally my 500 bytes. Frantically mousing over where the menus should be and getting squat, watching the bottom left corner where the Google Analytics activity flickers, waiting not-so-patiently for the “Done”: that’s bad enough.
But then to hear that Comcast is going to limit me to some number of bytes down to my browser before they charge me over-limit fees or trim my rate? That’s when the crimson pulses start throbbing down my chin. I vote for an option that provides for in-line culling of crap: go to a web site, and any connection to Google Analytics or a Google ad server gets nulled out before it is counted against your quota. Or better yet: the Better Web and Sanity Seal of Approval. Approved sites, without the crap, get exempted from the quotas.
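To make the culling idea concrete, here is a minimal sketch in Python of what “doesn’t count against your quota” could look like. The cull list, the QuotaMeter class, and the byte counts are all hypothetical illustrations, not anything Comcast has proposed; the point is simply that traffic to known tracking and ad hosts gets nulled out before it is tallied.

```python
# Hypothetical sketch: count only wanted bytes against a download quota,
# nulling out traffic to known ad/tracking hosts before it is tallied.

# Hosts that should never count against the subscriber's quota.
# This list is illustrative, not exhaustive.
CULL_LIST = {
    "google-analytics.com",
    "doubleclick.net",
    "ads.example.com",   # hypothetical ad server
}

def is_culled(host: str) -> bool:
    """True if the host, or any parent domain of it, is on the cull list."""
    parts = host.lower().split(".")
    return any(".".join(parts[i:]) in CULL_LIST for i in range(len(parts)))

class QuotaMeter:
    """Tallies downloaded bytes, skipping culled hosts."""

    def __init__(self, quota_bytes: int):
        self.quota_bytes = quota_bytes
        self.used = 0

    def record(self, host: str, nbytes: int) -> None:
        if is_culled(host):
            return  # nulled out: never counted against the quota
        self.used += nbytes

    def over_limit(self) -> bool:
        return self.used > self.quota_bytes

# Example: one radar page plus the ad traffic that rode in with it.
meter = QuotaMeter(quota_bytes=250 * 1024**3)       # the rumored 250 GByte cap
meter.record("www.intellicast.com", 40_000)          # the page I actually wanted
meter.record("www.google-analytics.com", 15_000)     # ignored
meter.record("doubleclick.net", 120_000)             # ignored
print(meter.used, meter.over_limit())                # 40000 False
```

Whether that filtering lives in the ISP’s meter or in a box at the subscriber’s end is a detail; either way, the subscriber stops paying, in quota, for drivel nobody asked for.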
Either way, slapping quotas on users under the assumption that they are running peer-to-peer, without some consideration for what drivel is forced down the link, is not looking at the whole picture. If I am running LimeWire, then I deserve it. If I am researching patents, or trying to get the latest stock quote, or reading the weather map, and get limited, then I am looking for another provider.