The speculation that Comcast is going to impose download quotas on their Internet services has got me seeing red and has blood shooting out of my nose. Granted, a 250 GByte monthly cap seems reasonable, and I really do understand their concerns with peer-to-peer, but there is more here that should be addressed.
Now, as it happens, I have access to certain information about the network utilization at a large state university. The campus, having recently completed an expensive DWDM fiber ring to a major city, now has access to commodity rates for Internet connectivity and in aggregate has around 3 Gb/s available. The campus housing system, which consists of dorms and Greek houses, is attached to the Internet on a rate-limited basis. When the limit was 100 Mb/s, the housing network was using, strangely enough, 100 Mb/s. When the rate limit was increased to 200 Mb/s, utilization jumped to 200 Mb/s in the time between pressing the final keystroke and displaying the new statistics. Doubling the limit again produced the same result, from which I conclude that the housing network will consume exactly as much aggregate bandwidth as is made available.
Now I find it hard to believe that any group of students browsing the web, IM-ing their fellow students, or emailing their friends and family could generate precisely this utilization statistic. Clearly this is the result of around 30,000 peer-to-peer programs running on their machines. So, as I said, I understand Comcast's attempted strategy for maintaining some semblance of control over their bandwidth utilization.
What has me seeing red is the fact that I have no control over my utilization. When browsing the web, in the new 'World according to Google', my downlink is jammed up with Flash code, streaming video advertisements, and an enormous amount of crap I really don't want to see. It is annoying enough to check the weather at Intellicast for the local Doppler radar and have to wait for all the connections to Google Analytics, DoubleClick, and the ad servers to complete before the JavaScript executes and finally lets me mouse over the drop-down menu to get to the listed radar link.
It used to be that a responsible company, when designing a site on the web, would load the important stuff first in the top third of the screen and run the appropriate scripts immediately, so that you could click through fast to get where you needed to be. None of this waiting around for Flash to load and paint pretty, extremely annoying, and totally worthless graphics all over the index page (just to show how clever and artistic their designers are). Damn! If I wanted to see that crap I would go to the local art museum, which, by the way, has a great section on computer art and generated graphics. And if Flash weren't bad enough, there are video streams for micro-windows of auto advertising where the designers set the buffering level at 100% before playback. Egad! The same university I spoke of recently redesigned its index page away from a relatively useful portal to a completely annoying top-third Flash graphic, all in the name of "Branding" the university.
If I want to read something, like a news article, at most I need 300-500 bytes in the body section. Instead, I have to wait through a header that loads all the JavaScript functions, then the Flash loads, the video buffers, the fancy scrolling marquees crawl by, and finally my 500 bytes arrive. Frantically mousing over where the menus should be and getting squat, watching the bottom left where the Google Analytics activity flickers, waiting not-so-patiently for the "Done": that's bad enough.
But then to hear that Comcast is going to limit me to some number of bytes down to my browser before they charge me overlimit fees or trim my rate: crimson pulses throb down my chin. I vote for an option that provides in-line culling of crap. Go to a web site and any connection to Google Analytics or a Google ad server gets nulled out before it is counted against your quota. Or better yet: the Better Web and Sanity Seal of Approval. Approved sites, without the crap, get exempted from the quotas.
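The in-line culling idea amounts to a hostname blocklist consulted before any bytes are fetched. A minimal sketch in Python, assuming a hand-maintained list of ad/analytics domains (the entries and hostnames below are illustrative, not a real filter list):

```python
# Hypothetical blocklist of ad/tracking domains; any subdomain of an
# entry is also culled before it can count against the quota.
BLOCKLIST = {"google-analytics.com", "doubleclick.net"}

def should_fetch(hostname: str) -> bool:
    """Return False for the host itself or any parent domain on the list."""
    parts = hostname.lower().split(".")
    return not any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

# Example: only the content host survives; the trackers are nulled out.
requests = ["www.intellicast.com", "ssl.google-analytics.com", "ad.doubleclick.net"]
fetched = [h for h in requests if should_fetch(h)]
```

An ISP-side version would do the same matching on DNS or connection metadata before metering; the point is simply that culling must happen before counting.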
Either way, slapping quotas on users under the assumption that they are running peer-to-peer, without some consideration for the drivel forced down the link, is not looking at the whole picture. If I am running LimeWire, then I deserve it. If I am researching patents, or trying to get the latest stock quote, or reading the weather map, and get limited, then I am looking for another provider.
September 21, 2008 at 11:16 pm
I appreciate you trying to live up to the name of the site, but this particular rant is somewhat overblown.
1) Comcast has had these limits forever. Their only change in policy is to tell you what they are in advance, instead of only telling you when you’ve crossed them. They haven’t advertised their internet service with the word “unlimited” in quite a while.
2) Web pages with tons of crap around very little content are a problem because of the delays and annoyance you mention, but that's unrelated to the cap. I would be shocked if the unintentionally downloaded material (for a heavy web user with no ad blocking) exceeded 500 MB in a month, or 0.2% of the cap. I would consider even 10 GB to have basically no effect on whether you hit the cap or not... if you are that close, it's because of something else. The most legitimate thing you could complain about would be high-quality Netflix streaming video, but that would still be quite a lot of movies before you got close to 250 GB.
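A quick back-of-envelope check of those numbers (the per-movie size is an assumed figure for illustration; only the 250 GB cap and the 500 MB / 0.2% estimate come from the comment itself):

```python
cap_gb = 250
incidental_mb = 500                                # a heavy month of ad/tracking cruft
fraction_of_cap = incidental_mb / (cap_gb * 1000)  # 0.002, i.e. 0.2% of the cap

movie_gb = 2                                       # assumed size of one streamed film
movies_to_cap = cap_gb // movie_gb                 # about 125 movies in a month
```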
3) Anyone having problems with obnoxious ads should just get an ad blocker. If you use Firefox, there is a nice extension called AdBlock Plus that blocks ads and even updates the types of ads it blocks automatically when there are new ones. You can set exceptions for sites with relevant, tasteful ads that you don't mind. I'm sure most other browsers have similar functionality, and most blockers do their work before you even download the ad, so you keep what little bandwidth you would have wasted on it.
What you should really be complaining about is the *way* that Comcast "throttles" peer-to-peer applications like BitTorrent. First of all, if they have a cap, they should let people figure out on their own how to stay below it. But if they are going to throttle, they should just do regular throttling: rate-limit at peak times and let it loose in the middle of the night, that sort of thing.
Instead, they are basically "hanging up" random connections by forging reset packets: each side receives a message, apparently from the other, claiming the connection has been closed.
It’d be like the phone company jumping into your conversation and saying to each person “nice talking to you, bye” when they needed to free up a line. THAT’s what has the tech community so upset with Comcast, and why their throttling practices are being investigated. This announcement about caps is just a side note.
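The mechanism described above was, in practice, a forged TCP reset (RST) segment injected into the stream. As a hedged illustration, here is a Python sketch of what such a segment's 20-byte header looks like; the ports and sequence number are made up, and the checksum is left zero rather than computed over a real pseudo-header:

```python
import struct

def forge_rst(seq: int) -> bytes:
    """Build a bare TCP header with only the RST flag set (illustrative values)."""
    src_port, dst_port = 6881, 51413   # e.g. a BitTorrent connection (assumed ports)
    offset_flags = (5 << 12) | 0x04    # data offset = 5 words; flag bits = RST only
    return struct.pack("!HHIIHHHH",
                       src_port, dst_port,
                       seq,            # must fall inside the victim's receive window
                       0,              # ack number (unused: ACK flag not set)
                       offset_flags,
                       0,              # window size
                       0,              # checksum (left zero in this sketch)
                       0)              # urgent pointer

segment = forge_rst(0x12345678)
```

A receiving TCP stack that accepts this segment tears the connection down immediately, which is exactly the "nice talking to you, bye" effect described above.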