By Jared Newman | Monday, May 4, 2009 at 2:29 pm
Last week, The Times created a minor panic by reporting that we’re on track to run out of bandwidth by 2012, pointing to a study that projects demand outstripping supply.
Unfortunately, the story is peppered with inaccuracies and sensationalizes the problem, according to one of the study’s researchers who spoke with me. Let’s start with the fact that the study by Nemertes Research isn’t “to be published later this year,” as The Times says, but rather dates to late 2008. More importantly, the claim that “cyberspace is filling up” is based on faulty assumptions about the research.
Nemertes’ study isn’t based on a model of actual bandwidth use, Senior Research Analyst Ted Ritter explained, because Internet service providers don’t make that data publicly available. As an alternative, Nemertes tracked the growth of computers, phones and other Web-enabled devices, and based its projections on how much bandwidth they could consume.
But here’s the thing: the term “demand,” as the research uses it, doesn’t describe actual consumer use. If, for example, I subscribe to a lower bandwidth or speed tier than my Internet service provider could physically deliver, Nemertes doesn’t account for that. “We’re very clear in our research we did not base this on economics,” Ritter said. “It’s just a pure assessment of what’s the potential demand and what’s the capacity.”
In other words, it’s less of a real-world model than a description of what’s possible if everyone with Internet access gorged themselves on bandwidth, and all the world’s service providers were cool with that.
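To make the distinction concrete, here’s a minimal sketch of what a “potential demand” estimate of this kind might look like. This is my own hypothetical illustration, not Nemertes’ actual model, and every number in it is made up.

```python
# Hypothetical sketch of a "potential demand vs. capacity" estimate,
# in the spirit of the methodology described above. Not Nemertes' model;
# all figures are invented for illustration.

def potential_demand_gbps(devices: int, max_rate_mbps: float,
                          peak_fraction: float) -> float:
    """Upper-bound demand if every device used its full access speed
    during the peak fraction of the day (Mbps converted to Gbps)."""
    return devices * max_rate_mbps * peak_fraction / 1000.0

# Imaginary inputs: 10 million devices on 10 Mbps connections,
# with 20% of them active at once during peak hours.
demand = potential_demand_gbps(10_000_000, 10.0, 0.2)  # 20,000 Gbps
capacity = 15_000.0  # assumed available capacity, in Gbps

print(f"potential demand: {demand:,.0f} Gbps; capacity: {capacity:,.0f} Gbps")
print("exhausted" if demand > capacity else "headroom remains")
```

The point of the sketch is that a model like this says nothing about what users actually consume or pay for; it only bounds what the installed devices *could* pull if nothing held them back, which is exactly why its estimates run higher than real-world usage data.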
Unsurprisingly, when Nemertes compared its methodology to publicly available usage data from the turn of the century, the study’s estimates on demand were a bit higher than the actual figures. Nonetheless, researchers felt the estimates were sound enough to make predictions about future supply and demand.
With all that in mind, there is some merit to the study’s claims. Consumers will inevitably demand more bandwidth as the Internet becomes a better vessel for streaming media, and as that happens, infrastructure improvements will be necessary. Nemertes is concerned not with the core infrastructure, which can be upgraded relatively easily, but with the “last mile” of cables that connect individual homes to a larger distribution point; replacing those cables takes far more manpower and money. Whether we’ll face regular brownouts and a dearth of innovative but bandwidth-hungry Web applications, as the study claims, is up for debate, because external factors must be considered.
To that end, it’s disappointing that Nemertes’ study shies away from the issue of bandwidth restrictions, because that’s what really makes this conversation relevant now. (The study makes no recommendations that might alienate its “Fortune 200” backers; Ritter later clarified that such recommendations are simply not within the purview of the project.) Time Warner Cable wants to experiment with tiered Internet pricing, but backlash from consumers and politicians put the idea on ice. Comcast has a well-known bandwidth cap of 250 GB per month. Other ISPs have hidden limits and only inform customers when they hit the ceiling.
Unfortunately, Ritter and the study don’t have much to say about these important developments. I’ve read several stories that argue against tiered pricing on the simple rationale that ISPs are earning more revenue than ever while costs decline, but these superficial analyses don’t account for the cost of a last-mile overhaul, or ask whether we even need one, and that’s part of the problem too.
Until we have a more holistic model of Internet use, quibbling over the date of our Internet doomsday is pointless.