March 15, 2010, 1:11 PM — The Federal Communications Commission's Consumer Broadband Test is drawing criticism from skeptics who call it too simplistic and inaccurate to yield useful data.
The FCC this week launched the testing program on its Broadband.gov Web site to help consumers "test their broadband service and report areas where broadband is not available." Taking the test requires users to list the address they're accessing the Web from, along with whether they are using the Web at home or at a business. From there, the test measures the connection's download and upload speeds as well as its latency and jitter.
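To give a sense of what the latency and jitter figures represent, here is a minimal sketch of how jitter is commonly derived from a series of round-trip latency samples: it is the average change between consecutive measurements. This is an illustration only, not the FCC's or Speedtest.net's actual methodology; the sample values are hypothetical, and in a real test each sample would come from timing a small probe to a test server.

```python
from statistics import mean

def jitter_ms(latencies_ms):
    """Mean absolute difference between consecutive latency samples --
    a common simple definition of jitter."""
    if len(latencies_ms) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
    return mean(diffs)

# Hypothetical round-trip times in milliseconds from five probes.
samples = [42.0, 45.5, 41.2, 58.9, 43.3]
print(round(mean(samples), 1))      # average latency: 46.2 ms
print(round(jitter_ms(samples), 1)) # jitter: 10.3 ms
```

Note how a single congested probe (the 58.9 ms sample) inflates the jitter figure, which is one reason critics say results taken at peak hours can mislead.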
The test has drawn criticism, however, for being too simplistic and not taking several different variables into account. Lauren Weinstein, the co-founder of the People for Internet Responsibility advocacy organization, listed several factors on his blog that the FCC failed to account for, such as the fact that user testing data might become skewed if a user conducts a test during peak congestion hours. Additionally, Weinstein criticized the test for not providing users with information about its underlying server testing infrastructure.
"As anyone who uses speed tests is aware, the location of servers used for these tests will dramatically affect results," Weinstein wrote. "The ability of the server infrastructure can be quite limited depending on the ISPs' own network topologies."
Weinstein concluded that the tests employed by the FCC, which actually use the same testing system as the Speedtest.net Web site, were only useful for helping "categorize users into very broad classes of Internet service tiers" and not much else. This is best illustrated, Weinstein wrote, by the fact that when he conducted the FCC's test on his own connection, the test "showed consistent disparities of 50% to 85%."
Brett Glass, the owner and founder of the Wyoming-based ISP Lariat Networks, also criticized the FCC's speed test for being far too simplistic and said that it couldn't account for some of the smart routing techniques employed on his network to optimize user experience.
"My network routes different types of traffic through different connections, which are optimized for that type of traffic," he said. "But the test doesn't 'know' this. It tries to access random, uncacheable data through our cache and thus gives results which are not typical of user experience… In short, the tests are 'dumb' tests designed for 'dumb' networks."