How can data transmission speed in data centers be increased?

pwarren

Moving data between machines is a constant bottleneck in our data center. During peak periods, congestion can cause significant issues and slow performance noticeably. Short of redesigning the entire architecture, which is far more than I am willing or able to do, what can be done to minimize this bottleneck in a data center?


Answers

2 total
becker
Vote Up (11)

It may be just around the corner - wireless transmission seems a possible way to go once some still-significant problems are ironed out.  IBM was working on wireless transmission using optical signals as far back as the 1960s, and 50 years later we are still looking at it.  It will have to move huge amounts of data over wireless spectrum with the same reliability expected from current fiber optic cable.  Personally, I think it would make more sense to alter the accepted data center hierarchy, so that you don't have tons of hubs feeding into tons of routers in a pyramid fashion - an arrangement that is just about perfect for creating bottlenecks.  But that's like saying I think cars should have six wheels - maybe that's a great idea, but it wouldn't be easy to change the paradigm.  
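To put a rough number on that pyramid problem, here is a minimal back-of-envelope sketch of how oversubscription compounds through a tree-style hierarchy. All of the server counts, link speeds, and uplink counts below are made-up illustrative figures, not measurements from any real data center:

```python
# Back-of-envelope oversubscription estimate for a tree-style data center network.
# Every number below is an illustrative assumption, not a real measurement.

def oversubscription(downlink_count, downlink_gbps, uplink_count, uplink_gbps):
    """Ratio of traffic a switch could receive from below vs. what it can push upward."""
    return (downlink_count * downlink_gbps) / (uplink_count * uplink_gbps)

# Hypothetical access layer: 48 servers at 10 Gbps each, 4 x 40 Gbps uplinks.
access = oversubscription(48, 10, 4, 40)

# Hypothetical aggregation layer: 16 access switches with 4 x 40 Gbps links each,
# feeding 4 x 100 Gbps uplinks toward the core.
aggregation = oversubscription(16 * 4, 40, 4, 100)

print(f"Access layer oversubscription:      {access:.1f}:1")
print(f"Aggregation layer oversubscription: {aggregation:.1f}:1")
print(f"End-to-end (worst case):            {access * aggregation:.1f}:1")
```

Multiplying the ratios at each tier is what makes a deep pyramid painful: cross-rack traffic ends up contending for a small fraction of the bandwidth a server sees within its own rack, which is exactly where the congestion the question describes tends to show up. Flatter designs try to keep those ratios close to 1:1.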

jimlynch
Vote Up (11)

Hi pwarren,

You might want to check out this article, which covers memory virtualization and how it can improve the efficiency of data center resources.

Memory: The Real Data Center Bottleneck
http://virtualization.sys-con.com/node/1061473

"CIOs and IT managers agree that memory is emerging as a critical resource constraint in the data center for both economic and operational reasons. Regardless of density, memory is not a shareable resource across the data center. In fact, new servers are often purchased to increase memory capacity, rather than to add compute power. While storage capacity and CPU performance have advanced geometrically over time, memory density and storage performance have not kept pace. Data center architects refresh servers every few years, over-provision memory and storage, and are forced to bear the costs of the associated space, power and management overhead. The result of this inefficiency has been high data center costs with marginal performance improvement."
