London, U.K. --- (METERING.COM) --- October 26, 2006 – The increase in the use of IT – particularly in the amount of power needed to operate the UK’s 1,500 data centers – could lead to energy shortages in the future, according to research conducted by data center consultants BroadGroup.

An average UK data center consumes more energy in a year than a large city. “The data center is environmentally unfriendly,” says Keith Breed, a research director at BroadGroup. “The IT department has been divorced from reality as hardware costs have come down rapidly while computing power has risen dramatically. However, higher energy costs have not been factored in.”

BroadGroup suggests that energy reduction should be made a core priority within IT departments. Breed estimates that a single UK data center's energy costs will more than double to about £7.4 million (US$13.9 million) a year by 2010, making the UK the most expensive place for IT in Europe.

One problem is that the servers needed to support current IT demand use between three and four times the power of traditional servers. A similar amount of electricity is then required to dissipate the heat these high-end servers generate. Hardware vendors are addressing this by designing technology that can run more applications with less power; however, Breed says companies must both invest in such technology and find ways to use their IT estate more efficiently.
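To see why cooling matters so much, consider a rough back-of-envelope estimate: if cooling draws roughly as much electricity as the servers themselves, as the research suggests, a facility's total demand is about double its IT load. The sketch below illustrates this with entirely hypothetical figures (the load, electricity price, and resulting cost are assumptions for illustration, not numbers from BroadGroup's report).

```python
# Back-of-envelope data center electricity estimate.
# All figures are hypothetical and chosen only for illustration.

it_load_kw = 500.0          # assumed average IT (server) load in kW
cooling_factor = 1.0        # cooling assumed to draw as much power as the IT load
hours_per_year = 24 * 365
price_per_kwh_gbp = 0.08    # assumed mid-2000s UK industrial electricity price

total_load_kw = it_load_kw * (1 + cooling_factor)
annual_kwh = total_load_kw * hours_per_year
annual_cost_gbp = annual_kwh * price_per_kwh_gbp

print(f"Total facility load: {total_load_kw:.0f} kW")
print(f"Annual consumption:  {annual_kwh:,.0f} kWh")
print(f"Annual energy cost:  £{annual_cost_gbp:,.0f}")
```

Under these assumptions, a modest 500 kW server room becomes a 1 MW facility once cooling is counted, consuming roughly 8.8 million kWh a year; doubling the electricity price doubles that bill directly, which is the exposure Breed describes.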

One hardware provider focusing on the design of energy-efficient products is Dell Inc. The company has introduced two PowerEdge™ servers featuring AMD Opteron processors: the PowerEdge 6950 consumes up to 20 per cent less power than previous-generation quad-socket PowerEdge servers, while the PowerEdge SC1435, a dual-socket rack-dense server optimized for high-performance compute clusters, can deliver performance-per-watt improvements of up to 138 per cent.
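For readers weighing such claims, "performance per watt" is simply benchmark throughput divided by power draw, so a 138 per cent improvement means the new ratio is about 2.38 times the old one. A minimal sketch of the calculation follows; the benchmark scores and wattages are invented for illustration and are not Dell's published figures.

```python
# Compare performance per watt of two servers.
# Benchmark scores and power draws below are invented for illustration.

old_score, old_watts = 100.0, 400.0   # hypothetical baseline server
new_score, new_watts = 170.0, 286.0   # hypothetical newer server

old_ppw = old_score / old_watts
new_ppw = new_score / new_watts
improvement_pct = (new_ppw / old_ppw - 1) * 100

print(f"Old: {old_ppw:.3f} score/W, New: {new_ppw:.3f} score/W")
print(f"Performance-per-watt improvement: {improvement_pct:.0f}%")
```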