Kenneth Brill, executive director of the Uptime Institute, yesterday wrote an eye-opening commentary on Forbes.com titled “Servers: Why Thrifty Isn’t Nifty.” What really grabbed me was his introductory sentence: “We are currently in the biggest data center construction boom in history.” He postulates that this is partially due to Moore’s Law, which states that the number of transistors on a chip doubles roughly every 24 months. That trend has translated into a boom in IT and, consequently, an increase in global productivity.
Strangely missing from his article, however, is the mention of Cloud Computing, but more on that later.
Granted, his commentary is really targeted at larger corporations and enterprises that are looking to build or use large data centers and need to understand the financial and environmental impact of doing so. He summarizes the dramatic growth in a paragraph that is almost scary to read:
“The number of servers in the U.S. has grown from 5 million in 2000, to 10 million in 2005, to a projected 15 million in 2010. More servers eat up more electricity and energy costs go up. To avoid future energy shortages caused by increasing IT demands, 10 more power plants need to be built to the tune of $2 billion to $6 billion each and their cost is ultimately going to get passed on to IT through increased utility bills.”
Power is a concern for everyone, especially those who run large data centers. ServePath, the parent company of GoGrid, operates a 20,000 square foot facility in San Francisco, where real estate alone is expensive. Many large corporations (such as Google) whose livelihood is server hosting are building tremendous data centers near rivers in order to capitalize on more environmentally-friendly hydro-electric power.
But not everyone has the luxury or financial wherewithal to undertake this type of massive construction, so there must be another solution. Many have chosen traditional hosting instead, but, as Brill points out, the cost of simply hosting a dedicated server is large (and growing). Brill estimates that a $2,500 server (what he calls a “low-end server”), hosted in an optimal-cost location in the U.S., will actually cost between $8,300 and $15,400, depending on the Tier level of the hosting facility.
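To put Brill’s figures in perspective, here is a quick back-of-the-envelope sketch. The dollar amounts come from his estimates; the multiplier calculation itself is just our illustration, not part of his article.

```python
# Brill's figures: a "low-end" $2,500 server actually costs
# $8,300 to $15,400 once hosting is factored in, depending on
# the facility's Tier level. How big is that markup?

STICKER_PRICE = 2_500    # purchase price of the server
TRUE_COST_LOW = 8_300    # Brill's low-end hosted-cost estimate
TRUE_COST_HIGH = 15_400  # Brill's high-end hosted-cost estimate

def cost_multiplier(true_cost: float, sticker: float = STICKER_PRICE) -> float:
    """Return how many times the sticker price the server really costs."""
    return true_cost / sticker

print(f"Low tier:  {cost_multiplier(TRUE_COST_LOW):.1f}x sticker price")
print(f"High tier: {cost_multiplier(TRUE_COST_HIGH):.1f}x sticker price")
```

In other words, by Brill’s numbers the true cost runs roughly 3.3x to 6.2x the hardware’s purchase price, which is why the sticker price of a “cheap” server is such a misleading figure.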