Think about it: virtualization increases power use

Back to all seriousness. Part of working in the data center world is handling the constant stream of trends, possible trends, and ever newer trends. I posted a 'trends' post earlier this week, and now I have found another one, this time on Processor.com. The nice thing is that every list of trends is different. Where the one I mentioned at the beginning of the week tried to show the intertwinement of all data center aspects (and, unfortunately, fails to do that effectively), the one on Processor is much more focused on the thing that haunts data center professionals the most: efficiency.

Soooooo... another article about 'DCs going green'? Actually, yes, but with some very nice twists. For example, the need for more streaming (video mostly, but cloud computing expands that need to all kinds of computing data) means the demand for bandwidth and storage will keep on growing. So far so predictable, but that also holds implications for your HVAC installation. Sure, you can cram all those systems into a smaller floorspace, but it does not make much sense if your HVAC is not designed for that. So the trend will be that DCs move to new locations even more often, rather than being modernised.

Another trend that can easily slip under the radar is that 'green' shifts from the 'mechanical' components (UPSes, batteries, etc.) to the core IT. Why? Because making mechanical components more efficient at the current pace actually makes them less reliable, John Jankowski from JamCom tells Processor. So instead, data centers are starting to focus on making the IT itself more efficient.

And yes, Processor covers the currently most prominent method of IT efficiency: virtualization. Yes, it can increase power efficiency on a per-rack basis... but it can also increase power use across the entire facility. Why? Computing density. Adding more servers to the same footprint simply increases the strain on the power supply. I recommend this article to everyone here.
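To see how per-rack efficiency and facility-wide power draw can move in opposite directions, here is a minimal sketch. All figures (server wattage, rack counts, servers per rack) are illustrative assumptions of mine, not numbers from the article.

```python
# Assumed average draw per physical server, in watts (illustrative).
WATTS_PER_SERVER = 400

def facility_draw(racks, servers_per_rack, watts=WATTS_PER_SERVER):
    """Total IT power draw in kilowatts for the whole floor."""
    return racks * servers_per_rack * watts / 1000

# Same floorspace, but densification packs more servers per rack:
before = facility_draw(racks=50, servers_per_rack=20)  # 400.0 kW
after = facility_draw(racks=50, servers_per_rack=35)   # 700.0 kW

# Work per server may be more efficient, yet the facility as a whole
# now demands far more from the power and HVAC infrastructure.
print(before, after)
```

The point is simply that the power supply and cooling were sized for the 'before' figure; density moves the strain even when each rack is individually greener.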


Comment by David Cappuccio on July 18, 2010 at 0:40
The key to virtualization is compute per kilowatt. Most x86 applications run standalone on a server, and the average non-virtualized server runs at between 7% and 12% utilization. But that same server is also using up to 65% of its total power. If I can virtualize a server and put 10 or 20 images on it rather than having them all run on their own physical servers, the energy savings alone can justify the project. So, virtualization does NOT increase energy consumption - but it may shift the density requirements across the data center floorspace.
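The consolidation arithmetic in the comment above can be sketched as follows. The 65% figure and the 10-to-20-images range come from the comment; the peak server wattage and the exact image count are illustrative assumptions of mine.

```python
# Assumed peak draw of one x86 server, in watts (illustrative).
PEAK_WATTS = 500
# A lightly loaded server still draws ~65% of its peak (from the comment).
IDLE_FRACTION = 0.65

def standalone_draw(n_apps):
    """Total draw when each application runs on its own physical server,
    each mostly idle but still pulling 65% of peak."""
    return n_apps * PEAK_WATTS * IDLE_FRACTION

def consolidated_draw(n_apps, images_per_host=10):
    """Total draw after packing the same applications as virtual machine
    images onto hosts, assuming each host then runs near peak."""
    hosts = -(-n_apps // images_per_host)  # ceiling division
    return hosts * PEAK_WATTS

print(standalone_draw(20))    # 6500.0 W across 20 mostly idle servers
print(consolidated_draw(20))  # 1000 W on 2 fully loaded hosts
```

Under these assumptions the consolidated setup draws a fraction of the standalone one, which is the compute-per-kilowatt argument: total energy falls, but that remaining draw is now concentrated in far fewer, denser racks.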
