OK, I was on vacation this week, so I didn't write as much, and I wasn't planning on doing much work. But I did make a call to Microsoft's Christian Belady to discuss some energy-saving ideas from the top of Stevens Pass.
If you haven't skied Stevens Pass, on the backside you ski right under the high-voltage power lines from the Columbia Basin's hydroelectric dams, and in some areas you can get close enough that you'll feel a slight charge holding your pole up in the air. So, thinking about power feels like the right thing to do.
After talking to Christian, I kept thinking about his point that energy efficiency drives higher consumption, and looking for a specific case that illustrates it.
An interesting scenario is virtualization. If you take an existing IT environment and virtualize the servers, you assume a reduction in energy costs. But once the physical-server barrier is removed and users can spin up a VM far more easily than they could ever requisition a physical server, how long before new VMs are created at a faster rate than physical servers ever were?
Will Energy-Efficient VMs drive higher energy consumption over the long run?
It would be interesting to know what happens to energy consumption after people have virtualized their environments, enjoyed the initial energy savings, and their users have adjusted to how easy it is to create a new VM.
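To make that worry concrete, here is a quick back-of-envelope sketch in Python. Every number in it is an assumption I made up for illustration (the consolidation ratio, the host wattage, the rate of VM sprawl); the only point is the shape of the curve: if VM creation compounds because it's so easy, the consolidated environment eventually draws more power than the physical one it replaced.

```python
# Back-of-envelope sketch of the rebound effect. All numbers are
# illustrative assumptions, not measured data.

SERVERS_BEFORE = 100        # physical servers before virtualization (assumed)
WATTS_PER_SERVER = 400      # average draw per physical server (assumed)
CONSOLIDATION_RATIO = 10    # VMs packed onto each virtualization host (assumed)
WATTS_PER_HOST = 500        # a beefier virtualization host draws a bit more (assumed)
VM_GROWTH_PER_YEAR = 0.40   # annual VM sprawl once creating one is easy (assumed)

baseline_kw = SERVERS_BEFORE * WATTS_PER_SERVER / 1000.0
vms = SERVERS_BEFORE        # day one: one VM per old physical server

for year in range(8):
    hosts = -(-vms // CONSOLIDATION_RATIO)      # ceiling division: hosts needed
    kw = hosts * WATTS_PER_HOST / 1000.0
    print(f"Year {year}: {vms:5d} VMs on {hosts:3d} hosts -> {kw:6.1f} kW "
          f"({kw / baseline_kw:5.1%} of the pre-virtualization {baseline_kw:.0f} kW)")
    vms = int(vms * (1 + VM_GROWTH_PER_YEAR))
```

With these made-up numbers the savings look great on day one, about an eighth of the original power draw, but 40% annual VM growth pushes the environment past its pre-virtualization consumption in roughly seven years.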
Are people missing the point because they are not thinking about life-cycle management of servers, and about what virtualization does to it?