Are Cloud Computing Data Centers Green? IBM announces its greenest Cloud Computing DC in North Carolina

I’ve been writing more about cloud computing because cloud data centers are more efficient, using fewer resources.  Here is IBM’s latest press release demonstrating that cloud computing is green.

The data center uses advanced software virtualization technologies that enable access to information and services from any device with extremely high levels of availability and quality of experience.  The facility aggressively conserves energy resources, saving cost and speeding services deployment through a smart management approach that links equipment, building systems and data center operations.

“I thank IBM for its continued commitment to North Carolina. This facility promises to be one of IBM's greenest data centers in the world, proving once again that green is gold for North Carolina,” Gov. Bev Perdue said. “Growing North Carolina’s green economy plays a critical role in my mission to create jobs and to ensure our state’s economy is poised to be globally competitive in the long term.”

Just as I’ve discussed these ideas while working with the University of Missouri, IBM has taken the same approach working with North Carolina universities.

The data center is showcasing a cloud computing solution in partnership with North Carolina Central University (NCCU) and NC State University that enables Hillside New Tech High School students in Durham, NC to access educational materials and software applications for the classroom over the Internet from the high school’s computer lab, as well as from any networked device.  This means that the learning environment can be extended to nearly any place at any time without the restrictions many schools face such as limited support, hardware resources and lack of access. The Hillside outreach project with NCCU, using cloud computing as a vehicle in support of education, is one of several such K-12 projects that IBM supports.  The new data center also currently hosts IBM’s global web site, ibm.com, and the IT operations of strategic outsourcing clients such as the United States Golf Association (USGA).

The green features are listed here.


  • Smarter data center management:
      Thousands of sensors, connecting IT equipment, data center and building automation systems, provide data that can be analyzed to plan future capacity, conserve energy and maintain operations in the event of a power outage.
  • Energy efficiency: The data center costs half as much in energy to operate as data centers of similar size by taking advantage of free cooling – using outside air to cool the data center.  Intelligent systems use sensors to continuously read temperature and relative humidity throughout the data center and dynamically adjust cooling in response to changes in demand.
  • Cloud computing capability:  Support for cloud computing workloads allows clients to use only the resources necessary to support their IT operations at any given moment - eliminating the need for up to 70 percent of the hardware resources that might previously have been needed to perform the same task. The data center also hosts the recently announced “Smart Business” cloud computing offerings - each of these solutions can reduce a client’s total cost of ownership by up to 40 percent.
  • Built for expansion: Due to an innovative modular design method, IBM will be able to add significant future capacity in nearly half the time it would take traditional data centers to expand.  This design/build method – called IBM Enterprise Modular Data Center  (IBM EMDC) – also enables IBM to rapidly scale capacity to meet demand by adding future space, power, and cooling to the data center with no disruption to existing operations.  This means up to 40 percent of capital costs and up to 50 percent of operational costs may be deferred until client demand necessitates expansion.  The new data center can also quickly and seamlessly expand its power and cooling capacity.
  • New building standards: IBM started building the data center in August 2008, and it began supporting client operations within 15 months, compared to the industry benchmark of 18-24 months.
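A quick sanity check on that 70 percent claim: consolidation arithmetic alone gets you into that range. The utilization numbers below are hypothetical, not IBM's sizing model.

```python
import math

# Hypothetical consolidation sketch: if dedicated servers average 15%
# utilization and virtualized hosts can safely run at 60%, the number
# of physical machines needed for the same work drops sharply.

def servers_needed(workloads: int, avg_util: float, target_util: float) -> int:
    """Physical hosts required to carry `workloads` dedicated-server
    equivalents when each host is driven to `target_util` instead of
    the original `avg_util`."""
    total_demand = workloads * avg_util           # aggregate CPU demand
    return math.ceil(total_demand / target_util)  # hosts at target load

before = 100                                   # dedicated servers
after = servers_needed(before, 0.15, 0.60)     # consolidated hosts
print(f"{before} -> {after} servers ({1 - after / before:.0%} fewer)")
```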

In constructing the new data center, IBM renovated an existing building on its Research Triangle Park campus by reusing 95 percent of the original building's shell, recycling 90 percent of the materials from the original building and ensuring that 20 percent of newly purchased material came from recycled products.  The result lowered costs and reduced the carbon footprint associated with construction by nearly 50 percent, allowing IBM to apply for Leadership in Energy and Environmental Design (LEED) Gold certification. LEED is a third-party certification program and the nationally accepted benchmark for the design, construction and operation of high performance green buildings.

Read more

Air Force and IBM partner to prove Cloud Computing works for Defense and Intelligence services

One of the top concerns about Cloud Computing is security of the data in the cloud.  IBM has a press announcement on the partnership here.

U.S. Air Force Selects IBM to Design and Demonstrate Mission-Oriented Cloud Architecture for Cyber Security

Cloud model will introduce advanced cyber security and analytics technologies capable of protecting sensitive national data

ARMONK, N.Y. - 04 Feb 2010: The U.S. Air Force has awarded IBM (NYSE:IBM) a contract to design and demonstrate a secure cloud computing infrastructure capable of supporting defense and intelligence networks. The ten-month project will introduce advanced cyber security and analytics technologies developed by IBM Research into the cloud architecture.

There are press articles too.

CNet News

Air Force taps IBM for secure cloud

by Lance Whitney

IBM has a tall order from the U.S. Air Force--create a cloud network that can protect national defense and military data.

Big Blue announced Thursday a contract from the Air Force to design and demonstrate a cloud computing environment for the USAF's network of nine command centers, 100 military bases, and 700,000 personnel around the world.

The challenge for IBM will be to develop a cloud that can not only support such a massive network, but also meet the strict security standards of the Air Force and the U.S. government. The project will call on the company to use advanced cybersecurity technologies that have been developed at IBM Research.

and Government Computer News.

What I find interesting is how few authors reference the IBM press release.  The goal of the project is a technical demonstration.

"Our goal is to demonstrate how cloud computing can be a tool to enable our Air Force to manage, monitor and secure the information flowing through our network," said Lieutenant General William Lord, Chief Information Officer and Chief, Warfighting Integration, for the U.S. Air Force. "We examined the expertise of IBM's commercial performance in cloud computing and asked them to develop an architecture that could lead to improved performance within the Air Force environment to improve all operational, analytical and security capabilities."

That quote is cut and pasted into the CNet news article as well.

On the other hand, there are some good insights by Larry Dignan on his ZDnet blog.

What’s in it for IBM? Cloud computing has a lot of interest, but security remains a worry for many IT buyers. If Big Blue can demonstrate cloud-based cyber security technologies that are good enough for the military, it would allay a lot of those worries.

The advanced cyber security and analytics technologies that will be used in the Air Force project were developed by IBM Research (statement).

According to IBM the project will show a cloud computing architecture that can support large networks and meet the government’s security guidelines. The Air Force network includes almost 100 bases and 700,000 active military personnel.

Larry continues with the key concepts of what will be shown.  Models!!! Yea!

  • The model will include autonomic computing;
  • Dashboards will monitor the health of the network second-by-second;
  • If Air Force personnel don’t shift to a “prevention environment” during a cyber attack, the cloud will have automated services to lock the network down.

Read more

    Elastra’s Cloud Computing Application Infrastructure = Green IT with a Model approach

    Elastra connects the power use in the data center to the application architects and deployment decision makers.

    Plan Composer function lets customers set their own policies based on application needs and specific power metrics (such as wattage, PUE, number of cores, etc.). Therefore, if an application requires 4GB of RAM and two cores for optimal performance, and if the customer is concerned with straight wattage, Elastra’s product will automatically route it to the lowest-power 4GB, dual-core virtual machine available.
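    Here is a minimal sketch of that kind of policy-based placement. The data structures and function are invented for illustration; Elastra's actual Plan Composer interface is not shown here.

```python
# Hypothetical sketch of policy-driven placement: given an app's
# requirements and a power metric to minimize, pick the cheapest
# candidate machine that satisfies the requirements.

machines = [  # candidate VMs (invented inventory)
    {"name": "vm-a", "ram_gb": 8, "cores": 4, "watts": 310},
    {"name": "vm-b", "ram_gb": 4, "cores": 2, "watts": 180},
    {"name": "vm-c", "ram_gb": 4, "cores": 2, "watts": 150},
]

def place(ram_gb: int, cores: int, metric: str = "watts") -> dict:
    """Return the candidate with the lowest `metric` that meets the
    RAM and core requirements."""
    fits = [m for m in machines
            if m["ram_gb"] >= ram_gb and m["cores"] >= cores]
    return min(fits, key=lambda m: m[metric])

# The 4GB / two-core example from the text routes to the lowest-power fit.
print(place(4, 2)["name"])  # -> vm-c
```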

    Gigaom has a post on Elastra’s Cloud Computing infrastructure addressing greener services.

    Elastra Makes Its Cloud Even Greener

    By Derrick Harris, Jan. 12, 2010, 2:51pm


    Elastra has incorporated energy efficiency intelligence into its Cloud Server solution, allowing customers to define which efficiency metrics are important to them and then rely on the software to route each application to the optimal resources with their internal cloud environments. Elastra’s efforts are just the latest in a growing trend toward saving data center costs by using the least possible amount of power to accomplish any given task. Especially in the internal cloud space, power management capabilities are becoming a must-have, with vendors from Appistry to VMware offering tools to migrate workloads dynamically and power down unneeded servers.

    Digging into the press release, I found Elastra uses a modeling approach.

    Elastra accomplishes this through two technologies available in the product. The first technology is the ECML and EDML semantic modeling languages. ECML is a language used to describe an application (software, requirements, and policies) and EDML is used to describe the resources (virtual machines, storage, and network) available in a data center. These languages can be easily extended to enhance the definition of the applications and resources.

    These modeling languages coupled with the Plan-Composer in the Elastra Cloud Server enables users to synthesize a plan for execution. The Plan-Composer analyzes the proposed application designs (expressed thru ECML) and data center resources (expressed thru EDML), comparing them against a library of actions and outcomes. It then generates a plan based on the energy efficiency policies of the organization that can be executed by the Cloud Server against a customer’s infrastructure.
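    In spirit, the Plan-Composer matches an application model against a resource model and emits an ordered plan. A toy Python analogue, with invented structures standing in for real ECML/EDML documents:

```python
# Toy analogue of ECML/EDML plan synthesis: an application model
# (requirements + policy) is matched against a datacenter model and
# turned into an executable list of actions.

app = {"name": "billing", "ram_gb": 4, "cores": 2, "policy": "min_watts"}
datacenter = [
    {"host": "h1", "ram_gb": 16, "cores": 8, "watts_per_core": 40},
    {"host": "h2", "ram_gb": 8,  "cores": 4, "watts_per_core": 25},
]

def compose_plan(app, resources):
    """Pick the host that satisfies the app model under its energy
    policy, then emit the deployment steps as plain action strings."""
    fits = [r for r in resources
            if r["ram_gb"] >= app["ram_gb"] and r["cores"] >= app["cores"]]
    host = min(fits, key=lambda r: r["watts_per_core"])  # min_watts policy
    return [
        f"allocate {app['cores']} cores / {app['ram_gb']}GB on {host['host']}",
        f"deploy {app['name']} to {host['host']}",
        f"start {app['name']}",
    ]

for step in compose_plan(app, datacenter):
    print(step)
```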

    The cool part is Elastra uses OWL and RDF to support their modeling approach.

    Elastic Modeling Languages

    The Elastic Modeling Languages are a set of modular languages, defined in OWL v2, that express the end-to-end design requirements, control and operational specifications, and data centre resources & configurations required to enable automated application deployment & management.

    While the foundation of the modeling languages is in OWL and RDF, developers can interoperate with the Elastra Cloud Server through its RESTful interfaces; all functions available to the Elastra Workbench are available through this interface, which is based on Atom collections and serialized JSON, XML, or RDF (XML or Turtle) entries.

    Declarative models are useful ways to drive complexity out of IT application design and configuration, in favor of more concise statements of intent. Given a declaration of preferences or constraints, an IT management system can compose multiple models together much more effectively than if the models were predominantly procedural, and also formally verify for conflicts or mistakes. On the other hand, not everything can be declarative; at some point, procedures are usually required to specify the “last mile” of provision, installation, or configuration.

    Read more

    Starting a cultural change in IT, think about power as a precious resource, 2 monitoring tools

    Coming from the Gartner Data Center Conference, where energy efficiency was regularly discussed, it is easy to think that what needs to be done is to tell people they need to change.

    The conference is still going on, but I am back home. And, have time to think.

    24 hours ago I had this view.

    image

    Now I have this view working from home. 

    image

    Cultural problem: getting people to measure power

    Someone at the Gartner conference asked me how to bridge the energy-monitoring gap between IT and facilities when organizational obstacles block collaboration.  There are plenty of people at Gartner, and among the vendors, ready to advise a top-down approach where energy monitoring gets put in place through big equipment deployments, monitoring software and consulting hours. 

    But let me contrast a simple approach to the problem that doesn’t require a bunch of consultants.  Why contrast a different approach?  Because I would rather sit at home and think of cool things than spend 50% of my time or more sitting in conference rooms on the road.  Which is also a lot greener.

    So, let’s start with some ideas that a typical consultant is not going to tell you.

    People don’t want to change

    People don’t want to change their behaviors, and change is resisted for illogical reasons.  I could go into the illogical explanations, but that is a whole long post.  An example of the problem is the resistance to implementing and sharing information across IT and facilities on the power used by various parts of the data center infrastructure and IT equipment.

    How do you address the resistance?  I fall back on ideas from my Aikido training, where a sensei (teacher) explains that seeing where there is already movement and blending with that motion is much easier than starting movement from nothing.

    Changing people’s thinking is difficult until they start to move their own thoughts. So, look for those who are already moving.

    I have been surprised numerous times to find people who have wanted to measure the energy consumption of IT equipment and data center infrastructure, but they didn’t have the tools or support.

    Seed the motivated with equipment

    Two pieces of equipment to consider are circuit-monitoring clamps and power-monitoring power strips.

    Mike Manos blogged his experience using a non-intrusive clamping device to measure power.

    I received a CL-AMP IT package from the Noble Vision Group to review and give them some feedback on their kit.   The first thing that struck me was that this kit seemed to essentially be a power metering for dummies kit.    There were a couple of really neat characteristics out of the box that took many of the arguments I usually hear right off the table.


    First, the “clamp” itself is a non-intrusive, non-invasive way to get accurate power metering and results.   This means contrary to other solutions I did not have to unplug existing servers and gear to be able to get readings from my gear or try and install this device inline.  I simply clamped the power coming into the rack (or a server) and POOF! I had power information. It was amazingly simple. Next up - I had heard that clamp-like devices were not as accurate before, so I did some initial tests using an older IP-addressable power strip which allowed me to get power readings for my gear.   I then used the CL-AMP device to compare and they were consistently within +/- 2% of each other.  As far as accuracy, I am calling it a draw because to be honest it's a garage-based data center and I am not really sure how accurate my old power strips are.   Regardless, the CL-AMPs allowed me a very easy way to get my power readings without disrupting the network.  Additionally, it's mobile, so I could move it around if I wanted to.  This is important for those that might be budget-challenged, as the price point for this kit would be far cheaper than a full-blown branch circuit solution.
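    Manos's ±2 percent cross-check is worth copying whenever you evaluate a meter. A few lines of Python, with made-up readings, show the comparison:

```python
# Cross-check two power meters the way the quote describes: compare
# paired readings and flag anything outside a 2% tolerance.

clamp_readings = [412, 398, 405, 420]   # watts, hypothetical clamp data
strip_readings = [408, 401, 399, 428]   # watts, same loads via power strip

def within_tolerance(a: float, b: float, pct: float = 2.0) -> bool:
    """True if a and b differ by no more than pct percent of their mean."""
    mean = (a + b) / 2
    return abs(a - b) / mean * 100 <= pct

for c, s in zip(clamp_readings, strip_readings):
    ok = within_tolerance(c, s)
    print(f"clamp={c}W strip={s}W  {'OK' if ok else 'CHECK METERS'}")
```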

    For monitoring individual IT equipment you can use a power-monitoring strip like Raritan’s.  Here is an 8-port device.

    Dominion PX CR8-15

    Raritan's Dominion® PX Intelligent Remote Power Management Solutions help IT administrators improve uptime and staff productivity, save money and improve utilization of power resources.

    With the Dominion PX:

    • Emergencies can be resolved with remote serial and TCP/IP access to outlet-level switching, improving MTTR.
    • Capacity planning is simplified with unit-level and outlet-level power utilization information.
    • Staff can gather detailed power information to improve uptime and productivity.
    • Travel costs and time can be saved with remote power cycling and monitoring.

    Information provided by the Dominion PX — displayed at the strip via an LED display, and remotely through a Web browser — can be used to improve capacity planning through power consumption information for both the PDU and individual receptacle. Precise, outlet-level access and control allows users to reboot attached devices.
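    Outlet-level data like the Dominion PX provides makes rack capacity planning straightforward, however you collect the readings. A sketch with invented numbers (the 80 percent derating is the standard NEC continuous-load rule):

```python
# Capacity-planning sketch: given per-outlet watt readings from a rack
# PDU, report utilization against the branch circuit's safe limit.

circuit_volts = 208
circuit_amps = 30
derate = 0.80  # NEC continuous-load rule: plan to 80% of breaker rating

outlet_watts = {"web1": 240, "web2": 255, "db1": 610, "switch": 90}

safe_capacity = circuit_volts * circuit_amps * derate  # watts available
load = sum(outlet_watts.values())
headroom = safe_capacity - load

print(f"load: {load}W / safe capacity: {safe_capacity:.0f}W")
print(f"headroom: {headroom:.0f}W ({load / safe_capacity:.0%} utilized)")
```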

    There are many choices out there, and the above two will get you started on your search.

    Use a viral strategy

    I was talking about viral strategy and a person said, “I don’t get it.  What is viral?”  Here is a good explanation of viral ideas.

    What makes an idea viral?

    For an idea to spread, it needs to be sent and received.

    No one "sends" an idea unless:
    a. they understand it
    b. they want it to spread
    c. they believe that spreading it will enhance their power (reputation, income, friendships) or their peace of mind
    d. the effort necessary to send the idea is less than the benefits

    No one "gets" an idea unless:
    a. the first impression demands further investigation
    b. they already understand the foundation ideas necessary to get the new idea
    c. they trust or respect the sender enough to invest the time

    This explains why online ideas spread so fast but why they're often shallow. Nietzsche is hard to understand and risky to spread, so it moves slowly among people willing to invest the time. Numa Numa, on the other hand, spread like a toxic waste spill because it was so transparent, reasonably funny and easy to share.

    Buy some of these tools and give them to the people who want to measure energy consumption.  Tell them that if they know of someone else who can use the tools, they can request additional equipment.  The one thing you ask in return is a report on the energy consumption they discover for their devices.

    As you discover useful information start to share the information. You will discover some interesting data.

    What are you after?  A cultural shift where people regularly talk about the kilowatts used by systems, where there is waste, and where there are efficiencies.

    Keep in mind there is a viral aspect to the ideas. I wrote an article for Microsoft’s TechNet magazine last year.  Look at the figure below.  There was a network switch that consumed 100 watts when powered off vs. 350 watts when on.  This is an example of something that gets people’s attention.

    Figure 4 Power-consumption comparison of on versus off
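    Numbers like that switch become much more persuasive once you annualize them. The arithmetic, assuming an electricity rate of roughly $0.10/kWh:

```python
# Annualize the TechNet example: a switch drawing 100W "off" vs 350W on.

HOURS_PER_YEAR = 8760
RATE = 0.10  # $/kWh, assumed average rate

def annual_cost(watts: float) -> float:
    """Dollars per year to run a constant load of `watts`."""
    return watts / 1000 * HOURS_PER_YEAR * RATE

off, on = 100, 350
print(f"powered 'off': ${annual_cost(off):.2f}/yr")
print(f"on:            ${annual_cost(on):.2f}/yr")
print(f"phantom draw alone: {off * HOURS_PER_YEAR / 1000:.0f} kWh/yr")
```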

    You are driving for the same behavior change you see in Prius drivers, who watch the car’s instant MPG readout and how the hybrid system is running.

    Formalizing the power monitoring and data collection

    After you get some momentum you want to start to bring some structure in power monitoring data collection.  Here are some areas I would suggest next.

    1. What is the actual power consumption of the device at idle, off, under load, peak, and expected loads?
    2. What are the expected power changes in the minimum and maximum configurations vs. the planned one?
    3. Can any of the components be upgraded for energy efficiency: hard drives, power supplies, or processors?
    4. Is energy savings turned on in the server BIOS and/or OS?  How much do you save with power management turned on vs. off?
    5. Are there alternative designs that can be tested?
    6. The biggest waste is over-provisioning. Do devices have to be as powerful as originally specified?  Keep in mind, this saves money as well as power.
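    For item 4 above, the savings from power management fall out of the measured idle and load draw plus a duty cycle. A sketch with invented numbers:

```python
# Quantify BIOS/OS power-management savings from measured idle/load
# draw and a duty cycle (all watt figures are invented examples).

HOURS_PER_YEAR = 8760

def annual_kwh(idle_w: float, load_w: float, load_fraction: float) -> float:
    """kWh/yr for a server that spends `load_fraction` of time loaded."""
    avg_w = load_w * load_fraction + idle_w * (1 - load_fraction)
    return avg_w * HOURS_PER_YEAR / 1000

pm_off = annual_kwh(idle_w=220, load_w=350, load_fraction=0.25)
pm_on = annual_kwh(idle_w=140, load_w=360, load_fraction=0.25)  # deeper idle,
                                                                # slight load penalty
print(f"PM off: {pm_off:.0f} kWh/yr, PM on: {pm_on:.0f} kWh/yr")
print(f"saved:  {pm_off - pm_on:.0f} kWh/yr")
```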

    Hope this helps you think about how to change people’s behavior so they ask, “What is the power consumption?” whenever they talk about data center equipment.

    BTW, this time of the year I can enjoy looking at the lake, but we don’t go out on it as the dock is under water. Having come from a desert (Las Vegas), I find it nice to return to a water environment.

    In Chinese Taoist thought, water is representative of intelligence and wisdom, flexibility, softness and pliancy

    image

    Read more

    Gartner Data Center 2009 Conference – Day 2 – Green Data Center and Regulation

    Green IT is a hot topic here at the Gartner Data Center Conference, with 250 people in John Phelps’s presentation.

    More and more enterprises are considering a green data center and what that actually means. This presentation looks at some best practices that can be done today and also looks at key green technologies and processes to consider for the future.

    Key Issues:

    • What critical forces will drive enterprises to consider green data center strategies during the next five years?
    • What best practices and processes should users follow when creating a green data center?
    • What are some of the new green technologies that are emerging that companies should be tracking?

    John gave a good overview of the green data center.

    Mike Manos’s presentation was a more specific drill-down into the coming carbon regulation.

    Regulation. It's Real. It's Coming. It's Expensive.

    Wednesday, 02 December 2009
    01:45 PM-02:45 PM

    Speaker: Mike Manos
    Location: Octavius 2
    Session Type: Solution Provider Session

    Energy regulation is coming. The US House of Representatives has already passed its cap-and-trade legislation and the Senate has a bill in committee. In Europe it already exists. The operational and cost impact on data centers in today's regulatory environment is substantial. In this presentation Mr. Manos will provide a detailed overview of the pending industry-impacting legislation and what you will need to do to negate its impact.

    Mike was as passionate as ever. He started off asking if data center regulation is an issue; 80% of the audience raised their hands.

    There are about 125 people in the room.

    One specific area Mike drilled into was the Carbon Reduction Commitment (CRC) in the UK and the impact of the act.

    CRC is designed to improve energy efficiency in large organisations. It will operate as a 'cap and trade' mechanism, providing a financial incentive to reduce energy use by putting a price on carbon emissions from energy use. In CRC, organisations buy allowances equal to their annual emissions. The overall emissions reduction target is achieved by placing a ‘cap’ on the total allowances available to each group of CRC participants. Within that overall limit, individual organisations can determine the most cost-effective way to reduce their emissions. This could be through buying extra allowances or investing in ways to decrease the number of allowances they need to buy.
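    The financial mechanics of CRC are simple to model: emissions times allowance price. The grid emission factor below is illustrative; £12/tonne was the announced introductory fixed price.

```python
# Illustrative CRC cost model: convert a data center's electricity use
# to CO2 tonnage and price the allowances it must buy.

GRID_KG_CO2_PER_KWH = 0.5     # illustrative UK grid emission factor
ALLOWANCE_GBP_PER_TONNE = 12  # CRC introductory fixed price

def crc_cost(annual_mwh: float) -> float:
    """Pounds per year of CRC allowances for a given electricity use."""
    tonnes_co2 = annual_mwh * 1000 * GRID_KG_CO2_PER_KWH / 1000
    return tonnes_co2 * ALLOWANCE_GBP_PER_TONNE

# A 1 MW data center running flat out uses ~8,760 MWh/yr.
print(f"£{crc_cost(8760):,.0f} per year in allowances")
```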

    Let me drop to the closing statement.

    Preparing for Regulation.  What to do?

    1. Prepare for regulation. (make a plan)
    2. Measure energy consumption.
    3. May require work changes.
    4. Select appropriate tools.
    5. Need to determine how to look at data centers in aggregate. (holistic view)

    Overall it was good to see that the audience was engaged on the topic.

    Yeh!!!  Mike told the audience water is the next issue.

    Matt Stansberry was in the audience too, so hopefully he’ll write something as well.

    Read more