Tuesday, 16 July 2013

Accelerate Innovation - by increasing efficiency

I'm always highly impressed by the number of innovations coming from IT teams to tackle unique business challenges.  Some organisations leverage the power of Business Intelligence to improve marketing and operational analytics.  Other stories highlight the importance of connecting people and improving collaboration.  The more I read about IT innovators, the prouder I feel of our industry in general: staffed with amazing problem solvers, equipped with remarkable technology to help them along the way.

But the life of an IT executive is not all about innovation.  That's the headline-grabbing stuff.  If you'll pardon the duck analogy, there is an awful lot of paddling going on under the water to keep the lights on.  And when you look at the data, where up to 70% of the IT budget is spent on operations, the struggle becomes immediately apparent.

But what if there was a way of liberating some of that 70%?  After all, most datacenters are operating at less than 20% capacity at peak.  One UK customer told me that their Windows estate was running at around 7% of capacity during peak hours.  Even assuming a high-availability scenario, where capacity levels should be maintained at less than 50%, there is still a significant amount of overspend on excess capacity: possibly as much as double what is actually needed.

Were we to apply some of our innovation capabilities to this problem, we would see that we as an industry are struggling with a conflict of interest around sizing.  Every risk-averse bone in our bodies is shouting out "more!" - the last thing we want to be associated with is a non-responsive service, crushed beneath the weight of demand.  Whenever we meet with a vendor, an outsourcer, a new market, the answer is always "more!".

Entropy never decreases, and the voices for "more!" are louder and more consistent than ever.  And so we face a challenge: "more!" is expensive.  Looking at Koomey's article on TCO, we can see that capital expense (software and hardware) accounts for only about one-third of the total cost of owning an asset.  In this model, a $10,000 server would have a TCO of around $60,000 over a 3-year period.
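A minimal sketch of that arithmetic, assuming (my assumption, not Koomey's exact figures) that software licensing roughly matches the hardware price, so that capital expense lands at about one-third of the 3-year total:

```python
# Back-of-the-envelope TCO model. The $10,000 hardware price comes from the
# article; the matching $10,000 software figure is an assumption used here
# to make capex one-third of the total.
hardware = 10_000            # server purchase price ($)
software = 10_000            # assumed licensing cost ($) - hypothetical
capex = hardware + software  # capital expense = $20,000

# If capex is ~1/3 of TCO, the remaining 2/3 is operations, power,
# facilities and administration over the asset's life.
tco_3yr = capex * 3
print(tco_3yr)  # 60000
```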

A simple financial calculation shows that running datacenters at low utilization may be wasting a huge portion of our IT budgets on power, on facilities, on routine administration tasks.  For every 100 servers running at 10% utilization, that could equate to over $1M in potential savings every year.
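To see where that $1M figure could come from, here is one hedged worked example, assuming the $60,000 3-year TCO per server and a high-availability ceiling of 50% utilization (both figures used elsewhere in this post; the consolidation ratio itself is my illustration):

```python
import math

servers = 100
utilization = 0.10            # current average utilization
target_utilization = 0.50     # HA headroom ceiling assumed above
tco_per_server_3yr = 60_000   # $ over 3 years
annual_cost = tco_per_server_3yr / 3  # ~$20,000 per server per year

# Total work being done fits on far fewer fully-sized servers.
servers_needed = math.ceil(servers * utilization / target_utilization)
servers_retired = servers - servers_needed
annual_savings = servers_retired * annual_cost

print(servers_needed)   # 20
print(annual_savings)   # 1600000.0 -> comfortably "over $1M" per year
```

Even halving the consolidation ratio to be conservative still clears the $1M mark, which is why the claim holds across a wide range of assumptions.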

This has always been the great argument for virtualization: consolidate these Windows servers without introducing compatibility risks or administration costs.  The great wave of virtualization has swept over us, leaving many organisations with highly virtualized landscapes and, seemingly, little further room for optimization.

Except that assumption is wrong.  Native sizing tools for virtualized landscapes rely on inefficient algorithms built on a flawed assumption: that every MHz is the same.  It turns out that we are still oversizing our virtualized landscapes - by as much as 2x.
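A toy illustration of why "every MHz is the same" oversizes (this is not any vendor's actual algorithm; the performance factors are hypothetical numbers standing in for benchmark-derived ratings):

```python
# Assumed relative work done per MHz for two server generations.
# A newer core often completes far more work per clock cycle, so raw
# MHz is not a comparable unit across hardware generations.
perf_factor = {
    "old_host": 1.0,   # baseline generation (hypothetical)
    "new_host": 2.0,   # assumed 2x work per MHz (hypothetical)
}

def effective_capacity(mhz: float, generation: str) -> float:
    """Scale raw MHz by a benchmark-style performance factor."""
    return mhz * perf_factor[generation]

# Demand measured as 10,000 MHz on old hardware...
demand_old_mhz = 10_000

# ...only needs half as many raw MHz on the newer generation.
needed_new_mhz = demand_old_mhz * perf_factor["old_host"] / perf_factor["new_host"]
print(needed_new_mhz)  # 5000.0
```

A sizer that copies the 10,000 MHz figure straight across provisions exactly the 2x oversizing described above.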

The point of sizing efficiently is not an arbitrary one.  Every server that is not provisioned may be saving the enterprise $60,000 or more.  Re-investing these savings into creative projects will only help drive successful innovators, and generate more and more CIO headlines.
