Thursday 24 January 2013

Consumer/Provider: the twin forces in Capacity Management

Those schooled in traditional IT capacity management have long recognised the cause and effect behind observed system behaviour. Few have managed to bridge the gap in quantifying the correlation, however, and for good reason: straying too deep into this territory can leave you struggling with data overload and no way of mapping volumetric and utilisation data together. The age of the CMDB and automated discovery and mapping has changed the landscape in this regard. At last, configuration mapping can be used to correlate volumetric data against utilisation data reliably, consistently and accurately, since all of the feeds are automated.
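
To make this concrete, here is a minimal sketch of the idea, with entirely hypothetical service names, hosts and sample data: a CMDB-style service-to-host mapping is used to join a volumetric feed with a utilisation feed and quantify how strongly transaction throughput drives resource consumption. No particular tooling is implied by the article; Python's standard library is used purely for illustration.

# Minimal sketch (hypothetical data, illustrative only): use a CMDB-style
# service-to-host mapping to join business volumetrics with utilisation
# samples and measure how strongly throughput drives resource consumption.
from statistics import correlation  # Pearson's r, Python 3.10+

# Hypothetical CMDB extract: which hosts underpin each business service.
cmdb_mapping = {
    "order-service": ["web01", "web02", "db01"],
}

# Hourly transaction volumes per service (the volumetric feed).
throughput = {
    "order-service": [1200, 1850, 2400, 3100, 2900, 2100],
}

# Hourly average CPU utilisation (%) per host (the monitoring feed).
cpu_util = {
    "web01": [22, 31, 40, 52, 49, 35],
    "web02": [20, 29, 41, 50, 47, 33],
    "db01":  [30, 38, 47, 61, 58, 42],
}

def service_correlation(service: str) -> float:
    """Correlate a service's transaction volume with the mean CPU
    utilisation of the hosts the CMDB maps it onto."""
    hosts = cmdb_mapping[service]
    # Average utilisation across the service's hosts for each interval.
    mean_util = [sum(samples) / len(samples)
                 for samples in zip(*(cpu_util[h] for h in hosts))]
    return correlation(throughput[service], mean_util)

print(f"order-service: r = {service_correlation('order-service'):.2f}")

Because the CMDB supplies the mapping automatically, the same join can be repeated for every service without hand-maintained spreadsheets, which is what makes the correlation dependable.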

Correlating service throughput against observed utilisation provides the intelligence to optimise design, streamline performance, and predict and optimise application scalability. But in a consumer/provider scenario there are two contexts to consider. Presenting the customer with data about your underlying infrastructure utilisation lays bare the margins and risk levels of your operating model. Equally, the customer's main concern is ensuring that their service levels are not jeopardised and that they are not burdened with excessive costs for underutilised environments.

Despite the advantages commonly sought in quantifying the capacity of the physical environment, it is the capacity of the contractual environment that matters most to the customer. In a cloud context, the provider must diligently ensure the reliability of their operating model; this is crucial to brand equity. The customer's primary concern, however, will be managing the flexibility of their service-based contract and ensuring that risks are properly balanced against costs.
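
As a simple illustration of that consumer-side balancing act, the sketch below compares observed demand against a contracted capacity ceiling and flags either service-level risk or cost waste. All figures and thresholds are invented for illustration and would be set by the contract in practice.

# Illustrative sketch (hypothetical figures): weigh SLA risk (running too
# close to the contracted ceiling) against cost waste (paying for capacity
# that is rarely used).
contracted_capacity = 10_000   # transactions/hour the contract guarantees
peak_observed = 8_600          # peak hourly throughput this billing period
avg_observed = 4_300           # average hourly throughput this period

headroom = 1 - peak_observed / contracted_capacity            # buffer below the ceiling
contract_utilisation = avg_observed / contracted_capacity     # share of paid-for capacity actually used

if headroom < 0.10:
    print("Risk: peak demand is within 10% of the contracted ceiling; review the contract tier.")
elif contract_utilisation < 0.40:
    print("Cost: under 40% of contracted capacity is used on average; consider downsizing.")
else:
    print("Contracted capacity appears balanced between risk and cost.")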
