Storage costs are one of the most under-optimized areas of spend in many IT budgets today. That means there’s a great deal of low-hanging fruit waiting to be picked; the only trick is knowing how to get to it.
The key is to “right-size” consumption by establishing defined levels of service tailored to actual business need and behavior, ensuring that the relevance of each piece of data matches the quality (and cost) of the storage it occupies.
Let’s look at the four core steps required to make it happen and gain a high-level view of the process.
It’s impossible to know what your storage needs truly are unless you have a firm grasp on your organization’s data, how it’s used, and the rate at which it becomes “cold” — i.e. how long it takes to lose enough relevance to justify migration to cheaper, lower-performance storage resources.
The overall goal here is to identify:
- What types of data your organization stores, and in what volumes
- How (and how often) each type of data is actually used
- The rate at which each type cools, and the level of storage performance it warrants at each stage of its lifecycle
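To make that aging analysis concrete, here is a minimal sketch in Python. It assumes you can export an inventory of datasets with a last-accessed timestamp and a size; the 90- and 365-day thresholds, the dataset names, and the hot/warm/cold labels are illustrative placeholders, not prescriptions.

```python
from datetime import datetime, timezone

# Illustrative thresholds: replace with your own aging assumptions.
WARM_AFTER_DAYS = 90    # untouched for 90+ days: candidate for a cheaper tier
COLD_AFTER_DAYS = 365   # untouched for a year or more: candidate for archive storage

def classify(last_accessed: datetime, now: datetime) -> str:
    """Bucket a dataset as hot, warm, or cold based on its last access date."""
    age_days = (now - last_accessed).days
    if age_days >= COLD_AFTER_DAYS:
        return "cold"
    if age_days >= WARM_AFTER_DAYS:
        return "warm"
    return "hot"

# Hypothetical inventory exported from your storage estate.
inventory = [
    {"name": "finance_reports", "size_gb": 1200, "last_accessed": datetime(2024, 1, 5, tzinfo=timezone.utc)},
    {"name": "crm_backups",     "size_gb": 8400, "last_accessed": datetime(2023, 2, 20, tzinfo=timezone.utc)},
    {"name": "web_assets",      "size_gb": 300,  "last_accessed": datetime(2024, 6, 1, tzinfo=timezone.utc)},
]

now = datetime.now(timezone.utc)
totals = {"hot": 0, "warm": 0, "cold": 0}
for item in inventory:
    totals[classify(item["last_accessed"], now)] += item["size_gb"]

print(totals)  # total GB sitting in each bucket, e.g. {'hot': ..., 'warm': ..., 'cold': ...}
```

Even a rough breakdown like this gives you a defensible starting point for estimating how much capacity each level of service will actually need.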
Once you have a rough estimate of the different types of storage your data will require throughout its lifecycle, it’s time to create concrete service definitions for each “tier” of service IT will offer to the business.
Although the characteristics of each tier can vary depending on the organization, the overall number of tiers shouldn’t be greater than five. In most circumstances, no more than four tiers are needed to establish appropriate levels of service, but some organizations do choose to add a fifth tier (Tier 0) — usually in verticals like Finance that deal in high volumes of time-sensitive data.
Align the number of tiers in your model with your best knowledge of the data, taking care to ensure that each tier delivers enough material efficiency to justify the additional complexity it brings to the model.
That said, it’s perfectly fine to start with a three-tier model and add more tiers as needed, because removing an unnecessary tier is typically far more disruptive than introducing a new one.
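One lightweight way to keep tier definitions concrete and comparable is to capture them as structured data before publishing them to the business. The sketch below assumes a simple three-tier starting model; the names, latency targets, availability figures, and media types are illustrative assumptions, not a recommended catalog.

```python
from dataclasses import dataclass

@dataclass
class StorageTier:
    """A business-facing service definition for one tier of storage."""
    name: str
    description: str
    target_latency_ms: float   # expected response time for typical reads
    availability_pct: float    # committed availability for the tier
    typical_media: str         # what the tier is usually built on

# Illustrative three-tier starting model (add a Tier 0 only if the efficiency justifies it).
TIERS = [
    StorageTier("Tier 1", "Mission-critical, frequently accessed data", 1.0, 99.99, "NVMe / flash"),
    StorageTier("Tier 2", "Active data with moderate performance needs", 10.0, 99.9, "Hybrid / SAS"),
    StorageTier("Tier 3", "Cold or archival data, rarely accessed", 100.0, 99.5, "Object / tape"),
]

for tier in TIERS:
    print(f"{tier.name}: {tier.description} "
          f"({tier.target_latency_ms} ms, {tier.availability_pct}% availability, {tier.typical_media})")
```

Writing the definitions down in one place like this also makes it obvious when a proposed new tier doesn’t differ meaningfully from one you already have.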
With each tier of service clearly defined, you’re ready to calculate a set of business-facing unit rates.
This is essentially an allocation exercise in which you’ll capture all direct and indirect cost drivers to establish a total cost for each tier, then divide by storage volume (GB, TB, or PB) to find a cost per unit.
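The arithmetic itself is simple once the cost drivers have been gathered. The sketch below assumes you have already attributed direct and indirect costs to each tier; the dollar figures and capacities are placeholders purely to show the shape of the calculation.

```python
# Hypothetical fully loaded annual costs per tier (hardware, software,
# maintenance, facilities, labor, and allocated overhead).
tier_costs = {
    "Tier 1": {"direct": 1_200_000, "indirect": 300_000, "capacity_tb": 500},
    "Tier 2": {"direct":   700_000, "indirect": 150_000, "capacity_tb": 2_000},
    "Tier 3": {"direct":   250_000, "indirect":  50_000, "capacity_tb": 8_000},
}

for tier, c in tier_costs.items():
    total_cost = c["direct"] + c["indirect"]        # fully loaded cost for the tier
    rate_per_tb = total_cost / c["capacity_tb"]     # business-facing unit rate
    print(f"{tier}: ${total_cost:,.0f} / {c['capacity_tb']:,} TB = ${rate_per_tb:,.2f} per TB per year")
```

Whether you rate by GB, TB, or PB matters less than applying the same unit consistently across every tier, so the rates remain directly comparable.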
Establishing these fully loaded unit rates allows you to:
- Put a transparent, per-unit price on each level of service IT offers
- Show the business what its storage consumption actually costs at each tier
- Give consumers of storage the cost visibility they need to right-size their own demand
As your organization’s data landscape changes, it’s important to revisit your initial assumptions on how data should flow through each tier. The best way to do that is through continual benchmarking and what-if analysis — comparing actual utilization against your best estimate of “ideal” utilization.
For example, based on your best assumptions, you may conclude that at any given time:
- 20% of your data needs to sit in Tier 1
- 30% can live in Tier 2
- The remaining 50% belongs in Tier 3
Now, take that initial benchmark and compare it with actual utilization, then target the largest discrepancies you see for deeper investigation.
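A simple way to surface those discrepancies is to diff the two distributions and rank tiers by the size of the gap. The sketch below reuses the illustrative benchmark above and pairs it with invented actuals; both sets of percentages are assumptions meant only to show the mechanics.

```python
# Illustrative "ideal" distribution of data across tiers (percent of total volume).
ideal = {"Tier 1": 20, "Tier 2": 30, "Tier 3": 50}

# Hypothetical actual utilization pulled from your storage reporting.
actual = {"Tier 1": 35, "Tier 2": 40, "Tier 3": 25}

# Rank tiers by the gap between benchmark and reality.
gaps = sorted(
    ((tier, actual[tier] - ideal[tier]) for tier in ideal),
    key=lambda pair: abs(pair[1]),
    reverse=True,
)

for tier, gap in gaps:
    direction = "over" if gap > 0 else "under"
    print(f"{tier}: {abs(gap)} percentage points {direction}-utilized vs. benchmark")
```

The tiers at the top of that list are where deeper investigation is most likely to pay off, whether the cause is data that never migrated down or a tier definition that no longer fits reality.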
For organizations spending over $20M annually on IT, the task of right-sizing storage consumption is extremely difficult using conventional tools like spreadsheets alone. The complexity is simply too great.
That said, a dedicated ITFM/TBM solution can expedite the process considerably, letting teams spend less time simply trying to understand the data they’re working with and more time acting on it.
Reach out today to see how.