The decision of where to store your company’s data is as crucial as any other decision you make.
Your company’s data is your most valuable asset, intangible as it may be. Unencumbered access to your data gives your leaders the insights they need to make high-level decisions, and gives your front-line workers real-time access to everything they need to close deals and maintain relationships.
However, any outage or data breach can bring things to a screeching halt. You’re facing mass downtime, as well as possible damage to your reputation (and even legal action) if your customers’ sensitive data is compromised.
That is just a glimpse of what is at stake. To help you make the most informed decision, here is what you need to consider.
What Tier is Required?
Generally speaking, the Uptime Institute’s Tier system is the industry standard for expressing a given data center’s capacity to store and handle your data.
The higher the tier, the better the uptime:
- Tier 4: Approx. 26.3 minutes a year of downtime with a 99.995% uptime rate
- Tier 3: Approx. 2 hours of downtime each year with a 99.982% uptime rate
- Tier 2: Approx. 22 hours of downtime each year with a 99.741% uptime rate
- Tier 1: Approx. 28.8 hours of downtime each year with a 99.671% uptime rate
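The downtime figures above follow directly from the uptime rates. As an illustration (the tier percentages are the Uptime Institute’s; the arithmetic is simply the fraction of a year the center is expected to be down), the conversion looks like this:

```python
# Illustrative sketch: converting an uptime percentage into expected
# annual downtime. Tier percentages are from the Uptime Institute;
# the math is just (1 - uptime) x hours per year.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours in a non-leap year

def annual_downtime_hours(uptime_percent: float) -> float:
    """Expected downtime per year, in hours, for a given uptime rate."""
    return (1 - uptime_percent / 100) * HOURS_PER_YEAR

tiers = {"Tier 4": 99.995, "Tier 3": 99.982, "Tier 2": 99.741, "Tier 1": 99.671}
for tier, uptime in tiers.items():
    hours = annual_downtime_hours(uptime)
    print(f"{tier}: about {hours:.1f} hours ({hours * 60:.0f} minutes) per year")
```

Run against the four tiers, this reproduces the numbers in the list: roughly 26 minutes a year for Tier 4 versus nearly 29 hours for Tier 1.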
The higher tiers naturally come at a higher cost for all of their protections and redundancies. We strongly advise against making your final decision on price alone; that said, you can compare data center pricing at UpStack.
Also, be warned that some centers may present themselves as “Tier 3 equivalent” or “Tier 4 equivalent” at a lower price. These companies have not gone through the certification process or paid for its cost, so their reliability claims have not been independently verified.
The Physical Location of the Data Center
This is another consideration that should outrank price in your final decision.
You will clearly want your off-site data stored far enough away from your current location that the same natural disaster or outage cannot strike both. For example, if you’re in Boston, you do not want the same snowstorm to affect both your on-site and off-site data.
This is why colocation sites in areas that sit outside earthquake, hurricane, and snowstorm zones are in high demand. You will also want to ensure the site is accessible by major roads and highways, and that it is served by a reliable power grid with well-maintained backup generators.
Remember: A distant colocation site does not have to mean slower access to your data. This may have been the case years ago, but today’s technology lets you take a hybrid approach, leveraging both internal and external sites. This can close the gap in lag times when accessing your data or during disaster recovery.
As you can see, we barely touched on price and didn’t list it among your top two decision factors. Certain stakeholders in your organization may be budget-driven and want to go with the lowest possible quote.
However, you can make the business case for focusing on the appropriate tier and the best possible location. It may be a matter of reminding them of the cost of downtime for your company, and the value of protecting your most precious asset.