Archive for June, 2011
About 20 years ago, a man I have come to know very well was sitting in a room with about 20 people. He was part of a multi-disciplinary disaster recovery and business continuity management team that included representatives from IT, business units, finance, compliance, security, procurement and operations. After a series of disasters that included a flood near the data center and a hurricane, the team was updating the DR plan.
One of the exercises this group had to walk through was determining which applications were critical. Of course, every representative from every business unit determined that their business-unit applications were critical. Finance and accounting determined that their applications were critical, and even the head of application development said his applications were critical. Why? If his applications were down, 1,000 developers were not going to be able to work, which meant as much as $500,000/day in lost productivity, not to mention delays in delivering new applications.
Have you ever used this phrase?
Well, I guess that’s a risk I’m just going to have to take.
I’ll admit, I’ve said it myself, and it’s usually because either:
- I don’t think the risk is real
- I think the risk is low enough that I’m willing to take a chance
- I think the risk is real, but avoiding the risk seems impossible
- I think the risk is avoidable, but the cost to eliminate the risk seems too high
One of my friends told me about a guy who likes to drive fast cars, so he rented an hour in one of last year’s race cars, took a lesson from a professional driver, and then had a chance to drive by himself on a race track. What better place to safely try your hand at driving fast! The race track offered him an insurance policy for $70 that would cover any damage to the very fast and expensive car. Confident that he was sufficiently trained and that driving a car built for speed on a track designed for fast cars was an acceptable risk, he declined the insurance. Unfortunately, you guessed it, he wrecked the car, and it cost him over $100,000 to pay for the damages.
Many data center managers think like this driver. They view the risk of data loss as relatively unlikely. After all, the important data is protected by RAID and backed up on a regular schedule. They might even use asynchronous replication to copy most of the data to another location. These measures do protect against most data-loss risks. The remaining risks are relatively rare: hurricanes, floods, tornadoes, earthquakes, tsunamis, fires, and building collapse. Perhaps they decide these risks are sufficiently unlikely to be acceptable, a chance they are willing to take. But what if the company makes the wrong bet? As with the wannabe race car driver, the cost of being wrong is huge. And what if protecting against those other risks were as affordable as the $70 insurance policy? Shouldn’t the company buy it?
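The arithmetic behind the anecdote is worth making explicit. Here is a minimal sketch of the break-even logic, using the premium and loss figures from the story above (the crash probability itself is, of course, unknown in advance — that is the whole point):

```python
# Break-even analysis: insurance is worth buying whenever
# (probability of loss) x (size of loss) exceeds the premium.

premium = 70.0        # track's insurance offer, from the story
loss = 100_000.0      # damage the driver actually paid, from the story

# Probability of a crash above which declining the insurance is a bad bet
break_even_p = premium / loss

print(f"Insuring pays off if crash probability exceeds {break_even_p:.2%}")
```

Any honest estimate of a first-timer's crash odds on a race track sits well above 0.07%, which is why declining the policy was a bad bet even before the crash happened. The same expected-loss comparison applies to a data center weighing the cost of disaster protection against the cost of losing data.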
Think about what is possible with Axxana’s Phoenix System. It’s time to re-evaluate the notion of acceptable risk in your data center and your disaster recovery plan.
Data privacy has been in the news a lot lately, and reports of information compromise are frequent. Countries are getting serious about data privacy and are imposing stiff fines for failure to adequately protect personal information. According to King and Spalding, in a March 25, 2010, Corporate Practice Group Client Alert, “The UK data protection authority, the Information Commissioner, will have powers to issue fines of up to £500,000 against companies who breach UK data protection laws from 6 April 2010.” King and Spalding go on to explain that the power to impose the fine can be exercised if “the Information Commissioner is satisfied that the breaches are ‘serious’ and of a kind likely to cause substantial damage or distress and provided the company either deliberately breached data protection laws or knew (or should have known) that there was a risk that a breach would occur but failed to take appropriate action.”
These laws are focused on the unauthorized release of personal information, not the protection of information against loss, deletion or destruction. At the same time, however, laws already exist in some industries, such as financial services, that mandate disaster recovery capabilities and disaster recovery testing. In addition, laws requiring the preservation of information, such as the Safety Act, have been introduced to improve the ability of law enforcement organizations to locate individuals who are engaging in illegal activity on the internet.
What is interesting about some of the information privacy laws is that liability can be assigned and fines assessed even when companies do not know that a risk of data compromise exists. The requirement is that they “should have known.” It is a matter of corporate responsibility to know what is possible and to take reasonable efforts to protect against bad events. It is not difficult to imagine a similar responsibility test being applied in disaster recovery and data retention laws. Organizations will likely be held accountable for what they “should have known” if they failed to act. Given advances in data replication, deduplication, storage tiering, and data archiving technology, organizations should know that all data can be affordably replicated to multiple sites, protected against a wide range of disasters, and affordably archived. Consider yourself informed.