Archive for December, 2010
Roger Bilham, a professor of geological sciences at the University of Colorado, was quoted in an Associated Press article, saying, “It’s our fault for not anticipating these things. You know, this is the Earth doing its thing.” That quote was prompted by a very bad year filled with natural disasters. According to the article, natural disasters killed more than 260,000 people through November of this year. It was a bad year, especially compared to 2009, during which only 15,000 died from the effects of natural disasters.
One of the problems with natural disasters is that they are hard to predict. Actually, it’s not hard to predict that one will happen; that’s a certainty. The problem comes when you want to know when and where. But there are some things that we do know. We know that more people are living near earthquake fault lines. In fact, the article states that if the earthquake in Haiti had occurred in 1985 instead of 2010, the death toll would have been around 80,000, because far fewer people were living in Haiti in 1985. We also know that more people are living in flood plains. And although there are people willing to debate the cause, we also know that the earth is getting warmer, which will increase certain types of natural disasters.
I spent some time this weekend looking at what others are saying about disaster recovery and found this article by Rajen Sheth: “Disaster Recovery by Google.” Rajen is a Senior Product Manager at Google. His article makes some very good points, such as stressing the importance of synchronous mirroring and having a disaster recovery facility located outside of the disaster zone. Of course, he also talks about the cost and complexity of managing multiple data centers.
Given that he’s a product manager at Google, it’s not surprising that he then recommends that companies migrate to Google’s collaboration applications, such as Gmail, Google Calendar, and Google Docs. One of Rajen’s key points is that Google’s offering will deliver better disaster recovery, since it has a recovery point objective (RPO) design target of zero and a recovery time objective (RTO) design target of “instantaneous,” i.e., RTO = 0. Google Apps are, in my opinion, a good example of where cloud-based applications are heading. Unfortunately, despite all the work that Google is doing, only a small portion of any company’s applications would be covered by the current suite of Google Apps.
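To make those two metrics concrete (this is a generic illustration of RPO and RTO, not Google’s implementation): RPO is how much data you can afford to lose, measured as the time between the last good replica and the failure; RTO is how long you can afford to be down. A minimal sketch, with invented times:

```python
from datetime import datetime, timedelta

def rpo_achieved(last_replica_time: datetime, failure_time: datetime) -> timedelta:
    """Data-loss window: everything written after the last good replica is lost."""
    return failure_time - last_replica_time

def rto_achieved(failure_time: datetime, service_restored_time: datetime) -> timedelta:
    """Outage window: time from the failure until service is restored."""
    return service_restored_time - failure_time

# Example: asynchronous replication that last shipped data 10 minutes
# before a failure, with service restored 2 hours afterward.
failure = datetime(2010, 12, 1, 12, 0)
rpo = rpo_achieved(failure - timedelta(minutes=10), failure)
rto = rto_achieved(failure, failure + timedelta(hours=2))

# Synchronous mirroring is what drives RPO to zero: every committed write
# already exists at the remote site, so last_replica_time == failure_time.
print(rpo)  # 0:10:00
print(rto)  # 2:00:00
```

An RPO = 0 / RTO = 0 design target, as claimed for Google Apps, means both of these windows are intended to be zero.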
I’ve continued to look at the data in the Symantec 2010 Disaster Recovery Study. There’s a lot of very useful information in the study. Here’s some of what I found interesting:
• Only 20% of virtual environments are protected by replication or failover technologies
• 60% of virtualized environments are not covered in DR plans
• Actual downtime from outages is more than twice what companies expect
• 40% of DR tests fail to meet the RTO/RPO targets that have been set for the applications
That last one is very interesting. It’s hard to imagine anyone putting up with a 40% failure rate for long. I suspect some things will have to change, and soon. But given how tight budgets are, that doesn’t mean companies are going to spend more. In fact, 43% of companies said their disaster recovery budget would decline in the next 12 months.
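The study’s pass/fail framing is simple to state: a DR test fails if the measured recovery misses either the RTO or the RPO target for that application. A minimal sketch, using hypothetical applications and invented numbers (not figures from the Symantec study):

```python
from datetime import timedelta

# Hypothetical per-application targets and measured DR-test results.
targets = {
    "email": {"rto": timedelta(hours=4), "rpo": timedelta(minutes=15)},
    "erp":   {"rto": timedelta(hours=2), "rpo": timedelta(0)},
}
results = {
    "email": {"rto": timedelta(hours=3), "rpo": timedelta(minutes=10)},
    "erp":   {"rto": timedelta(hours=5), "rpo": timedelta(minutes=30)},
}

# A test fails when either measured value exceeds its target.
failed = [app for app in targets
          if results[app]["rto"] > targets[app]["rto"]
          or results[app]["rpo"] > targets[app]["rpo"]]

failure_rate = len(failed) / len(targets)
print(failed, failure_rate)  # ['erp'] 0.5
```

In this toy example the ERP system misses both targets, giving a 50% test failure rate, which is the kind of number that forces the rethinking discussed below.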
At Axxana, our sole reason for existing is to provide disaster recovery capabilities to organizations, so you might think that declining DR budgets are bad news. They’re not. In the world of disaster recovery, when budgets get tight and service levels aren’t being met, something needs to change, and that’s when organizations look for new, more innovative ways to provide data protection and disaster recovery. That’s what we offer: a new class of data protection, Enterprise Data Recording (EDR), that enables companies to meet RTO/RPO service levels while lowering the cost of data protection.