Is multi-cloud the holy grail of DR or a vulnerability?
Disaster recovery software providers such as Zerto are rapidly adding multi-cloud capabilities to their DR solutions. In fact, Zerto's Virtual Replication 6.0 proposes to solve the 'multi-cloud crisis' (if there is such a thing). But what does multi-cloud mean? Simply put, it means that your workloads can be accessed via another cloud, enabling businesses to mix the best solutions and services from different cloud providers. For disaster recovery, it means that you can move workloads to the cloud, from the cloud, and between clouds. We believe this is a great thing, as we have been expressing concern for a while about the risks of placing total control of your IT systems in the hands of a single cloud provider. Think about what happens if you want to change provider, or if your cloud provider goes bust. Unless you hold a copy of your data, you could find it hard to extract it from your current provider without penalties. Even worse, if your provider can no longer deliver you service (think of 2e2 as a prime example), you could lose critical IT service entirely, and your data with it, or be forced to pay through the nose for service to continue.
Many companies we speak to feel that the public cloud offers built-in disaster recovery, but it is not a sensible strategy to entrust both your production and your DR systems to a single cloud provider. Even Azure and AWS experience downtime: hardware failures, human error, power failures, and even accidentally triggered fire extinguishers have all been blamed for such incidents, and there is no shortage of news about them. So why do companies think they are safe with no alternative for IT service beyond their cloud provider? The fact that we now have multi-cloud capabilities shows that businesses are clearly concerned, and the market is starting to address these needs. Savvy companies are adopting multi-cloud strategies to ensure they aren't at the mercy of a single cloud provider's lock-in.
If you have been clever enough to adopt a multi-cloud strategy, then should your workloads go offline, you would be able to move them over the internet to your secondary cloud. That sounds like a great way to mitigate IT downtime risk. But what happens if the internet itself is one of your key vulnerabilities? Take a DDoS attack: if attackers are flooding your connection to the cloud, then your workloads can't be moved via the internet to your secondary cloud. Multi-cloud fails to deliver disaster recovery here. With DDoS and ransomware becoming two of the biggest causes of IT downtime, a multi-cloud approach won't afford you guaranteed protection if you are relying on the public internet.
And we can't escape GDPR at the moment. It's everywhere, with the threat of big penalties for falling foul of it. Is a multi-cloud environment the way forward when we have GDPR to factor in? How do you keep track of where your data resides? Spread across several providers, there is no guarantee your data stays safe or accounted for.
So what’s the answer?
Relying on the public cloud for guaranteed availability of your IT systems could itself lead to disaster for your business. One solution is to bypass the public internet with private connections between multi-cloud platforms. This could be one of the best ways to protect yourself against DDoS and ransomware attacks. Another alternative is to keep a virtual replica of your IT systems on an independent platform that is tested every 24 hours to create a 'last known good' recovery point. A completely separate copy of your systems, verified and certified to be working without issues, gives you peace of mind that you won't lose everything.
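The daily 'last known good' process described above can be sketched as a verification job: boot a replica, run health checks against it, and certify the snapshot only if every check passes. This is a minimal conceptual sketch with hypothetical names (`Replica`, `verify_replica`), not any vendor's actual API; a real job would probe services over the network rather than read flags.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Replica:
    """Hypothetical booted replica of a production system."""
    snapshot_id: str
    services_up: bool        # did key services start successfully?
    data_checksum_ok: bool   # does replicated data match the source?

def verify_replica(replica: Replica) -> dict:
    """Run the health checks and, only if all pass, certify the
    snapshot as the 'last known good' recovery point."""
    checks = {
        "services_up": replica.services_up,
        "data_checksum_ok": replica.data_checksum_ok,
    }
    return {
        "snapshot_id": replica.snapshot_id,
        "checks": checks,
        "last_known_good": all(checks.values()),
        "verified_at": datetime.now(timezone.utc).isoformat(),
    }

# Example: a replica that passes every check is certified; one with a
# checksum mismatch is rejected, so yesterday's certified copy stands.
good = verify_replica(Replica("snap-0425", True, True))
bad = verify_replica(Replica("snap-0426", True, False))
```

The point of the design is that certification is all-or-nothing: a snapshot with any failed check is never promoted, so the most recent certified copy is always one you could actually recover from.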
If you would like to speak to Plan B about reducing your risk of IT downtime, please contact us on 08448 707999 or email email@example.com