Authored By: Justin Loh, Country Director for Singapore, Veritas Technologies
Organisations are increasingly looking to multi-cloud environments to distribute workloads, mitigate risks, and achieve greater ROI. However, many companies may not realise that multi-cloud is far from a panacea, leaving them exposed to ransomware attacks, service disruption, and a host of compliance and data management challenges.
It is easy to see the attraction of multi-cloud for enterprises. With a multi-cloud strategy, organisations can ensure portability, run workloads across multiple platforms without vendor lock-in, and maintain the flexibility and room for infrastructure growth while optimising costs. According to a report by research firm IDC, over 90% of enterprises around the world will rely on a mix of multi-cloud and legacy platforms to meet their infrastructure needs by 2022.
The deployment of multi-cloud is on the rise. This could be driven by motivations spanning from cost savings to getting best-of-breed functionality, or simply organisations finding themselves with multiple clouds by happenstance due to the compounding decisions of various teams. The COVID-19 pandemic has further accelerated the transition as companies ramped up digitalisation to cope with lockdowns and the need to work from anywhere. While the cloud has been a great enabler in keeping business operations humming through the pandemic, there have been hiccups along the way, with a series of high-profile cloud outages globally.
Hence, before organisations decide on their cloud deployment strategy, the golden question is: can organisations secure and monitor their data in multi-cloud infrastructures?
A Constant Struggle
Although cloud innovation is everywhere, many organisations across the globe struggle to define data protection best practices that accommodate the cloud.
According to recent cloud research, when asked who is responsible for backing up cloud data, a whopping 84% of cloud architects and administrators said it is the cloud provider. They are wrong: that responsibility ultimately rests with the company itself. Despite the prevalence of cloud services, a majority of those responsible for maintaining cloud-based data remain uncertain about who is liable for its integrity and recovery.
The rise of multi-cloud has resulted in greater IT complexity and security risks. The recent Veritas 2020 Ransomware Resiliency Report revealed that companies with greater complexity in their multi-cloud infrastructure were more likely to make payments when they fell victim to ransomware. The average number of clouds deployed by organisations that paid a ransom in full was 14.06. This dropped to 12.61 for those who paid only part of the ransom, and went as low as 7.22 for businesses that didn't pay at all.
Lagging Resiliency is a Threat
The momentum behind digital transformation initiatives, especially cloud adoption, has accelerated due to the global pandemic. Needing to support widespread remote working, enterprises are both creating more data and facing a business imperative to move their applications out of their own data centres to the cloud.
As this shift accelerates, resiliency planning has not kept pace, creating a resiliency gap. According to the Veritas 2020 Ransomware Resiliency Report, the increasing IT complexity from ambitious hybrid multi-cloud initiatives, coupled with lagging resiliency – including backup and disaster recovery measures that are not robust enough – is making too many enterprises an inviting target for malicious actors.
The report showed that only 36% of respondents said their security has kept pace with their IT complexity, underlining the urgent need for greater use of data protection solutions that can protect against ransomware across the entirety of increasingly heterogenous environments.
The research also revealed that complexity in cloud architectures has a significant impact on a business’s ability to recover from a ransomware attack. While 43% of businesses with fewer than five cloud providers in their infrastructure saw their operations disrupted by less than one day, only 18% of those with more than 20 cloud providers were as fast to return to normal.
Back to Basics
For all its promise, multi-cloud comes with increased complexity that many organisations are struggling to keep pace with. For instance, greater complexity in an organisation's cloud infrastructure makes it less likely that it will ever be able to restore its data in the event of a ransomware attack.
This only underscores the importance of getting the basics of data protection right. Our research revealed that only 55% of businesses have offline backups in place, even though those that do are more likely to be able to restore more than 90% of their data. As organisations turn to cloud services to support remote working, we are seeing a lag between multi-cloud adoption and the deployment of data protection solutions to secure those environments.
With more and more businesses around the world deploying hybrid multi-cloud infrastructures, and with hackers growing more sophisticated and bolder in exploiting security loopholes, we have the makings of a perfect storm if resiliency gaps in cloud environments are not addressed in time. The wider the gap, the more at risk businesses are in the face of ransomware attacks, service disruption and regulatory non-compliance.
Businesses should adopt a proactive approach to ensure 24/7 availability of their data, applications and operating systems. This includes deploying new capabilities that provide additional deployment choices across the edge, core and cloud while reducing risk, optimising cost, strengthening ransomware resiliency and managing multi-cloud at scale.
By doing so, businesses will be able to secure their data and get the most out of it in a multi-cloud world.