Matt Wallace, CTO of Faction, asks how you can use the cloud to drive your Digital Transformation.
Cloud strategy is on my mind a lot. I’m sure I’m not the only one. No matter where you are in your evaluation of how to use the cloud to meet the needs of your business and of your customers, you’re likely trying to evaluate your plans judiciously.
How can you use the cloud to access emerging tech, scale up, reduce costs, innovate, and address the varied key initiatives that drive your Digital Transformation?
Finding out what others are doing can be helpful. The 2021 HashiCorp State of Cloud Strategy Survey provides valuable benchmarks on this front. But one stat in particular jumped out at me: 76% of respondents say that they’re already ‘multi-cloud’, which HashiCorp calls, ‘the de facto standard for IT organizations of all shapes and sizes.’
Behind that statistic is a complex, sometimes confusing, story of what ‘multi-cloud’ actually is and how it’s being adopted. Much of it is based on myths. These need to be debunked in order to fully understand and successfully implement a multi-cloud strategy that allows your organization to rely on streamlined usage of multiple clouds and multiple cloud services.
Myth #1: Multi-cloud has a single, agreed-upon definition
I define ‘multi-cloud’ as the approach of combining a variety of cloud services, supplied by more than a single cloud provider, for strategic benefit. These may include public clouds (such as AWS, Google Cloud Platform, Microsoft Azure, and Oracle Cloud), private clouds, and certain SaaS services.
We can talk about why enterprises turn to multi-cloud, but first, let’s realize that the terminology is not settled. In practice, the term ‘multi-cloud’ continues to mean different things to different professionals. The term isn’t important, but understanding strategic intent and value is. As I like to say, multiple clouds are not multi-cloud.
The 76% of respondents who are already using multi-cloud may be relying on divergent configurations. Examples include:
● Using a data center for private cloud plus using a single public cloud
● Using cloud #1 for most workloads, with a single isolated service running in cloud #2
● Using two or more clouds at scale to access the best-of-class services offered by each public cloud service provider
When I discuss multi-cloud strategy, I focus on the gains achieved (or at least the pain avoided) by implementing a strategy or architecture that deliberately uses multiple clouds to produce benefits.
If you have two teams, one deployed in AWS and the other in Azure, and never the twain shall meet, that isn’t multi-cloud in my book. If you have two teams, one using data services in AWS to extract, transform and analyze data, and the second leveraging a service in Azure to visualize that AWS data and create dashboards, that’s multi-cloud. This isn’t meant to critique the former scenario; it’s not a matter of right and wrong.
Myth #2: Lock-in is all about cloud services and APIs
A lot of discussion around multi-cloud is about ‘lock-in’. The quest to avoid lock-in is almost as old as IT, and it remains a somewhat futile endeavor: lock-in is an inevitable byproduct of using technology.
If you deploy or integrate something, you now have value you must recreate if you replace components. If your lock-in is low, your value is probably also low. This can be especially true in cloud, where often the most agility, the most innovation, and the largest benefits accrue to those who move fast, leverage a lot of higher-order cloud services and incur a lot of lock-in.
You can deflect a good portion of lock-in with multi-cloud ready tools, such as HashiCorp’s Terraform. But a more pernicious form of lock-in lurks, despite being discussed less: data gravity.
Data tends to attract more applications and data. This effect is known as ‘data gravity’. Applications create additional data daily. As the data grows, it becomes harder to move, making the challenges of getting data where it needs to be exponentially worse. Distance and latency grow, slowing access to that data – and preventing your organization from being nimble.
In order for cloud #1 to access data in cloud #2, you need a workaround. This may require copying, replicating or moving data. That’s possible, but operationally very cumbersome. You may opt to pay for two copies of data, endeavoring to replicate it between cloud environments, and incurring the cost, complexity and governance challenges attendant to that. Even a fully compatible application that can ‘run on any cloud’ may not be easily migrated, due to the data gravity of the data the application needs.
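To put rough numbers on data gravity, here is a back-of-the-envelope sketch of how long a bulk cross-cloud transfer takes. The 100 TB dataset size, 10 Gbps link and 70% effective-throughput figures are illustrative assumptions, not measurements from any particular provider:

```python
def transfer_time_hours(dataset_tb: float, link_gbps: float,
                        efficiency: float = 0.7) -> float:
    """Estimate wall-clock hours to move a dataset over a network link.

    dataset_tb: dataset size in terabytes (decimal: 1 TB = 8e12 bits)
    link_gbps:  nominal link speed in gigabits per second
    efficiency: fraction of nominal bandwidth actually achieved
                (protocol overhead, contention) -- 0.7 is an assumption
    """
    bits = dataset_tb * 8e12
    effective_bps = link_gbps * 1e9 * efficiency
    return bits / effective_bps / 3600

# Moving a 100 TB dataset over a 10 Gbps link at 70% efficiency:
print(round(transfer_time_hours(100, 10), 1))  # ~31.7 hours
```

At those rates the move takes well over a day, and it must be repeated (or continuous replication maintained) every time the data changes, which is why data gravity compounds as datasets grow.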
Myth #3: Software tools can solve for multi-cloud challenges
The organization behind the survey, HashiCorp, does a stellar job providing a suite of tools that deliver value not only across multiple clouds, but specifically for multi-cloud. However, software is not a panacea.
You can normalize deployment, virtualize your storage and data, and do many other things to ease multi-cloud adoption and enable these use cases, but there’s one thing you can’t do in software: defeat the speed of light.
Data gravity exists because data physically has to travel to be moved. You can’t beat the speed of light in software; in practice, real-world implementations of data movement are far slower than the speed of light. To enable multi-cloud data access, the choice is clear: centralize data and focus on optimizing multi-cloud access to it.
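To make the physics concrete, here is a minimal sketch of the latency floor that distance alone imposes. Light in optical fiber travels at roughly two-thirds of its vacuum speed; the 0.67 factor is a typical figure and an assumption here, and real cross-cloud round trips add routing, queuing and processing delays on top of this bound:

```python
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km per second
FIBER_FACTOR = 0.67       # typical speed penalty in fiber (assumption)

def min_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip time over fiber, ignoring routing,
    queuing and processing delays (real RTTs are always higher)."""
    one_way_s = distance_km / (C_VACUUM_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# Two cloud regions roughly 1,000 km apart:
print(round(min_rtt_ms(1000), 1))  # ~10.0 ms
```

No software layer can push the round trip below this bound, which is why the physical placement of data, not just the tooling around it, determines how fast each cloud can reach it.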
By ensuring physical, logical and operational optimization of a multi-cloud data footprint, it is possible to create a scenario where it’s nearly as good as ‘local’ in each cloud at once – but only if you architect a solution with that end in mind.
Clear the confusion
Multi-cloud doesn’t need to be this confusing. Moving forward, a growing number of enterprises will embrace a streamlined approach that eliminates the multiple, often puzzling, interpretations of ‘multi-cloud’.
The new de facto standard will become one that eliminates the multiple copies of data that complicate efforts and slow or even derail innovation. It will facilitate use of the rapidly growing assortment of truly differentiated cloud services. It will simplify governance and management of data.
The clearest path to cutting through the confusion: instead of working with separate copies of data in multiple clouds, rely on a single repository of data that is accessible by, and used across, all of your clouds in a true multi-cloud fashion.
This more efficient, centralized architecture eliminates the complexities currently complicating and delaying multi-cloud adoption. Relying on a single repository and a single automation stack reduces the compound problems and expenses created by maintaining multiple copies.
It eliminates issues associated with copying or replicating data, along with associated fees; it also creates efficiencies in matching storage performance tiers to workload needs. This approach makes governance, protection and access initiatives as manageable as they would be for a single cloud. It unlocks all the power of your data and makes it more available to help you execute your innovative strategic vision.