Fifteen years ago, companies were buying virtualization, storage arrays, various forms of compute, and networking fabrics separately. Then, in 2006, Amazon Web Services (AWS) started to consolidate the buying and implementation of these various technologies.
By 2010, the consolidation and synergies among virtualization, storage, compute, and networking had led to converged infrastructure, while software-defined storage and networking came into existence. The terms “public cloud” (with Microsoft Azure launching into that space, for example, and AWS Direct Connect arriving soon after) and “private cloud” started being used.
By 2015, software-defined storage and networking concepts came together in the new categories of converged and hyper-converged infrastructure; at the same time, public and private cloud were joined by hybrid cloud. Other public cloud service providers started their own versions of the original AWS Direct Connect: Azure ExpressRoute and Google Cloud Partner Interconnect. By last year, 2020, the cloud model had started to deliver cloud data services for multi-cloud.
This pattern of convergence, with trends coming together, continues today. Looking ahead to 2025, cloud data services for multi-cloud will be offered by multiple vendors and will be ubiquitous. Multi-cloud delivers agility and scalability by allowing enterprises to avoid lock-in with a single cloud provider, and it will enable them to take advantage of services from across clouds.
Let’s take a look at how becoming and staying innovative in today’s data-first world requires the flexibility and scale of multi-cloud.
The Cloud-First Approach
“Cloud-first” has been the mantra for enterprise IT departments for a number of years. Scalability, agility, resilience, and the ability to choose pay-per-use services are a few of the business drivers for initially adopting public cloud.
An organization’s cloud strategy may begin with a single cloud, requiring management of only one vendor relationship. At the beginning of this relationship, organizations realize the initial benefits of the cloud and that provider is able to meet or exceed their expectations. Moving data into that cloud is easy and inexpensive—to start.
As that organization identifies innovative cloud services that it wants to use from competing cloud providers, it realizes that its data has been locked into its original cloud, burdened by the weight of data gravity and subject to hefty egress fees to repatriate, read, or move the data to a different cloud. At an average of $0.02/GB, egress fees for moving 1 PB of data come to $20,000, and that fee applies each time the data is moved.
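The egress arithmetic above can be sketched as a quick back-of-the-envelope calculation. The $0.02/GB rate is the average cited in the text, not any provider's actual price sheet, and the helper function is purely illustrative.

```python
# Back-of-the-envelope egress cost estimate. The rate is the average
# cited in the article; actual provider pricing varies by tier and region.
EGRESS_RATE_PER_GB = 0.02  # USD per GB (illustrative average)

def egress_cost(data_gb: float, moves: int = 1) -> float:
    """Total egress fees: charged in full every time the data moves."""
    return data_gb * EGRESS_RATE_PER_GB * moves

ONE_PB_IN_GB = 1_000_000  # decimal petabyte

print(egress_cost(ONE_PB_IN_GB))           # 20000.0 for a single move
print(egress_cost(ONE_PB_IN_GB, moves=4))  # 80000.0 across four moves
```

Because the fee recurs on every move, repatriating or shuttling the same petabyte between clouds multiplies the cost rather than amortizing it.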
Meanwhile, when someone in the organization identifies a reason to try another cloud, the team gives it a shot, meaning the organization is likely using at least two different cloud providers, adding complexity to budgeting and to managing its cloud presence. According to the Flexera 2021 State of the Cloud report, organizations use 2.6 public and 2.7 private clouds on average. Companies realize that they need more than what the public cloud alone offers in order to gain better security, performance, and managed services, so they look toward hybrid approaches, including hosted private cloud and private cloud. All too often, as plans grow, so does overspend, thanks to a lack of clearly defined goals and initiatives.
A Data-First Approach
Innovation is going to come from more than one cloud service provider, which requires a “data-first” approach that focuses on making data accessible by multiple clouds. This data-first approach unifies private, public, and hybrid clouds, inherently avoiding the vendor lock-in that inhibits innovation. Data-first initiatives also empower organizations to deliver effective data sovereignty strategies that accommodate unique privacy requirements or country/region-specific requirements. This opens up possibilities not feasible with a cloud-first, single-cloud approach.
Enabling Data-First: Convergence to Multi-Cloud
To achieve data-first goals, companies are turning to multi-cloud. As Santhosh Rao, Senior Director Analyst at Gartner, said in 2019, the move to multi-cloud is an issue of “when,” not “if.” And that time is now. A true multi-cloud architecture is more than the sum of its parts; that is, it is more than simply using multiple clouds. Multi-cloud enables matching workloads with the best-fit services and capabilities from any cloud, maximizing innovation and freeing organizations from data gravity and vendor lock-in.
Key themes converging to drive the interest in multi-cloud in 2021 are:
● Accelerated cloud spending within companies to deliver on innovation initiatives.
● Single clouds that no longer meet business requirements: two out of three companies need more than one public cloud to meet specific business requirements.
● Consolidation and/or decommissioning of legacy data centers, which may require expansion before consolidation is possible.
● Advantageous placement of essential company data, allowing users to unlock innovation without having to rehydrate or pay fees to move data in order to take advantage of the constantly growing list of services available from public clouds—300+ available from AWS, 250+ available from Microsoft Azure, and 100+ from Google Cloud Platform, for starters.
Multi-cloud allows you to use these hundreds of innovative cloud services by leveraging the strategic placement of a single, central copy of data. Effectively, a single, central copy of data provides benefits regardless of the volume of the data because it is adjacent to all public cloud service providers simultaneously.
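To make the single-copy point concrete, here is a minimal sketch contrasting the storage footprint of duplicating a dataset in every cloud with keeping one adjacent, network-attached copy. The function names and the 1 PB figure are hypothetical illustrations, not a description of any vendor's product.

```python
# Illustrative comparison: per-cloud copies vs. one central adjacent copy.
def duplicated_footprint_gb(data_gb: float, clouds: int) -> float:
    """One full copy of the data stored separately in each cloud."""
    return data_gb * clouds

def single_copy_footprint_gb(data_gb: float, clouds: int) -> float:
    """One central copy that every cloud attaches to over the network."""
    return data_gb  # footprint stays flat no matter how many clouds attach

DATA_GB = 1_000_000  # 1 PB, for illustration
for n_clouds in (1, 2, 3):
    print(n_clouds,
          duplicated_footprint_gb(DATA_GB, n_clouds),
          single_copy_footprint_gb(DATA_GB, n_clouds))
```

The duplicated footprint grows linearly with the number of clouds, while the central copy stays constant; that flat cost curve, regardless of data volume, is the advantage the single-copy model claims.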
By adopting convention over configuration, users can take immediate advantage of any innovation cycle from any one of the public cloud service providers by simply attaching to the same single storage. As such, this multi-cloud strategy drastically lowers otherwise duplicative costs and caps the technical debt of bespoke configurations.
That said, even when there are requirements for multi-protocol access or for replication to an on-premises site or another data center, the advantage of the same single storage is undeniable. Being adjacent to all public cloud service providers simultaneously unlocks access to innovation that would not be possible anywhere else.
Multi-cloud relies on a single copy of data accessible by multiple clouds simultaneously via simple network attachment, rather than duplicate copies of data in each cloud. It simplifies job management by relying on a shared file system rather than on separate instances across various locations.
Multi-cloud streamlines ease of use, with key workflows that run from a shared repository with a single change for all clouds, rather than requiring toolchains to touch each cloud where changes for each cloud are separate. It also provides a shared high-performance file system that doesn’t require collation.
Unleash and Unlock Innovation
Cloud data services for multi-cloud will soon become the new norm. Consider your need for it today. How frequently are you accessing data? How easily are you using the rapidly growing list of cloud data services? How can you maximize your strategic initiatives while minimizing cloud overspend? Don’t just look to a cloud; make your data available for multi-cloud initiatives.
About the Author
Jay Cuthrell is Vice President of Solutions Engineering at Faction, where he drives planning, implementation, monitoring, and optimization of cloud data services for multi-cloud solutions. Prior to joining Faction, Jay was a managing director at Dell Technologies, where he built and led diverse, globally distributed, and high-performing teams across pre-sales, sales operations, guerrilla field marketing, software engineering, systems engineering, platform engineering, and the office of the CTO.
Featured image: ©VectorFusionArt