Multicloud deployments don’t have to be so complicated

According to a report sponsored by SAS titled “A Silver Lining from Every Cloud,” most decision-makers at enterprises in the United Kingdom and Ireland face challenges from having data in multiple clouds (aka multicloud).

The report focuses on the difficulties companies encounter when they rely on several public and private cloud platforms to store their business data and run applications. The most common complaints are poor accuracy, high costs, and slow speeds. In other words, multicloud is not making things better.

The report surveyed more than 200 decision-makers in data, analytics, and cloud services from companies with more than 3,000 employees. I have a hunch that the responses would be similar in other markets across Europe, the United States, and Asia.

Note: Remember that biases may exist when the company sponsoring a report sells a solution to the problems listed in the report.

Resorting to trickery

The report indicates that, on average, organizations operate across three private clouds, and nearly half (42%) rely on at least two public cloud providers. These clouds host business applications, analytics, and business data.

Enterprises reported problems such as multiple answers to the same question depending on where the cloud data resides (64%), high costs (64%), and latency in obtaining insights from the data (60%). Not good.

Those who leverage the data have found tricks to work around the limitations. A popular trick is to pull regular snapshots of data into a common database. Also, 70% use different analytics platforms on each cloud and must consolidate the answers, which often results in erroneous data and wasted time. People have to learn what they can and can’t trust and then create microsystems around the dysfunctional data to get their jobs done.
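
To make the snapshot trick concrete, here is a minimal Python sketch of pulling point-in-time copies from multiple clouds into one common database. The per-cloud fetch functions, table layout, and data are hypothetical stand-ins; real versions would wrap each provider's SDK or export tooling.

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical per-cloud fetchers. Real versions would wrap each
# provider's SDK (an object-store download, a warehouse export, etc.).
def fetch_orders_from_cloud_a():
    return [("ord-1001", 250.00), ("ord-1002", 99.95)]

def fetch_orders_from_cloud_b():
    return [("ord-2001", 410.50)]

def snapshot_to_common_db(db_path="snapshots.db"):
    """Pull a point-in-time copy of each cloud's data into one database."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS orders_snapshot (
               order_id     TEXT,
               amount       REAL,
               source_cloud TEXT,  -- which cloud the row came from
               snapshot_at  TEXT   -- when the snapshot was taken (UTC)
           )"""
    )
    taken_at = datetime.now(timezone.utc).isoformat()
    for source, fetch in [("cloud_a", fetch_orders_from_cloud_a),
                          ("cloud_b", fetch_orders_from_cloud_b)]:
        rows = [(order_id, amount, source, taken_at)
                for order_id, amount in fetch()]
        conn.executemany(
            "INSERT INTO orders_snapshot VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    snapshot_to_common_db()
```

The catch is that each snapshot is stale the moment it lands, which is exactly how the "multiple answers to the same question" problem creeps in.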

The result of all this trickery is a set of poorly integrated data platforms. All these problems, inconveniences, and workarounds most likely stem from the lack of a plan before the data was migrated to the cloud platforms. Most of this should have been expected, given that we’ve been dealing with a lack of proper planning for years.

It’s not rocket science

The solution to these problems is not scrapping a complex cloud deployment. Indeed, considering the advantages that multicloud can bring (cost savings and the ability to leverage best-of-breed solutions), it’s often the right choice. What gets enterprises in trouble is the lack of an actual plan that states where and how they will store, secure, access, manage, and use all business data no matter where it resides. It’s not enough to push inventory data to a single cloud platform and expect efficiencies.

We’re only considering data complexity here; other issues also exist, including access to application functions or services and securing all systems across all platforms. Data is typically where enterprises see the problems first, but the other matters will have to be addressed as well.

A solid plan tells a complete data access story and includes data virtualization services that can make complex data deployments more usable by business users and applications. It also enables data security and compliance using a software layer that can reduce complexity with abstraction and automation. Simple data storage is only a tiny part of the solution you need to consider.
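
As a rough sketch of what that software layer might look like, the snippet below routes logical table names to whichever cloud actually holds the data, so users and applications query one interface regardless of where the data resides. The backend classes and catalog are hypothetical placeholders, not any particular vendor's API.

```python
from abc import ABC, abstractmethod

class DataBackend(ABC):
    """One of these per cloud; a real implementation would wrap a
    provider's data service rather than return stubbed rows."""
    @abstractmethod
    def query(self, table: str) -> list[dict]: ...

class CloudABackend(DataBackend):
    def query(self, table: str) -> list[dict]:
        return [{"customer": "acme", "region": "eu"}]    # stubbed result

class CloudBBackend(DataBackend):
    def query(self, table: str) -> list[dict]:
        return [{"customer": "globex", "region": "us"}]  # stubbed result

class VirtualDataLayer:
    """Maps logical table names to whichever cloud holds them, so
    callers never need to know where the data physically resides."""
    def __init__(self, catalog: dict[str, DataBackend]):
        self.catalog = catalog

    def query(self, table: str) -> list[dict]:
        backend = self.catalog.get(table)
        if backend is None:
            raise KeyError(f"no backend registered for table '{table}'")
        return backend.query(table)

layer = VirtualDataLayer({
    "customers_eu": CloudABackend(),
    "customers_us": CloudBBackend(),
})
print(layer.query("customers_eu"))  # caller is unaware of the source cloud
```

The point of the abstraction is the catalog: security and compliance policy can be enforced in that one layer instead of separately per cloud, which is where the reduction in complexity comes from.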

It’s likely enterprises will continue to struggle with complex and inefficient data usage, including security, governance, and compliance. That struggle will drive needed changes after the fact, including a reevaluation of data integration, data security, and data connectivity solutions. Unfortunately, making those changes will be like changing tires on a truck as it rolls down the road: disruptive, risky, and twice as expensive as planning properly up front. We need to be much more proactive about technology planning, starting now.

