TechRadar
Paul MacKay

Is AI the next frontier of cloud vendor lock-in?

In October 2022, the UK’s communications regulator, Ofcom, launched an investigation into the practices of the cloud hyperscalers. Since then, Google, Amazon and Microsoft have been referred to the Competition and Markets Authority (CMA) – the competition regulator – putting the issue of cloud vendor lock-in firmly in the spotlight.

There have been multiple acts to this story, but recently the CMA published the findings of qualitative interviews with hyperscaler customers as the first phase of its investigation. As many in the industry expected, interoperability was the main concern, with technical barriers making it a significant challenge to move data or services between cloud providers. Egress fees – charged for moving data out of a provider’s cloud – were, somewhat surprisingly, seen as less of an issue, with many of the organizations interviewed describing them as negligible.

However, this indifference towards egress fees changed when interviewees were asked about AI.

A look to the AI-driven future

For the last two years, AI – particularly generative AI – has dominated the tech landscape. Organizations are now starting to move from the experimental, laboratory phase of AI implementation to deploying it in production for some initial use cases.

The hyperscalers have been at the forefront of AI investment. Microsoft invested $13 billion in OpenAI last year, setting the tone for a trend that has now seen all of the big three invest heavily. AI is quickly becoming a key battleground, with each hyperscaler now able to boast exclusivity on certain AI products.

So naturally, the CMA asked questions about how cloud providers will impact AI in the future. This has been identified as a potential issue, particularly when considering egress fees. At present, the amount of data organizations move from one cloud to another is relatively small, hence the negligible egress charges. However, if an enterprise had most of its data stored in one hyperscaler’s environment but was keen to use another cloud provider’s AI tools, the egress fees for moving that volume of data would be astronomical.
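
To put that scale in perspective, a back-of-envelope calculation helps. The sketch below is a minimal illustration in Python; the per-gigabyte rate is an assumed placeholder, since real hyperscaler egress pricing is tiered and varies by provider, region and volume.

```python
# Back-of-envelope egress cost estimate. The rate below is an
# illustrative assumption, not any provider's actual price list;
# real egress pricing is tiered and changes over time.

ASSUMED_RATE_USD_PER_GB = 0.09  # hypothetical internet-egress rate

def egress_cost_usd(data_tb: float, rate_per_gb: float = ASSUMED_RATE_USD_PER_GB) -> float:
    """Estimate the one-off cost of moving data_tb terabytes out of a cloud."""
    return data_tb * 1_000 * rate_per_gb  # 1 TB ~ 1,000 GB at these tiers

# A small analytics export versus relocating most of an enterprise's data estate:
print(f"   10 TB: ${egress_cost_usd(10):>10,.0f}")     # $900
print(f"5,000 TB: ${egress_cost_usd(5_000):>10,.0f}")  # $450,000
```

At rates of this order, moving a handful of terabytes is trivial, while moving an entire data estate runs well into six figures – which is exactly why egress fees barely register today but loom large in an AI context.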

Organizations may want to use or train AI models on different hyperscalers for different tasks, which means they must be able to move sets of data easily from one cloud to another. Here, the constant flow of data back and forth would incur both egress fees and interoperability problems. Organizations are therefore faced with a choice: stay with their current provider and use only the AI tools it offers, or spend months making their data suitable for migration. One customer in the CMA’s interim report made this exact point:

“One of the things that is a concern currently is lock in. So for our analysis work, we've used AWS, their tooling, their modelling and the lock in in terms of AI feels a lot stronger than it does in other services. For example, the optimisation of certain models on certain clouds would make it very difficult from my understanding to move elsewhere. But it's definitely something that we’re looking more into. I don't think we understand what the answer is currently. But it is a concern of ours, and the lock in is a big concern because I think it takes us down a certain way of using AI with certain models.”

Breaking down cloud barriers

The interim report makes it clear that while cloud vendor lock-in exists today, interoperability and egress fee issues will be amplified as AI becomes more widespread.

To be clear, a more flexible cloud market won’t result in a switching frenzy where organizations regularly move all their data. Even if the CMA takes action to address interoperability issues, moving all of an organization’s data between cloud environments would still be a mammoth task.

It’s more likely that customers will move subsets of data from one cloud to another depending on which AI tool they want to use – essentially adopting a highly flexible, multi-cloud model. Here, the hyperscalers are likely to see as much data coming into their environments as going out, with smaller data sets being moved more regularly.

This is why a modern data architecture will be vital for organizations looking to make effective use of AI moving forward.

Regardless of whether they’re changing cloud provider entirely or moving a subset of data from one hyperscaler to another to take advantage of AI, a unified data platform can help by providing a layer of abstraction that makes it easier to securely move data from cloud to cloud.
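
As one illustration of what that abstraction layer can look like, the sketch below uses the open-source fsspec library, which exposes the same file interface over multiple cloud storage backends. The bucket names and paths are hypothetical, and credentials for both providers are assumed to be configured in the environment (fsspec also needs the relevant backend packages, such as s3fs and gcsfs, installed).

```python
# Minimal sketch: stream an object from one cloud's storage to another's
# through a provider-agnostic interface. Bucket names are hypothetical.
import fsspec

SOURCE = "s3://example-source-bucket/training-data/batch-001.parquet"
DESTINATION = "gs://example-dest-bucket/training-data/batch-001.parquet"

# The same open() call handles s3://, gs://, abfs:// and other schemes,
# so the copy logic contains no provider-specific SDK code.
with fsspec.open(SOURCE, "rb") as src, fsspec.open(DESTINATION, "wb") as dst:
    while chunk := src.read(64 * 1024 * 1024):  # stream in 64 MB chunks
        dst.write(chunk)
```

This only illustrates the portability idea; a full unified data platform layers governance, cataloging and security controls on top of the raw data movement.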

Hyperscalers shouldn’t be AI innovation gatekeepers

Choice and flexibility over which AI products to use will be vital, especially as organizations look to innovate with AI, and progress shouldn’t be unnecessarily restricted by a few companies. The ethos of the cloud has always been one of flexibility and bringing people together, and this should be reflected in organizations’ freedom of choice over how they use AI. It’s then up to customers to ensure they can move data between clouds as and when they need to.

This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
