TelcoNews UK - Telecommunications news for ICT decision-makers
Leaseweb UK
Thu, 20th Nov 2025

Every IT professional is only too aware that the volume and value of data are growing exponentially. For many digital businesses, the rate at which they create data is accelerating daily. This reliance on data creates its own set of challenges: not only do businesses need to manage the increasing amounts of data, but they also need to consider the applications and services that naturally gravitate towards it.

This concept of data gravity, and its implications for shaping cloud strategy, is too often underplayed. Whilst smaller data sets might create minimal pull, over time larger data sets strongly attract all the components that contribute to system performance. These orbit around a central data mass in the same way that planets orbit the sun in our solar system, making it difficult, risky and costly to break away from a single provider.

It is no surprise that some businesses are delaying cloud transformation or making piecemeal attempts to change their cloud set-up. However, no matter how challenging it is to prise a swirling mass of data and workloads apart, organisations need to address this or risk falling into entrenched vendor lock-in that puts operations and business continuity plans at stake.

The trap of data gravity

At first glance, holding data, applications and workflows with one cloud provider might seem like the most straightforward and convenient option. However, today's IT professionals are increasingly cost-conscious and seek flexibility to tailor their infrastructures to business demands.

Relying on just one provider creates more exposure to cost increases, exit fees, and outages, and makes it harder to adopt new services from other providers. Quite simply, the pull of data gravity means that migrating data and workloads to alternative cloud providers, or on-premises facilities, can be painfully complex, slow and very costly.

The storm created by data gravity is not restricted to IT. Left unmanaged, it can undermine key strategic initiatives. Why? An organisation's adaptability and capacity to innovate are negatively affected if its cloud infrastructure is not flexible or scalable enough to support business objectives. Essentially, data gravity puts up a barrier to change and halts the innovation that cloud services are meant to deliver.

A concentration of data and workloads is also more vulnerable to system outages, security attacks, or sudden pricing changes within a single cloud environment. It is this heightened risk that has prompted many companies to evaluate resilience plans that build in redundancy and use more than one provider, ensuring there are alternative environments in which to migrate or back up business-critical data.

Many businesses are also encountering data gravity issues when deploying and integrating artificial intelligence. AI workloads generate significant volumes of data, and this data needs to sit close to high-performance AI compute. As a result, AI has the potential to vastly exacerbate data gravity in the cloud, along with vendor lock-in. Cloud providers and data centre operators therefore have a responsibility to advise organisations on how best to mitigate data gravity issues.

From pull to potential

For businesses in the grip of data gravity, achieving change starts with fully understanding how data is managed through its lifecycle, and matching data sets to a variety of storage options. This includes identifying what data there is, where it is stored, how critical it is, and how often it is accessed. Not all data is equal within an organisation, so valuable and current data sets will be suited to high-performance cloud environments, with cloud storage carefully chosen to optimise the use of data. Some data may need to stay on-premises, and other data can be archived, deleted or allocated to much lower-cost storage.
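As a simple illustration of the audit described above, the mapping from data sets to storage options can be sketched as a set of rules. The tier names, fields and thresholds here are hypothetical, not a prescribed scheme:

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    critical: bool            # is the data business-critical?
    days_since_access: int    # how recently the data was used
    retention_required: bool  # must it be kept at all?

def storage_tier(d: Dataset) -> str:
    """Map a dataset to a storage tier; thresholds are illustrative only."""
    if not d.retention_required and d.days_since_access > 365:
        return "delete"                   # no obligation to keep stale data
    if d.critical and d.days_since_access <= 30:
        return "high-performance cloud"   # valuable, current data
    if d.critical:
        return "on-premises"              # critical but less active
    if d.days_since_access > 180:
        return "archive"                  # rarely accessed, lowest-cost storage
    return "standard cloud"

# A miniature inventory run through the rules:
inventory = [
    Dataset("orders-2025", critical=True, days_since_access=2, retention_required=True),
    Dataset("ml-training-raw", critical=False, days_since_access=400, retention_required=False),
    Dataset("contracts", critical=True, days_since_access=90, retention_required=True),
]

for d in inventory:
    print(f"{d.name}: {storage_tier(d)}")
```

Even a rule set this crude forces the questions the audit is meant to answer: what exists, how critical it is, and how often it is touched.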

This process alone facilitates more flexibility, changes how backups are organised and reduces the risk of everything gravitating to a single system.

The logical solution for many is to embrace a hybrid approach, where data and workloads are spread in a considered way across multiple environments. This could be a multi-cloud setup with a mixture of public and private clouds, or a combination of cloud and on-premises infrastructure. In essence, hybrid provides IT professionals with the control and flexibility to do what they need for their organisation. It also enables more fluid cost management and provides more scope for business continuity planning.

To support this, organisations can use private networking, cloud exchanges and open standards to help lessen the cost and complexity of migrating data and workloads between environments. Today's businesses also need to adhere to compliance standards, so knowing what data is stored and where is key to ensuring that data sits within local and regional legal requirements.
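Knowing what data is stored where lends itself to a simple residency check. The sketch below is illustrative only: the categories, region names and permitted mappings are hypothetical and would come from an organisation's own legal requirements:

```python
# Hypothetical residency rules: which regions each data category may be stored in.
RESIDENCY_RULES = {
    "personal-data": {"uk", "eu-west"},
    "financial-records": {"uk"},
    "public-content": {"uk", "eu-west", "us-east"},
}

def residency_violations(datasets):
    """Return (name, region) pairs stored outside the regions their category permits."""
    return [
        (name, region)
        for name, category, region in datasets
        if region not in RESIDENCY_RULES.get(category, set())
    ]

# Example inventory of (dataset, category, region where it currently sits):
stored = [
    ("customer-profiles", "personal-data", "us-east"),
    ("invoices-2025", "financial-records", "uk"),
    ("blog-assets", "public-content", "us-east"),
]

print(residency_violations(stored))  # → [('customer-profiles', 'us-east')]
```

Running a check like this regularly turns compliance from a one-off audit into an ongoing control over where data gravitates.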

How data is managed and organised is a strategic issue that is here to stay. Addressing it could mean implementing AI-based data management tools, restructuring storage, revising redundancy provision or introducing an agile hybrid cloud model. The organisations that recognise and act on this, and re-evaluate their approach to the cloud, will be the ones better prepared for a future that looks set to rely on yet further data growth.
