After years of rapid cloud expansion, Australian enterprises are entering a new phase of maturity and recalibration.
As cost pressures intensify, regulatory scrutiny sharpens and operational complexity rises, organisations are taking a more layered approach to where workloads live and how infrastructure decisions are made.
Cloud strategies are maturing as enterprises prioritise cost control.
The early wave of cloud adoption was defined by speed and scale. Hyperscale platforms offered flexibility and rapid deployment, and many organisations embraced a “cloud-first” mindset, migrating workloads en masse and planning to optimise later.
That next phase has now arrived, as CIOs and boards reassess earlier decisions with a sharper focus on cost predictability, resilience and governance.
“We’re not seeing wholesale repatriation,” says David Leen, head of product for cloud and managed services at IT services firm Interactive. “What we’re seeing is organisations bringing targeted workloads back onto private infrastructure as part of cost optimisation and control.”
One of the biggest drivers is financial discipline.
The consumption-based pricing model that underpins public cloud has delivered agility, but it has also introduced volatility. While elasticity remains compelling in theory, many enterprise workloads run continuously rather than scaling up and down, limiting expected savings.
David Leen, head of product for cloud and managed services at Interactive.
“The promise was that you could scale up and down and only pay for what you use,” Leen says. “In reality, most workloads run 24/7, so they don’t necessarily achieve the cost benefits organisations expected.”
Estimating future costs can also be difficult, particularly where storage growth, data transfers and transaction-based services are involved.
“You can get three-quarters of the estimate right,” he says. “But there’s always a portion that ends up being more of a guesstimate, especially around things like network egress or transaction volumes.”
Monthly bills often fluctuate as usage evolves, putting technology leaders under pressure to deliver more predictable operating expenditure.
“It becomes a bit of a spiky journey,” Leen says. “CIOs want to know what they’re going to spend each month. Predictability matters when you’re managing large technology budgets.”
This is prompting a shift in thinking: not away from cloud, but towards more deliberate workload placement. Rather than wholesale reversals, enterprises are selectively repositioning stable, legacy-heavy or cost-sensitive systems into private or hybrid environments, while newer, customer-facing services continue to be built in the public cloud.
Many organisations also underestimated the complexity of modernising applications after migration. “A lot of companies moved through lift-and-shift, assuming they would transform later,” Leen says. “But transformation is expensive and difficult, and some workloads were never really suited to hyperscale environments in the first place.”
Luke Bartlett, solutions director at RBC Technology Group, says his company recently moved its business process management platform, docs2me, from public cloud into a private cloud environment with Interactive.
“This shift gave us full control over tenancy, workload management and access,” Bartlett says.
“Since partnering with Interactive, the platform has doubled its customer base. It wasn’t just about moving to private cloud, but about the engineering, architecture and support services behind it.
Luke Bartlett, solutions director at RBC Technology Group.
“The move has helped us navigate the hybrid cloud landscape more efficiently than we could have on our own.”
Legacy systems remain central to many organisations. “You can’t just dismiss decades’ worth of applications,” Leen says. “They’re reliable, integrated into the business, and often cheaper to run than transform. Some organisations just need those systems to operate effectively for another decade.”
Data sovereignty is another factor reshaping cloud strategy. What was once a theoretical concern is now operational.
“Two years ago, sovereignty was often lip service,” Leen says. “Now it’s coming up directly in RFPs. Organisations want to know where data is hosted, where engineers are located and who owns the infrastructure.”
For some organisations, that means prioritising environments offering clear jurisdictional control and local oversight.
“They’re prepared to pay a premium for sovereign capability,” he says. “That’s changed significantly in the past 18 months.”
At the same time, enterprise IT is evolving beyond the binary debate between public and private cloud. Hybrid environments – where workloads span multiple platforms – are becoming the norm.
“Hybrid isn’t about finding one tool that does everything,” Leen says. “It’s about having consistent policy, governance and visibility across all the environments you’re managing.”
Financial models are also being revisited. The assumption that all organisations prefer operating expenditure is proving overly simplistic.
“The idea that everyone wants to move to an opex model isn’t always true,” Leen says. “Some organisations still prefer capex because it gives them long-term cost certainty and aligns with how they manage their finances.”
Taken together, these shifts suggest enterprise cloud strategy is entering an optimisation phase – moving beyond rapid migration towards disciplined design about where workloads belong.
“Cloud adoption isn’t reversing. It’s maturing,” Leen says. “Organisations are becoming much more deliberate about where each workload belongs.”