Why organizations are ‘repatriating’ their data from the cloud

The growing expense of moving data in and out of commercial clouds has CIOs looking at smarter alternatives. How a new modernization financing model can help.

Bill Burnham is Chief Technology Officer, U.S. Public Sector at Hewlett Packard Enterprise. Prior to joining HPE, he served as CTO for U.S. Special Operations Command, overseeing enterprise architecture and technical modernization for SOCOM’s global network supporting users in more than 90 countries.

For much of the past decade, commercial and public sector organizations have been shifting a growing portion of their IT portfolio to the cloud. And with good reason. The cloud gave enterprises not only the ability to utilize IT services on-demand, but also a more flexible financial model for modernizing their infrastructure and applications.

But that financial model came with a catch: While storage and computing costs in the cloud offer tremendous economies of scale, the unseen fees for moving petabytes of data in and out of commercial clouds have grown enormously, costing far more than most organizations could have foreseen. As a result, many large organizations, including companies like Dropbox that were built in the cloud, have taken the dramatic and paradoxical step of “repatriating” their data, according to a recent report from analysts at Andreessen Horowitz, the Silicon Valley venture capital firm.

What’s also driving organizations to take that step, though, is the ability enterprises now have to create and operate their own private clouds and deliver the same elastic cloud capabilities — with the same pay-on-demand consumption model on premises that they get from the big cloud service providers. The big differences: They don’t have to pay any data ingress and egress fees; they don’t have to move petabytes of data, with the latency that creates; and their data can stay secure in their data center, eliminating the risk posed to data in transit.

For state and local government agencies, the savings can be significant. Assume your agency contracts with one of the large cloud service providers and its fee to export or transact on data averages 3 cents per gigabyte. That doesn’t sound like a lot. But a petabyte is a million gigabytes, so pulling a petabyte of data out two or three times a week, say to look for potential fraud in unemployment claims, costs $30,000 per pull, and the bills can add up astronomically. Or say you want to move 20 petabytes of data to a different cloud provider, or back on premises. That’s $600,000 that might be better channeled into upgrading your applications.
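To make the arithmetic concrete, here is a minimal sketch in Python. It assumes the illustrative figures above: a flat 3-cents-per-gigabyte egress fee (this article’s example rate, not any provider’s published price) and a million gigabytes per petabyte.

```python
# Back-of-the-envelope egress-cost model using the article's illustrative rate.
EGRESS_RATE_PER_GB = 0.03      # assumed flat fee, dollars per gigabyte
GB_PER_PB = 1_000_000          # gigabytes in a petabyte

def annual_egress_cost(petabytes: float, pulls_per_week: int, weeks: int = 52) -> float:
    """Estimate a year of fees for repeatedly pulling data out of a cloud."""
    per_pull = petabytes * GB_PER_PB * EGRESS_RATE_PER_GB
    return per_pull * pulls_per_week * weeks

# One petabyte pulled three times a week, every week of the year:
print(f"${annual_egress_cost(1, 3):,.0f}")                 # $4,680,000

# A one-time 20 PB migration to another provider or back on premises:
print(f"${20 * GB_PER_PB * EGRESS_RATE_PER_GB:,.0f}")      # $600,000
```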

That doesn’t mean enterprises are moving away from commercial cloud service providers, nor should they. Cloud providers, after all, offer a level of flexibility, reliability and security that has proven to be game changing for almost every enterprise.

But as Target CIO Mike McNamara described it in a recent interview, it’s simply more cost-efficient to build an advanced hybrid cloud environment. That means running most of an enterprise’s IT workloads on premises and relying on commercial clouds when and where it makes more strategic sense, like on Cyber Monday and Christmas week, in Target’s case, when transaction volumes can surge 20-fold over a typical week.

Alternative approach

The challenge most state and local agencies face, of course, is budget constraints. In the corporate world, IT investments can usually be mapped to revenue generation and profits, while in the public sector, IT costs are seen only as an expense to be managed.

But another challenge is the tendency for executives to associate “cloud” with a place or a company, as opposed to a style of computing. As the Cloud Native Computing Foundation aptly explains it: “Cloud native computing uses an open-source software stack to deploy applications as microservices, packaging each part into its own container, and dynamically orchestrating those containers to optimize resource utilization.” 
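To make that style of computing concrete, here is a toy sketch: a single-purpose service that exposes one narrow function over HTTP, using only the Python standard library. The service name is hypothetical, and in a real cloud native deployment this program would be packaged into its own container image and scaled by an orchestrator, neither of which is shown here.

```python
# Minimal single-purpose microservice using only the Python standard library.
# In a cloud native deployment this would be built into its own container
# image, and an orchestrator would start or stop copies to match demand.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # One narrow responsibility: report this service's health.
        body = json.dumps({"service": "claims-fraud-scan", "status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Containers conventionally expose a single port per service.
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```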

The future for enterprise IT builds on that computing model in the form of a hybrid cloud environment that allows organizations to manage elastic workloads at the edge, in the core data center or in a cloud service provider’s infrastructure, by capitalizing on the power and portability of containers and microservices.

By broadening how we think about cloud computing, it’s also possible to think about alternative approaches to tackling the financial challenges that slow down modernization. 

Consider this example: HPE has some 1,100 enterprise customers who have modernized their on-premises IT operations by upgrading to the latest available hardware and adopting a pay-on-demand consumption model. We bring a cloud native environment to you, but bill it as a service, based solely on what you use, just as you do with commercial cloud providers, except without the big transaction fees or any requirement that you move your treasure trove of data off premises.

That frees up the capital that agencies would normally spend to refresh their infrastructure, letting them instead modernize their software by refactoring applications into cloud native services.

Say you refresh 20% of your hardware every year, budgeting $1.5 million a year. Take those funds, allocate $300,000 toward infrastructure usage payments and devote the rest to hiring Python coders to modernize your applications over the next two years. Ideally, by the time you reach year three, you won’t need $1.5 million for another round of infrastructure upgrades, because modern, cloud-native containerized applications launch when required and don’t need as much infrastructure as traditional virtual machine-based applications. You’ll always have the latest available infrastructure and, most importantly, you’ll have greater flexibility and cost control in managing your enterprise’s vital IT workloads.
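A back-of-the-envelope sketch of that reallocation, using the illustrative budget figures above:

```python
# Illustrative two-year reallocation of a $1.5M annual hardware-refresh budget.
ANNUAL_REFRESH_BUDGET = 1_500_000  # former capital refresh budget
INFRA_USAGE_PAYMENTS = 300_000     # redirected to pay-per-use infrastructure

for year in (1, 2):
    modernization = ANNUAL_REFRESH_BUDGET - INFRA_USAGE_PAYMENTS
    print(f"Year {year}: ${INFRA_USAGE_PAYMENTS:,} infrastructure usage, "
          f"${modernization:,} for refactoring applications")
# Year 1: $300,000 infrastructure usage, $1,200,000 for refactoring applications
# Year 2: $300,000 infrastructure usage, $1,200,000 for refactoring applications
# By year 3 the refactored, containerized apps should need less hardware,
# shrinking or eliminating the next refresh cycle.
```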

There’s a bigger reason, however, to adopt a more advanced model of hybrid cloud computing, beyond the added cost control it offers.

As organizations enter the age of data-driven insight, where artificial intelligence and machine learning platforms will consume ever larger volumes of data, it will become prohibitively expensive and increasingly impractical to move all of that data to and from commercial cloud providers.

At the same time, as cities and states continue to build out and integrate the operational technology systems that manage buildings, traffic flow, utility capacity, emergency response and countless other public services, they will need to manage data workloads increasingly at the edge. That means a growing amount of data, by its nature, will need to be processed far from the cloud service provider’s infrastructure, because managing smart cities will require immediate insights and the datasets will be too large to move. Do not fear, however: you can now put high-powered computing at the edge of your environment and still take advantage of a “pay as you go” consumption model.

Going forward, agencies need to think beyond the legacy concept of “moving to the cloud” and plan for a broader hybrid cloud computing environment — one that operates from the edge to the core to one or more cloud service providers and is all available “as-a-service.” 

We’ve been watching (and helping) that evolution unfold for some time in the commercial and government markets, where HPE’s edge-to-core-to-cloud platform-as-a-service model, called GreenLake, now represents more than $5 billion in business and continues to grow. The National Security Agency’s recent $2 billion award to HPE for complete large data centers, delivered under a GreenLake consumption model, is evidence of the solution’s maturity and the trust we’ve earned working with government agencies and their data.

This should give state and local government IT and agency leaders confidence not only in the direction hybrid cloud computing is heading, but also in the financial mechanisms that organizations have available to them to modernize faster and more cost effectively.

Learn more about how GreenLake, HPE’s edge-to-core-to-cloud platform-as-a-service model, is helping enterprises manage their computing requirements more cost-effectively.
