Anyone who followed the announcements at AWS re:Invent last week will have seen a clear theme running through the event.  AWS is focused on bringing services to the public cloud that concentrate on machine learning, AI and generating insight from large volumes of data.  The AWS team seems able to deliver new services at a rate no one else in the industry can match, although Microsoft and Google are following close behind.  With so much focus on services in the public cloud, is private cloud set to die off?


IDC just announced its Worldwide Whole Cloud Forecast, 2017-2021.  This is a paid piece of work (to which I have no access); however, there are some nuggets of information in the press release.  By 2021, spending on cloud will reach $554 billion, double the 2016 figure.  The public cloud portion of that spend will increase from 41% in 2016 to 48% in 2021, and the hyper-scalers will account for 76% of the hardware and software spend on infrastructure.  There are plenty of other examples of the trend towards public cloud adoption.  Morgan Stanley, for example, predicted last year that 30% of Microsoft’s revenue would come from cloud by 2018.

Numbers can be cut in lots of different ways, but the message is clear: we’re headed towards cloud in a big way.  What about private cloud in all this – does it have a future?  At NetApp Insight in Berlin, I spoke to Anthony Lye, Senior VP and GM of NetApp’s Cloud Business Unit.  You may have seen him on stage with Microsoft in the day 2 keynote, where Joe CaraDonna presented Cloud Orchestrator.  Mr Lye’s radical prediction is that private cloud will die out completely.  This seems at odds with the position NetApp currently finds itself in, i.e. that of an infrastructure provider.

Public Cloud Innovation

The prediction of the death of private cloud is premised on the rate of innovation we see from public cloud providers.  New data-centric services are being developed and delivered rapidly by public cloud vendors.  As an example, AWS announced the following new services at re:Invent:

  • SageMaker – an online tool for developers to build machine learning software.
  • Rekognition Video – an API for facial recognition and object recognition in live video.
  • Transcribe – an API to convert audio into punctuated text.
  • Comprehend – an API for performing sentiment analysis on text.
  • Translate – an API for performing language translations.
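The Comprehend bullet above can be sketched in a few lines of Python.  This is a minimal, hedged illustration using boto3, not code from the article: the function name, region and wrapper structure are my own choices, while `detect_sentiment` with `Text` and `LanguageCode` parameters is Comprehend's documented API.  Running the call for real requires AWS credentials to be configured.

```python
def detect_sentiment(text, region="eu-west-1"):
    """Sketch of calling the AWS Comprehend DetectSentiment API.

    The region and function name are illustrative choices; the API call
    itself follows the documented boto3 Comprehend client. Assumes AWS
    credentials are already configured in the environment.
    """
    # boto3 is imported inside the function so the sketch can be read
    # (and the wrapper defined) without the library installed.
    import boto3

    client = boto3.client("comprehend", region_name=region)
    resp = client.detect_sentiment(Text=text, LanguageCode="en")
    # Response contains an overall label plus per-label confidence scores.
    return resp["Sentiment"], resp["SentimentScore"]
```

A call such as `detect_sentiment("I loved re:Invent this year")` would return a label like POSITIVE along with a score breakdown; Translate follows a similar request/response pattern via its own boto3 client.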

As an example of the speed of innovation, look at Amazon Polly, a text-to-speech service announced at re:Invent last year.  Then look at the AWS AI blog for Polly to see how the technology is being used and what new features are being released, pretty much on a monthly basis.  Amazon is able to deploy new features and releases faster than any business could manage that process on-premises.

Time to Market

Time to market is the key here.  AWS, Microsoft, Google & Co can bring these innovations to market faster than any IT department could.  So why bother running it on-premises yourself?  This assumes the AI software is even available for installation in the private data centre.  In many cases, I suspect AWS is leading the market with some of its product features.

What About The Data?

Now here’s where things get tricky.  The hyper-scalers would love you to commit completely to the public cloud.  But putting all your eggs in one basket is a risky business.  By eggs, of course, I mean data.  Data is becoming a core asset for so many companies.  Whether through paranoia, regulation or compliance, many organisations will want to keep their data on-site.  This is one direction private cloud is headed: a repository for the core assets and services businesses don’t want to trust anywhere else, with data simply made available to public cloud services for analysis.  The other area, I think, is bespoke services the public cloud chooses not to provide.  That might be high-speed trading or some other form of HPC that is unlikely to be used by all businesses.

The Architect’s View

I can see a future where private cloud and private data centres continue to shrink.  As with the mainframe, we won’t eliminate the private side, but it will exist in a niche form.  Nothing ever truly dies in IT.  There’s always someone, somewhere using punched cards.  The private data centre will become just that – a place to keep our data, with a few niche services thrown in for good measure.

What do you think?  Is the timescale for public cloud adoption too aggressive?  Do you think there will be a swing back to on-prem, or is the private data centre really doomed?

Related Links

Comments are always welcome; please read our Comments Policy first.  If you have any related links of interest, please feel free to add them as a comment for consideration.  

Copyright (c) 2009-2017 – Post #646A – Chris M Evans, first published on, do not reproduce without permission.


Written by Chris Evans

  • Like you said, private cloud is not going away. Some things require
    on-premises deployment because that’s where the data is, or that’s
    where the people are.

    To me, the real question is whether enterprise IT can leverage private
    cloud quickly and effectively enough to preserve that value. Just
    over the past year I’ve encountered several enterprises where it takes
    months just to deploy a VM. That kind of legacy overhead could drive
    even the most obviously on-premises workflows into the public cloud.
    In some cases, the value of public cloud has more to do with shaking
    up people than infrastructure.

    On the other hand, I’ve also worked with vendors and boutique cloud
    providers building some really innovative, niche services. These
    hosted services blur the line between public and private cloud. I see
    a growing market for what might be called off-premises private cloud:
    specialized services, hosted neither in any of the big IaaS providers
    nor on the customer’s infrastructure, catering to particular needs for
    storage, computing, or workflow. Whether that will grow big enough to
    matter on the nearly trillion-dollar scale of the broad cloud market
    remains to be seen, but it does demonstrate that the very definition
    of cloud is still very much in flux.

    • Yes, it’s a spectrum of scenarios. Good point about the ability to use private cloud effectively. The majority of implementations are hampered by operational issues, such as how to account for costs.

  • Well, private compute clouds have not enjoyed the rousing success every OpenStack supporter hoped for in 2010. OpenStack is still around, although the number of commercial vendors behind it has shrunk. CloudStack survives as an open source project of The Apache Software Foundation. And OpenNebula remains an open source project anyone can use to create public, private and hybrid clouds. These may not add up to a lot in terms of numbers, and they certainly don’t command the media attention private clouds had back in the day. But true to his word, Mr. Ballmer did take Microsoft all-in on the cloud, including private clouds with the release of Azure Stack this year.

    Hybrid compute clouds have been something of a rare species. Eucalyptus was one of the first to develop an AWS EC2-compatible hybrid cloud. Along the way, Eucalyptus, which at one point had a support agreement with AWS, was acquired by HP, where it became largely forgotten. Hybrid compute clouds had not really emerged as a potential force in cloud computing until recently. Microsoft is betting heavily that its Azure Stack will unite on-premises private clouds with Microsoft Azure in the public cloud. The major hardware OEMs are all backing Azure Stack (HPE, Lenovo, Dell EMC and soon Cisco). Azure Stack will also run standalone, disconnected from Microsoft Azure, if you decide to run it that way. It will be interesting to see whether Microsoft can make the sale on private/hybrid clouds with its enterprise and SMB customers over the next several years.

    As Chris pointed out, let’s not forget data storage. Data in its many forms is more valuable than compute. Hot and transactional data needs to be close to where the apps that need it are running. Warm, cold and archive data can be stored where it makes the most sense, and if you take a long-term perspective, that means on-premises or in a regional data center. The value to be derived from data in the aggregate is now being realized. While it seems like an easy decision to keep all your data stored in a public cloud, that could prove to be a serious mistake. Public cloud storage providers own the infrastructure and restrict your physical access to their facilities. I’m not saying this will happen, but when anything becomes as valuable as your data, situations may arise where actions taken by a public cloud storage provider compromise the relationship between you and your data, with your data essentially being held for ransom.

    Yes, private compute clouds could survive as a niche market, but private and community data storage clouds make sense right now for just about everyone. Organizations should pay more attention to the value of their data, protecting it and deriving value from it themselves.