In the future, as we speed down the motorway in our self-driving vehicles, historians will mark the 2010s as the decade of the cloud. Some would argue that the tenets of cloud computing were established in the 1960s, when U.S. government scientist J.C.R. Licklider planned an “intergalactic computer network”. In 2006 the cloud experienced a seminal moment, when Amazon entered the space with EC2. However, it is in the 2010s that cloud forged its role as the transformational technology of its generation.
The majority of leading brands have embraced the cloud; even the Central Intelligence Agency in the U.S. made the move in 2013. Some of the largest digital businesses in the world – Facebook, Netflix, Amazon – are not only cloud native, but achieved huge success because they chose that model.
By 2010, Amazon Web Services, Microsoft and Google – three leading lights of the cloud – had all launched their cloud businesses. The same year, OpenStack, the leading open-source software platform for cloud, got its start. Worldwide spending on the public cloud started the decade at $77 billion, according to Statista, and was predicted to conclude it at $411 billion – a fivefold increase. That is remarkable momentum, given that the cloud was still in its infancy. In the coming years, the development of business and consumer applications will accelerate from cloud enabled to cloud native as exciting new cloud technologies flourish. Even the definition of what cloud means is changing, with the addition of edge and hybrid environments.
The world shifts too rapidly to make confident predictions about the future, but commanding trends are at work, moulding the cloud as we head towards 2020.
Enterprises finalise transition to public cloud
Despite the hype around the cloud, not every business has made the jump. According to Forrester: “Cloud’s impact has been global, yet fewer than half of all enterprises use a public cloud platform.” Recent research from 451 Research, however, has shown that it is the financial services industry that is leading in terms of adopting cloud technologies. Faced with the agility of cloud native disruptors and increased competition, 60% of financial services companies surveyed said they expect to use various cloud platforms in combination with one another – slightly higher than the figure for other businesses (58%).
Indeed, as a McKinsey survey noted, even many companies that have adopted the cloud are far from operating entirely within it: “While almost all respondents are continuing to build sophisticated cloud programs, there is a clear gap between the leaders (those who have migrated more than 50% of their processing workloads) and the laggards (those who have moved less than 5%)”.
A common concern putting businesses off employing an enterprise cloud computing strategy is security. Two-thirds of IT professionals state that security is their greatest concern in this respect, according to a study by LogicMonitor. As the decade draws to a close, the industry will seek to strengthen cloud security. Solutions that address compliance and data control needs will trigger adoption from those companies that are still holding out. Indeed, it is solutions that answer questions around data, as opposed to compute needs, that will determine which providers offer the most compelling proposition for the enterprise.
The responsibility for security rests mostly with the customer, though, and increasing numbers of customers are deploying cloud visibility and control tools to reduce security failures, according to the analyst firm. Strides in machine learning, predictive analysis and artificial intelligence will accelerate the number of large-scale, highly distributed deployments as they become more feasible and secure to manage. Foolproof security does not exist in any computing environment. Nevertheless, increasing numbers of businesses will feel safer working with the cloud, triggering higher adoption rates by 2020 and paving the way for near-total adoption in the following decade.
The cloud reimagined by edge computing
Mention cloud computing and centralised data centres running thousands of physical servers typically come to mind. However, this vision misses one of the greatest new opportunities for the cloud – distributed cloud infrastructure. As businesses find themselves requiring near-instant access to data and compute resources to serve customers, they are increasingly looking to edge computing.
Edge computing directs certain compute processes away from centralised data centres to points in the network nearer to users, devices and sensors. IDC describes it as a “mesh network of micro data centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet”.
This environment is very valuable for the Internet of Things (IoT), with its requirement to collect and process vast amounts of data in near-real-time, with a very low level of latency. It can lower connectivity costs by sending only the most important information, as opposed to raw streams of sensor data. For example, a utility with sensors on field equipment can analyse and filter the data prior to sending it and taxing network and computing resources.
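The utility example above can be sketched in a few lines. This is a hypothetical illustration, not a real edge platform's API: the field names, thresholds and readings are all assumptions, chosen only to show the filter-at-the-edge pattern of forwarding exceptional data rather than the raw stream.

```python
# Hypothetical edge-side filter: keep only readings outside the normal
# operating band, so only notable data crosses the network to the cloud.
# Field names and thresholds are illustrative assumptions.

def filter_readings(readings, low=10.0, high=90.0):
    """Return only readings whose value falls outside [low, high]."""
    return [r for r in readings if r["value"] < low or r["value"] > high]

raw = [
    {"sensor": "transformer-7", "value": 52.1},
    {"sensor": "transformer-7", "value": 97.4},  # overheating: worth forwarding
    {"sensor": "transformer-7", "value": 49.8},
]

to_forward = filter_readings(raw)
# Only the anomalous reading would be sent upstream; the rest stays local.
```

In a real deployment the thresholds would come from the device's configuration, and the forwarded records would be batched before transmission – but the principle is the same: the edge node taxes the network only with data that matters.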
Edge is not the end of cloud computing, but rather a natural evolution that will see telcos, manufacturers and many organisations employing it as the new decade dawns.
Containers cement their place

Containers, which enable developers to manage and easily migrate software code, have become very popular, and that is not going to change over the coming decade. Forrester estimates that a third of enterprises are testing containers for use in production, while 451 Research forecasts that the application containers market will grow 40% annually to $2.7 billion in 2020. According to a Cloud Foundry report, 53% of organisations are either investigating or using containers in development or in production.
The majority of businesses are leveraging containers to enable portability between cloud services from AWS, Microsoft Azure and Google Cloud as they firm up their DevOps strategies for more rapid software production.
Kubernetes is making waves in container orchestration, building on operating-system-level virtualisation rather than hardware virtualisation. Vendors delivering pragmatic answers without getting caught up in the craze will achieve meaningful market penetration. The hype around containerisation is going to translate into widespread adoption as the decade turns.
Serverless computing grows in popularity
For some time, organisations have developed applications and deployed them on servers they provision themselves. With serverless computing, a cloud provider manages the code execution, runs the code only when required and charges only for the time it runs. With this model, businesses no longer have to worry about provisioning and maintaining servers when putting code into production. (“Serverless” is a somewhat misleading term, as applications still run on servers.)
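To make the model concrete, here is a minimal Lambda-style function handler in Python. The platform invokes the function once per event and bills only for that execution; the event shape used here is an illustrative assumption, not a contract from any real service.

```python
# Minimal serverless-style handler sketch. The provider calls handler()
# for each incoming event; there is no server for the developer to manage.
# The event fields and response shape below are illustrative assumptions.

def handler(event, context=None):
    """Entry point the platform would invoke for each event."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Locally, we can simulate a single invocation by calling it directly:
response = handler({"name": "cloud"})
```

Everything outside this function – scaling, patching, capacity – is the provider's problem, which is exactly the outsourcing of infrastructure the serverless model promises.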
Serverless computing came into being back in 2014 at the AWS re:Invent conference, with Amazon Web Services’ announcement of Lambda, and has recently gained further traction with the open source project Firecracker. Serverless computing is potentially a very big development, with one caveat: not everyone is going to be ready for it. Going serverless requires an overhaul of the traditional development and production paradigm. In effect, it means outsourcing entire pieces of infrastructure – everything apart from the application itself.
This will mean that serverless computing is not going to be an overnight sensation, but more of a route taken when usage increases over time. While existing solutions usually lock customers into a specific cloud provider, the arrival of open source solutions in this space will accelerate and broaden the portfolio of implementations of serverless computing across the industry.
Open source continues its reign
Open source enterprise software has never been more popular. An increasing number of organisations are introducing open source software into their processes and even building entire businesses around it. Black Duck Software’s 2017 survey of executives and IT professionals found that 60% of respondents reported their company’s use of open source had increased over the previous year. Two-thirds of the businesses surveyed contribute to open source projects. The cloud has helped the open source ecosystem thrive: a wide range of open source DevOps tools, aggressive use of build automation and infrastructure platforms such as OpenStack and Kubernetes now underpin application delivery in the cloud.
As cloud adoption increases, open source technologies will carry on boosting innovation for the rest of the 2010s and beyond. Cloud domination has been a staple of this decade and, with so many exciting trends shaping it, it certainly seems the best days for cloud computing are still to come.
Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.