Does DevOps Require The Cloud?

I am in the process of reviewing a draft of a friend’s book on DevOps and security, and one of the points asserted early on is that the cloud is required for DevOps. A cursory search of the Googles seems to support this, at least in concept. But do you have to use cloud services to successfully use DevOps?

What Is DevOps?

It is worth describing my understanding of DevOps: it’s the marriage of development and operations, removing the walls and the “hard bright lines”, and instead having both groups work together on product development, problem solving and general operations. It is about getting rid of the “maintenance programmer”, a dedicated role whose job was to fix the bugs created during product development. Instead, all developers are doing some form of “maintenance”, in that bugs are fixed as they are found. Those same developers are also working on new features, so a developer could find himself fixing a bug in one sprint, and adding a new feature in another.

The idea of blurring the lines, and making developers part of operations, as well as involving operations in development, has led to an evolution in agile product development, as well as in technical and business operations for companies. While most of the early work on agile development focused on new features, and a bit on maintenance, it still tended to have a waterfall-like last step of handing everything off to operations. The DevOps movement’s goal was to remove that last barrier, and to treat operations as just another facet of being a developer.

From the operations side, it means deeper involvement in development of the product, but it also means that developers are now part of the operations team. Sure, you probably won’t give the developers full-on root access to production servers, and they may have to be prevented from seeing some of the customer data. But rather than the usual cycle of “file a bug report trying to describe the fault; developer attempts to reproduce it in their environment; (hopefully) fix it and send it off for a future release”, developers see the actual servers and the actual problems in production. Everyone works together, without the traditional organizational and procedural walls that kept developers and operations separated.

Cloud As Evolution

Some people talk about the cloud as if it is revolutionary. I’ve had one person claim that the cloud vs. your own datacentre is as different as the horse and buggy is from the car. Frankly, I find that naive; it completely ignores reality and history.

Cloud is, in my mind, more of an evolution. It’s a natural next step in infrastructure, in that it has become commoditized. What we call the cloud now could be called a “service bureau” from the mainframe days. There were plenty of organizations that didn’t own their own hardware or run their own datacentres, and instead leased time and space from the likes of ADP or IBM. When UNIX servers started to gain ground, that model fell out of favour, but it never went away entirely. However, the older models did have some limitations, and the biggest was set-up time. It could take weeks or months to get an organization set up on hardware (particularly if the hardware had to be bought and installed first). Mainframe installs were a big deal.

The cloud took that concept and made it scale, made it more responsive to the customer and made it cheap. With virtual servers and automated installs, a cloud provider can have a server ready to go within an hour. Some can provide dedicated hardware instances inside 24 hours. The ability to add capacity, sometimes literally on-demand as load changes, is orders of magnitude more flexible than the old model from the ’70s and ’80s. The cloud has made it possible for a 1-person startup in a basement to have enterprise-grade infrastructure in a matter of hours, for what amounts to pennies per day.

However, we are unlikely to have absolutely everything in the cloud. Why? Because there are cases where, because of a combination of law and regulation, you may have to own your own servers, have them on your own premises, and be in direct control at all times. Granted, these are likely to become more and more rare and exceptional, but these situations do exist, and will continue to exist.

Unicorn or Failure?

One example sometimes used to link DevOps and cloud services is Netflix. In its early days, Netflix attempted to run its own infrastructure. It suffered a series of performance and reliability problems. Eventually, it moved everything into Amazon AWS, shifting the burden of keeping the gear running to Amazon and allowing Netflix to focus on its core business. The proponents of “cloud == DevOps” see this as a victory, and vindication of DevOps and cloud as partners. Netflix is one of the unicorns for DevOps.

Taking a devil’s advocate position, you could look at this as a failure on the part of Netflix. Thousands of companies, including many very, very large ones, are able to build and manage their own infrastructure. They are able to keep it reliable and secure, and to scale it based on demand. Obviously Amazon is capable of this, since it does so with AWS and its own site. That Netflix couldn’t, and felt it was better to use AWS, isn’t necessarily a “victory”, and it doesn’t prove that DevOps requires cloud. I’m not saying this is a failure on the part of Netflix. Balancing the cost of “getting it right” against alternative solutions is responsible and rational. But that they couldn’t attract the talent needed to keep their infrastructure up to scratch isn’t automatically a case to link DevOps and cloud, and isn’t necessarily an absolute “win”.

They Complement Each Other

DevOps is about a balance of speed, agility and accountability. It is about getting releases out quickly, responding to change and problems rapidly, but doing all of it in a way that is traceable, auditable, repeatable and (done properly) secure. None of that, however, is predicated on the cloud as a requirement. Using cloud services can help. Cloud services and DevOps certainly complement each other. Being able to rapidly set up a new server, or add capacity, in an hour or less brings a lot of flexibility to an organization and its services.
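As a toy illustration of “traceable, auditable, repeatable” (not any particular tool’s API; all names here are made up for the sketch), a release step might checksum what it ships and record each release in an audit trail, so a rebuild from identical inputs verifiably produces an identical artifact:

```python
import hashlib

def build_artifact(source: str) -> bytes:
    """Deterministic 'build': the same source always yields the same bytes."""
    return source.encode("utf-8")

def release(source: str, audit_log: list) -> str:
    """Build, checksum, and append an audit record; return the checksum."""
    artifact = build_artifact(source)
    checksum = hashlib.sha256(artifact).hexdigest()
    # Auditability: every release leaves a record of exactly what shipped.
    audit_log.append({"checksum": checksum, "size": len(artifact)})
    return checksum

log = []
first = release("v1.0 source", log)
second = release("v1.0 source", log)  # rebuild from identical inputs
# Repeatability: identical inputs give identical checksums.
assert first == second
print(len(log))  # two audited releases so far
```

The point isn’t the ten lines of Python; it’s that nothing in the pattern cares whether the artifact lands on a cloud instance or a server in your own rack.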

But to imply that you can’t have DevOps without the cloud is like saying you can’t be doing DevOps properly without using Java or Puppet or some other specific technology. DevOps is about automating as much as possible, and having a rapid and reliable process involving all parties to get product and fixes out the door. You can use DevOps principles on an iOS app using nothing but a couple of Macs and the continuous integration tools available in Xcode and OS X. Granted, the deployment bit can’t be automated easily, but automatic deployment isn’t a requirement of DevOps. It’s an aspect that some organizations can take advantage of, but in some industries, your roll-out still may have to proceed in a manual fashion.

Certainly, DevOps benefits from the cloud (as it does from using open source tools, readily available CI/CD technologies and agile methodologies), but I’m not convinced you must use the cloud to successfully use DevOps. In some ways, it is the cloud services that have benefited from the rise of DevOps. But you can easily implement your own successful DevOps approach without it.