Author: Anand Shah
At the top of wish lists for development teams and the IT leaders who manage them are methodologies in step with the demands of today’s business requirements. As it turns out, such methodologies lie at the intersection of DevOps and containers. It’s a place where mission-critical applications become much easier to release and update – essentially ‘at the speed of business’.
If you ask the heads of any enterprise development team for an opinion on traditional or so-called ‘waterfall’ development methodologies, the likely response will be, “There’s got to be a better way.” For most of these development managers, that better way is agile development, or Rapid Application Development (RAD). Where waterfall generally defines a linear, step-by-step approach, RAD represents an iterative, team-based approach.
Based on these admittedly simplified definitions, RAD would be the preferred methodology. But RAD is not without its own warts. What if some developers aren’t as dedicated as they need to be to make RAD work? What if they aren’t in close proximity to other team developers? What if the nature of iterative development leads to constant rethinking or refactoring of initial project designs and architecture?
The good news is that these and other shortcomings of RAD do not mean a re-embrace of waterfall methodologies and all of their unpredictability, burdensome manual operations, OS and infrastructure dependence, and resulting silos of information. This is where the combination of DevOps and containers comes in.
Further, this DevOps/containers partnership is proving to be an essential element in the cloud native movement. With cloud native, applications are specifically designed to provide a highly automated management experience across public, private and hybrid clouds. Leveraging cloud in this way, organizations seek to boost both the scalability and availability of applications through a combination of life cycle automation, self-service and on-demand provisioning of resources.
Cloud native goes beyond RAD, and the combination of DevOps and containers is a primary driver.
Quick take: What is DevOps?
The goal of DevOps is as straightforward as it is universally lauded: to shorten the systems development life cycle while enabling faster delivery of features, fixes and updates – all in close alignment with overall business goals. The key is a seamless relationship between development teams and IT operations, which historically have been woefully out of sync. DevOps is central to defining a culture that looks strategically at the software delivery chain in its entirety, harnessing shared services while promoting the use of new development tools and best practices.
Quick take: What are containers?
Historically, enterprise software was delivered with a lot of other ‘baggage’ in the form of infrastructure dependencies, libraries and other components required for it to run properly. To decouple applications from these dependencies, IT began placing each application and its dependencies in its own virtual machine. But these VMs are often multiple gigabytes in size. What’s more, they do little to enable goals such as portability, rapid software updates or continuous integration and delivery.
Enter containers, relatively new arrivals on the development scene. Containers allow developers and development teams to package software – whether a Java VM, an application server or the application itself – along with everything needed to run it into a single executable unit. They make it simpler to host applications within portable environments. Containers are tiny in comparison to VMs – megabytes in size vs. gigabytes. And because they use so few resources compared with VMs, they start up almost instantly.
Viewed this way, a container is independent of the underlying infrastructure and therefore highly portable – say between the test system and the production system. Developers have come to see containers as a very efficient way to combine software components into application and service stacks that dovetail with business requirements for keeping software maintained and continuously updated.
Though the container concept has been around for decades, it was the emergence of Docker in 2013 that really ignited widespread container adoption. An open source tool, Docker makes it much easier to create, deploy and run applications using containers. Because it is a tool designed for both developers and sys admins, it is a key element of most DevOps efforts.
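To make the idea concrete, here is a minimal sketch of a Dockerfile – the recipe Docker uses to build a container image. The base image and application name (`app.jar`) are illustrative assumptions, not details from the article:

```dockerfile
# Hypothetical example: containerizing a Java application
# Start from a small, well-known base image
FROM eclipse-temurin:17-jre-alpine

# Copy the application and its dependencies into the image
WORKDIR /opt/app
COPY app.jar .

# The same command then runs identically on a laptop, a test server or production
ENTRYPOINT ["java", "-jar", "app.jar"]
```

Built once with `docker build`, the resulting image carries its dependencies with it – which is precisely what makes it portable between test and production systems.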
Raising software by the power of 2
There is no requirement that DevOps and containers be used in concert. However, the two make such a compelling combination that using them together has become nearly universal.
For starters, DevOps and containers together create a relatively easy solution for consistent development, testing and production environments. When developers and teams write and then deploy applications inside containers, the environment stays the same at every phase of the delivery chain. When development, testing and administration teams all work against the exact same container environment, collaboration becomes far easier.
Containers also support regular updates, a core feature of continuous development and delivery in DevOps. They allow developers to update one part of an application simply by restarting its container (which, as mentioned, is nearly instantaneous) without interrupting the rest of the application.
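As an illustration of updating one part independently, consider a hypothetical Docker Compose file – the service names and images below are assumptions for the sketch, not anything described in the article:

```yaml
# docker-compose.yml – hypothetical two-service application
services:
  web:                       # front end, updated frequently
    image: example/web:2.4
    ports:
      - "8080:80"
  db:                        # database keeps running while web is replaced
    image: postgres:16
    volumes:
      - dbdata:/var/lib/postgresql/data
volumes:
  dbdata:
```

Bumping the `web` image tag and re-running `docker compose up -d web` recreates only that container; the `db` service is left untouched.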
Finally, containers enable the agility DevOps needs to switch easily between programming frameworks and development platforms. That agility stems from containers’ language- and platform-agnostic nature. As a result, teams can create any type of application, written in any language, within a container. Plus, containers can be moved between different kinds of hosts.
Provenir: DevOps and Containers @work
As a leading provider of risk analytics solutions, Provenir leverages containers and DevOps throughout the continuous improvement of its various offerings.
Provenir teams also use Jenkins, the open source automation server that deftly orchestrates automated aspects of the development process and creates a virtually seamless workflow. By using both Jenkins and containers to automate critical processes, Provenir is able to quickly and efficiently onboard new clients.
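A declarative Jenkins pipeline of the kind described might look like the following sketch. The stage names, image name and test script are illustrative assumptions, not Provenir’s actual configuration:

```groovy
// Jenkinsfile – hypothetical pipeline building, testing and shipping a container image
pipeline {
    agent any
    stages {
        stage('Build') {
            // Bake the application into an image tagged with the build number
            steps { sh 'docker build -t example/app:${BUILD_NUMBER} .' }
        }
        stage('Test') {
            // Run the test suite inside the same image that will be deployed
            steps { sh 'docker run --rm example/app:${BUILD_NUMBER} ./run-tests.sh' }
        }
        stage('Push') {
            // Publish the image so downstream environments can pull it
            steps { sh 'docker push example/app:${BUILD_NUMBER}' }
        }
    }
}
```

Because every stage operates on the same container image, the artifact that passes testing is byte-for-byte the one that reaches production.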
Redesigning Loan Origination with Microservices
This 30-minute, on-demand webinar outlines the basics of microservices in financial technology and shows just how easy it is to turn your scorecards, risk models, and other components into services for use in a loan origination and decisioning process.
Watch the webinar