Every organization needs a DevOps approach to product and software delivery and deployment. Continuous Integration and Continuous Delivery with Jenkins is a step toward that goal. Releasing software to users frequently is usually a time-consuming and painful process. Continuous Integration and Continuous Delivery help organizations become more agile by automating and streamlining the steps involved in going from an idea, a change in the market, or a business requirement to a product delivered to the customer. In this blog, we will look at the complete lifecycle of integration and deployment.
What is DevOps?
DevOps is a software development approach that involves Continuous Development, Continuous Testing, Continuous Integration, Continuous Deployment and Continuous Monitoring throughout its development lifecycle.
There are four different stages involved in DevOps shown in the figure below.
- Version Control
- Continuous integration
- Continuous delivery
- Continuous deployment
Fig: DevOps Stages
Version control is the source code management stage; it maintains the different versions of the code. Continuous integration covers continuously building, compiling, validating, reviewing, unit testing and integration testing the software. Continuous delivery, together with continuous testing, deploys the built application to test servers. Continuous deployment deploys the tested application to the release servers, using configuration management and containerization.
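As a rough sketch, the four stages above can be mapped onto a declarative Jenkins pipeline. This is an illustrative skeleton, not a definitive implementation: the repository URL and the shell scripts (`build.sh`, `deploy-to-test.sh`, `deploy-to-prod.sh`) are placeholders you would replace with your own.

```groovy
// Jenkinsfile: a minimal sketch mapping the four DevOps stages.
// Repository URL and shell commands are placeholders.
pipeline {
    agent any
    stages {
        stage('Version Control') {
            steps {
                // pull the latest code from source control
                git 'https://example.com/your-org/your-app.git'
            }
        }
        stage('Continuous Integration') {
            steps {
                sh './build.sh'            // compile, validate, run unit tests
            }
        }
        stage('Continuous Delivery') {
            steps {
                sh './deploy-to-test.sh'   // deploy the build to a test server
            }
        }
        stage('Continuous Deployment') {
            steps {
                sh './deploy-to-prod.sh'   // release the tested build
            }
        }
    }
}
```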
Continuous Integration and Continuous Delivery Pipeline
A CI/CD pipeline resembles the complete software development life cycle. In the diagram below you can see that the pipeline is a logical demonstration of how the software moves through the various phases of the lifecycle.
Fig: CI/CD Pipeline Flow
Before the product/software is delivered to the customer, it goes through the following phases:
In the first phase of the pipeline, the developers commit their code and the code goes into the version control system. From version control, it proceeds to the build phase, where the code is compiled: the various features of the code are gathered from the different branches of the repository, merged, and finally compiled.
The code then proceeds to the unit test phase, where various types of testing run depending on the type of product/software. In unit testing, you break the complete software/product into small individual units and test each unit to confirm it works properly. Once unit testing is done, integration testing takes place, where the individually tested units are integrated with one another.
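The split between unit and integration testing can be sketched as two pipeline stages. This assumes a Maven project (the commands differ for other build tools), and uses Jenkins' `junit` step to publish the test reports:

```groovy
// Sketch: separate unit-test and integration-test stages,
// assuming a Maven project. Commands and report paths will
// differ for other build tools.
stage('Unit Test') {
    steps {
        sh 'mvn test'          // run fast, isolated unit tests
    }
    post {
        always {
            // publish unit-test results so failures are visible in Jenkins
            junit 'target/surefire-reports/*.xml'
        }
    }
}
stage('Integration Test') {
    steps {
        sh 'mvn verify'        // exercise the integrated units together
    }
}
```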
After testing, your product/software moves to the deploy phase, where it is deployed to a testing server so you can view and review the running application.
In the later phases (automated testing, deployment to production, and measure and validate), if any error occurs, the complete lifecycle starts again from scratch and the developers work to resolve the issue.
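When an error sends the cycle back to the start, the developers first need to hear about it. One common way to wire this up in Jenkins is a `post { failure { ... } }` block; this sketch assumes a mail server is already configured in Jenkins, and the address is a placeholder:

```groovy
// Sketch: notify developers when any stage fails, so the cycle
// can restart from the beginning. Assumes Jenkins' mail settings
// are configured; the recipient address is a placeholder.
post {
    failure {
        mail to: 'dev-team@example.com',
             subject: "Pipeline failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
             body: "See ${env.BUILD_URL} for the failing stage and logs."
    }
}
```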
A pipeline, then, is a logical series of steps that defines how the SDLC occurs.
Before Continuous Integration
Let us imagine a scenario in which the complete source code of an application was built and then deployed on a test server for testing. It sounds like a perfect way to develop software, but this process has many flaws. I will explain them one by one:
- Developers had to wait until the complete software was developed to see the test results.
- There was a high possibility that the test results would show multiple bugs. It was tough for developers to locate those bugs because they had to check the entire source code of the application.
- It slowed the software delivery process.
- Continuous feedback on things like coding or architectural issues, build failures, test status and release uploads was missing, so the quality of the software could degrade.
- The whole process was manual, which increased the risk of frequent failure.
It is evident from the problems stated above that not only did the software delivery process become slow, but the quality of the software also went down, leading to customer dissatisfaction. To overcome this chaos, there was a dire need for a system where developers could continuously trigger a build and test of every change made in the source code. This is what Continuous Integration (CI) is all about.
Jenkins For Continuous Integration
Continuous Integration is the most important part of DevOps and is used to integrate the various DevOps stages. Jenkins is the most popular Continuous Integration tool. It is an open-source automation server written in Java, with plugins built for Continuous Integration purposes. Jenkins is used to build and test software projects continuously, making it easier for developers to integrate changes into the project and for users to obtain a fresh build. It also allows you to continuously deliver your software by integrating with a large number of testing and deployment technologies.
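Building and testing "continuously" means Jenkins reacts to every change in the repository. One simple way to sketch this is SCM polling in a declarative pipeline; a push webhook from the Git server is the usual production alternative. The schedule and build command below are illustrative placeholders:

```groovy
// Sketch: make Jenkins build on every change by polling the
// repository roughly every five minutes. A webhook from the Git
// server is the more efficient alternative in practice.
pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // cron-style schedule, hash-spread
    }
    stages {
        stage('Build') {
            steps {
                sh './build.sh'  // placeholder build command
            }
        }
    }
}
```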
Now, consider that you have to automate the entire process from the time the development team commits the code to the time the code is deployed onto the production servers. Every step in this pipeline has to be automated, and to run the entire SDLC in DevOps (automated) mode, we need an automation tool.
Jenkins is one such automation tool. It provides various interfaces and tools to automate the entire development lifecycle. Suppose you have a Git repository where the development team commits its code; Jenkins takes over from there. Jenkins pulls the code and moves it into the commit phase, where changes from every branch are brought together.
Jenkins then moves the code into the build phase, where it compiles the code. After the code is compiled, validated and reviewed, it is tested, and once all the tests pass it is packaged into the application, which could be either a WAR file or a JAR file.
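The packaging step can be sketched as one more pipeline stage. This again assumes a Maven project, where `mvn package` produces the WAR or JAR under `target/`; `archiveArtifacts` keeps the packaged file with the build record:

```groovy
// Sketch: package the tested code into a WAR or JAR and keep the
// artifact with the build. Assumes a Maven project layout.
stage('Package') {
    steps {
        sh 'mvn package'   // produces target/*.war or target/*.jar
        archiveArtifacts artifacts: 'target/*.jar, target/*.war',
                         allowEmptyArchive: true
    }
}
```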
Jenkins' role ends once the application is packaged. If the product then has to be delivered, we need a delivery tool, and for this we can use Docker. Docker is a tool designed to make it easier to create, deploy, and run applications using containers. It takes only a few seconds to create an entire containerized server and deploy something on it. To deploy software, we need an environment that replicates the production environment, and that is exactly what Docker provides.
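Handing the packaged application over to Docker can also live in the pipeline. This sketch assumes a `Dockerfile` in the repository root and a Docker daemon available on the Jenkins agent; the image name is a placeholder:

```groovy
// Sketch: build the packaged application into a Docker image and
// start a production-like container. Assumes a Dockerfile in the
// repo root and Docker installed on the agent; image name is a
// placeholder.
stage('Deliver') {
    steps {
        // bake the WAR/JAR into an image, tagged with the build number
        sh 'docker build -t your-app:${BUILD_NUMBER} .'
        // run it detached, mapping the app port to the host
        sh 'docker run -d -p 8080:8080 your-app:${BUILD_NUMBER}'
    }
}
```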
Fig: Working Flow of Jenkins
This blog summarizes how you can readily use integrated pipelines in your Jenkins projects. Automating each gate and step in a pipeline lets you feed the results of your activities back to the team visibly, so you can react quickly when failures occur. The ability to continually iterate on what you put in your pipeline is a great way to deliver quality software fast. Use pipeline capabilities to easily create container applications on demand for all of your build, test, and deployment requirements.