My journey of embedding security in a DevOps project
A few months ago, a project I was consulting for was facing a double whammy: it was short-staffed and bound by the cash-preservation restrictions that have become the norm today. The project had to buttress its nascent security with enhancements delivered in as cash-efficient a way as possible.
This blog is based on my work solving the problem using some free tools (as in free beer) to support the project. Of course, I did have some leeway for spending and could have chosen commercial tools, but the long-drawn process that usually accompanies procurement was reason enough to consume free tools in the first iteration. We decided to look into commercial tools with significantly better coverage in subsequent iterations.
The following are some of the common underpinnings of the design of the solution.
- Focus on readily integrated tools rather than the best tool for the purpose, keeping tool configuration to a minimum so that developers don’t spend time configuring a tool that does not help them deliver their bread winner: their application.
- Focus on keeping costs as close to zero as possible. Therefore, some service selection decisions may raise an eyebrow!
- Focus on defence in depth for protection. Some security findings surface across multiple controls; while this may look like noise, a bit of over-communication was considered acceptable.
- Focus on simplifying the integration into a CI/CD pipeline; therefore, build and release tasks were not separated on Azure DevOps.
All artefacts for setting up the security solution are available in my GitHub repo: https://github.com/nth-block/devsecops.
Ring-side view before the deep-dive
Before we deep-dive into the actual solution, I will spend some time explaining the context of the tools this solution used.
While DevSecOps can be approached in a multitude of ways, I used this image from WhiteHat Security as a reference guide. I loved the simple, yet holistic view it takes of the topic.
In view of the rapid turnaround I was aiming for, I chose only a handful of areas that I felt offered the best “bang for the buck”. Hence, I targeted only the highlighted section of the image in the first iteration.
The rationale for choosing IDE SAST (Static Application Security Testing in the development environment) was to ensure that security issues were flagged in the developers’ IDEs so that these vulnerabilities are never committed to the source code repo.
The Build SAST phase was chosen to catch any source-code-level vulnerability that made its way past, or went undetected in, the IDE SAST stage.
Build DAST (Dynamic Application Security Testing during the software build/deployment phase) was chosen to ensure there is at least one runtime analysis of the application to unearth vulnerabilities that only manifest at runtime.
The Build SCA (Software Composition Analysis) phase was chosen to validate that the application’s dependencies were checked for known vulnerabilities.
The final check was to scan the application periodically once it was in operation, to ensure that any vulnerabilities arising from configuration drift during the operations phase are flagged.
The project’s technology stack was completely based on Microsoft offerings. The application itself was an ASP.NET Framework (4.6.1) MVC application, hosted in Azure App Service. The developers collaborated on the software using Azure DevOps, where the deployment pipelines were also defined, and used Visual Studio 2015 for development.
Since this was a proof of concept, this iteration of the security solution used the classic pipelines in Azure DevOps, which aided visual explanation of the pipeline. YAML-based definitions should work just fine, but I have not personally tested them.
Note: In this article, I have used generic code that reproduces some common issues I face in my role. The code has no proprietary or protected IP.
For the IDE SAST piece, the tool I zeroed in on after evaluating a handful of options was Security Code Scan, which is available as an open-source project on GitHub. Refer to the project’s GitHub Pages site here: https://security-code-scan.github.io/
The tool is extremely simple to add via the NuGet Package Manager, but the flip side is that every developer had to add it manually into their IDE. Once this initial hurdle was crossed, however, the tool is efficient and performs a considerable number of checks on the software to detect vulnerabilities. The image below shows, on its right, the rules that are available out-of-the-box in Security Code Scan.
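For reference, adding the analyser boils down to a single command in the NuGet Package Manager Console (the package name below is the one published on NuGet at the time of writing; check the project page for the current name and version):

```powershell
# Adds the Security Code Scan Roslyn analyser to the current project
Install-Package SecurityCodeScan
```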
Security Code Scan integrates into the IDE, monitors the developer’s code for any vulnerabilities that are accidentally introduced, and warns the developer of the impending security issue in the Warnings section of Visual Studio. See the screenshot below, where bad usage of cryptography is flagged.
Azure DevOps has a built-in task for leveraging the OWASP Dependency-Check utility to perform a vulnerability analysis of the project’s dependencies at the build phase, and its configuration is extremely straightforward.
The results were configured to be made available in HTML format. However, other machine-friendly formats can also be leveraged to enable automation. The results are lucid and easy to understand.
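For teams preferring YAML pipelines, the equivalent step might look like the sketch below. The task name, version and parameters are illustrative, not verified against the extension’s current schema, and the project name is a placeholder:

```yaml
steps:
# Scan the project's dependencies for known CVEs and emit an HTML report.
# Task name/version are illustrative; verify against the installed extension.
- task: dependency-check-build-task@6
  displayName: 'OWASP Dependency Check'
  inputs:
    projectName: 'MyMvcApp'                    # hypothetical project name
    scanPath: '$(Build.SourcesDirectory)'
    format: 'HTML'                             # JSON/XML also possible for automation
```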
The tool of choice for Build SAST was SonarQube, owing to the integration already available between SonarQube and the company’s Azure DevOps platform. While SonarQube is still a maturing product in vulnerability detection, it was the best fit while we continued evaluating more capable SAST tools.
For the PoC, I deployed SonarQube in a container on an Azure VM that was also used for other tasks. The same VM ran the DAST tooling in a container too; we will delve into that setup later in this post.
SonarQube credentials were configured as a service connection in Azure DevOps and added to the Build stage of the pipeline.
Upon execution of the build job, the ratings of the code scan in SonarQube are available in the pipeline execution status itself. Furthermore, a detailed report can be obtained by visiting the SonarQube portal, where issues are highlighted and remediation options are suggested.
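In YAML form, the three SonarQube steps (the same three steps mentioned in the summary at the end of this post) would look roughly like this; the service connection name and project key are placeholders:

```yaml
steps:
# Prepare the analysis before the build step runs
- task: SonarQubePrepare@4
  inputs:
    SonarQube: 'sonarqube-connection'   # name of the Azure DevOps service connection
    scannerMode: 'MSBuild'              # .NET Framework solutions use the MSBuild scanner
    projectKey: 'my-mvc-app'            # hypothetical project key
# ... the MSBuild/VSBuild task that builds the solution goes here ...
# Run the analysis after the build completes
- task: SonarQubeAnalyze@4
# Publish the quality gate result back into the pipeline run
- task: SonarQubePublish@4
  inputs:
    pollingTimeoutSec: '300'
```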
To perform the DAST scan on the application, a Docker container running OWASP ZAP in API mode was set up on the same VM that hosted the SonarQube instance.
An OWASP ZAP scan task was added to the post-deployment tasks in the Azure DevOps pipeline, along with a task to persist the results file in Azure Blob Storage.
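If no marketplace task is available in your organisation, the ZAP daemon can also be driven directly over its REST API from a script step. The sketch below is a simplification (the host name, target URL and API key variable are placeholders, and a real pipeline would poll the scan status before fetching the report):

```yaml
steps:
- task: Bash@3
  displayName: 'OWASP ZAP scan via REST API'
  inputs:
    targetType: inline
    script: |
      # zap.example.com and ZAP_API_KEY are placeholders
      ZAP="https://zap.example.com"
      TARGET="https://myapp.azurewebsites.net"
      # Spider the deployed app, then start an active scan against it
      curl -s "$ZAP/JSON/spider/action/scan/?apikey=$(ZAP_API_KEY)&url=$TARGET"
      curl -s "$ZAP/JSON/ascan/action/scan/?apikey=$(ZAP_API_KEY)&url=$TARGET"
      # Fetch the HTML report, ready to be copied to Blob Storage by the next task
      curl -s "$ZAP/OTHER/core/other/htmlreport/?apikey=$(ZAP_API_KEY)" -o zap-report.html
```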
For the application to be periodically scanned for any drift, a new Azure DevOps pipeline was created with only the OWASP ZAP scan task. The pipeline was scheduled to run at 2 AM UTC every Sunday.
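In a YAML pipeline, that weekly schedule is a small trigger block (the branch name below is a placeholder):

```yaml
schedules:
- cron: "0 2 * * 0"          # 02:00 UTC every Sunday
  displayName: Weekly drift scan
  branches:
    include:
    - master                 # placeholder branch name
  always: true               # run even if there were no code changes
```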
There is an inefficiency in this pipeline: the application’s code is fetched from the repo even though it is not needed at all. We mooted the possibility of writing an Azure Function to trigger the OWASP ZAP scan, but the overhead of managing its development and the additional Azure resources were reasons enough to justify a small inefficiency in the pipeline.
The OWASP ZAP scan report contains the vulnerabilities that were identified during the scan.
This concluded the setup of the most basic security enhancements to the project’s pipelines.
Setting up SonarQube (SAST) and OWASP ZAP (DAST) with Docker Compose
NOTE: This task must be done before setting up the Azure DevOps pipeline, but it is presented later in this blog to maintain focus on the process of embedding security controls in DevOps pipelines.
We repurposed a frequently used, but not heavily loaded, Azure VM to host the SonarQube and OWASP ZAP software. The VM ran Ubuntu, and we installed Docker along with Docker Compose to orchestrate the services.
Since this was a single-node instance, we did not use Docker Swarm or Kubernetes. Azure Kubernetes Service was also a candidate, but it was voted out in favour of a simpler solution using Docker Compose.
The Compose service consisted of three containers. Two of them, SonarQube and OWASP ZAP, ran the lean images offered for both products on the Docker Hub registry. Setting up SonarQube entailed a small amount of configuration but was none too difficult; we got it right in a couple of attempts.
The third, an NGINX container, was used as a reverse proxy, routing requests to SonarQube or the OWASP ZAP API based on the domain name in the request. DNS entries were registered in Azure App Service Domains for SonarQube and OWASP ZAP.
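The name-based routing amounts to two small NGINX server blocks. A simplified sketch follows, where the domain names are placeholders and the upstream hosts are assumed to be the Compose service names (ZAP’s port depends on how the daemon is started):

```nginx
# Route sonarqube.example.com to the SonarQube container
server {
    listen 80;
    server_name sonarqube.example.com;
    location / {
        proxy_pass http://sonarqube:9000;   # Compose service name, SonarQube default port
        proxy_set_header Host $host;
    }
}
# Route zap.example.com to the ZAP API daemon
server {
    listen 80;
    server_name zap.example.com;
    location / {
        proxy_pass http://zap:8080;         # placeholder port for the ZAP daemon
        proxy_set_header Host $host;
    }
}
```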
The schematic of the Docker Compose services is below.
Once the VM was set up with the necessary prerequisites (Docker and Docker Compose), starting the services is an easy task.
# docker-compose up -d
The Compose file is in the GitHub repo referenced earlier in this blog.
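As an indication of its shape, a minimal sketch of such a file could look like the following; the image tags, API key and ports are illustrative, and the repo holds the actual version:

```yaml
version: "3"
services:
  sonarqube:
    image: sonarqube:lts              # illustrative tag
    expose:
      - "9000"
  zap:
    image: owasp/zap2docker-stable    # illustrative image name
    # Start ZAP as a headless API daemon; the API key here is a placeholder
    command: zap.sh -daemon -host 0.0.0.0 -port 8080 -config api.key=changeme
    expose:
      - "8080"
  nginx:
    image: nginx:stable
    ports:
      - "80:80"                       # single public entry point
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - sonarqube
      - zap
```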
Observations, learnings and future investment areas
With the limited time at my disposal, it was fairly evident that some of the choices were not optimal. As the solution expands and dedicated budget becomes available, the focus must be on harnessing better SaaS tooling for SAST. An additional area of investment is SCA that looks not only for security vulnerabilities but also for libraries that may impose licensing restrictions not originally anticipated.
Furthermore, it would be foolhardy to assume that the test cases I used in the sample are what we face in the real world. Therefore, the mileage of the tools in production will differ.
Summary of Steps
- Have all developers install Security Code Scan in their IDEs.
- Deploy the Docker Compose services in a VM.
- Build the pipeline to include the 5 tasks: IDE SAST, Build SCA, Build SAST, Build DAST and Deployment DAST. Note: the Build SAST with the current tooling has 3 steps to add.
- Configure the DNS names for the SonarQube and OWASP ZAP services.