# SENG 426 - Group 13 - Deliverable 2
Aomi Jokoji
The deliverable of this phase is a report containing:
1. A step-by-step demonstration of the Docker container being deployed with screenshots
2. An explanation of (1) what are Docker Containers and Images, (2) what is the difference of purpose between the Docker and the Docker-compose tools and (3) how is the Neptune app image created including which tools are involved in the process;
3. A step-by-step description of the Jenkins and the GitLab setup with screenshots and an explanation for each configuration choice and action taken;
4. The description of every change made to the application files in order to support the Continuous Integration process, including the Jenkinsfile and the Docker file(s);
5. Screenshots of the SonarQube report and issues pages;
6. Screenshots showing the build history, trends and other Jenkins reports;
7. An example of a notification when a build, scan, deployment or test fails;
8. A commentary on the utility of Continuous Integration (CI) for software development;
9. A small proposal (one or two paragraphs) on how the CI structure could be improved or otherwise augmented in this scenario. For instance, mention how stages could be changed or reorganized and which tools could be added or better used so the CI process is improved;
10. A commentary on the GitHub flow model and a comparison with other commonly used branching models.
# Report
## Docker
#### Creating the NeptuneBank Docker image
```sh
./mvnw -Pprod package jib:dockerBuild
```

#### Deploying the MySQL Docker container
```bash
docker-compose -f src/main/docker/mysql.yml up -d
```

#### Deploying the NeptuneBank Docker container
```sh
docker-compose -f src/main/docker/app.yml up -d
```



### What are Docker Containers and Images
A Docker image is a read-only file that contains all of the source code, dependencies, and tools required for a Docker container to start and run. These images allow for a consistent development environment across different machines and operating systems.
Docker containers are running instances of Docker images, isolated from the rest of the system they are deployed on. This isolation improves security for the host machine, since a container depends on nothing but what was packaged into its image.
After a container is deployed, Docker adds a thin read-write layer on top of the image, and snapshots of the container's state can be saved as new image layers. These saved layers make it easy to revert a container to a known working state without losing much of the current state.
[Docker Image/Container reference](https://phoenixnap.com/kb/docker-image-vs-container)
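The image/container relationship above can be sketched with a few illustrative Docker commands; this is a transcript-style example, and the image and container names are placeholders, not taken from the project:

```sh
docker pull mysql:8.0                            # fetch a read-only image
docker run -d --name example-db \
    -e MYSQL_ALLOW_EMPTY_PASSWORD=yes mysql:8.0  # start an isolated container from the image
docker ps                                        # containers are running instances of images
docker commit example-db example-db:snapshot     # save the container's writable layer as a new image
```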
### What is the difference of purpose between the Docker and Docker-Compose tools
The Docker tool is mainly used for building, running, and configuring individual containers, as well as other container management tasks. It reads a Dockerfile, builds an image from it, and configures a container and its environment to match that specification. The docker-compose tool, by contrast, is used for defining all of the components of a multi-container system in a single file: it invokes Docker to create each container the system requires and configures the containers to communicate with each other, which makes it well suited to defining complex stacks.
[Dockerfile/Docker-compose reference](https://www.techrepublic.com/article/what-is-the-difference-between-dockerfile-and-docker-compose-yml-files/)
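As a hedged sketch of the difference, a hypothetical Compose file wiring an app container to a database container might look like this (service names, image tags, and ports are illustrative, not the project's actual files):

```yaml
# docker-compose sketch: two cooperating services defined in one file
services:
  app:
    image: neptunebank                 # image previously built by the docker tool
    environment:
      - SPRING_DATASOURCE_URL=jdbc:mysql://mysql:3306/neptunebank
    ports:
      - 8080:8080
  mysql:
    image: mysql:8.0
    environment:
      - MYSQL_DATABASE=neptunebank
```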
#### How the Neptune app image is created and which tools are involved in the process
The Neptune app image is created through the normal build process for the Java application. The app is built on [JHipster](https://www.jhipster.tech/), a full-stack development framework; for the Neptune app, Spring Boot is used on the back end.
[Maven](https://maven.apache.org/) is the software project management tool that drives the build. Maven kicks off the Docker image build through [Jib](https://github.com/GoogleContainerTools/jib/tree/master/jib-maven-plugin#configuration), a Maven plugin for creating Docker images. Because the `jib:dockerBuild` goal is specified, the local Docker daemon is used to build the NeptuneBank image.
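For illustration, a Jib plugin entry in `pom.xml` typically looks like the following sketch; the version and target image name here are assumptions, not necessarily what the Neptune project uses:

```xml
<!-- Jib builds the image without a Dockerfile; jib:dockerBuild uses the local daemon -->
<plugin>
  <groupId>com.google.cloud.tools</groupId>
  <artifactId>jib-maven-plugin</artifactId>
  <version>2.4.0</version>
  <configuration>
    <to>
      <image>neptunebank</image>
    </to>
  </configuration>
</plugin>
```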
### Jenkins/GitLab Setup
### Changes to Application Files
#### Jenkinsfile
Changes to the `Jenkinsfile`:
The following code snippet builds the Neptune Bank app in production.
```groovy
pipeline {
    agent any
    stages {
        stage("Maven build") {
            steps {
                sh "./mvnw -Pprod clean package"
            }
        }
```
The following code snippet uses SonarQube to scan the project for bugs, vulnerabilities, and duplications. The correct credentials must be used to provide SonarQube with access to the project.
```groovy
        stage("SonarQube") {
            steps {
                sh "./mvnw sonar:sonar -Dsonar.projectKey=seng426-2020-group13 \
                    -Dsonar.host.url=https://sonarqube.seng.uvic.ca \
                    -Dsonar.login=10f8b766a9b4917021817a8fe03582a43db4bc6b"
            }
        }
```
The Selenium integration tests must run against the deployed application. For this reason, the compiled .jar file is moved to a new folder and the integration tests are run in parallel with the deployment. The `failFast` flag is set to true so that when one parallel stage fails, the others are aborted. The following snippet from the Jenkinsfile moves the compiled project to a new folder called `deploy`, and then runs the integration tests against the deployed application.
```groovy
        stage("Run Deploy and Integration Tests") {
            failFast true
            parallel {
                stage("Deploy") {
                    steps {
                        sh "mkdir deploy"
                        sh "mv target/neptunebank-app*.jar deploy/"
                        sh "java -jar deploy/neptunebank-app*.jar &"
                    }
                }
                stage("Integration Tests") {
                    steps {
                        sh "sleep 45"
                        wrap([$class: 'Xvfb', autoDisplayName: true]) {
                            sh "./mvnw verify"
                        }
                    }
                }
            }
        }
    }
```
Once the project has been deployed and tested, the deploy directory that was created for integration testing is deleted, and a message is logged to declare the build finished. Information about the status of the build is also logged, and curl is used to notify the team of the build status via a Slack webhook.
```groovy
post {
    always {
        sh "rm -rf deploy"
        deleteDir()
        echo "Build Finished!"
    }
    fixed {
        echo "Build completed Successfully"
        sh "curl -X POST -H 'Content-type: application/json' --data '{\"text\":\"The job ${env.JOB_NAME} with build number ${env.BUILD_NUMBER} completed\"}' https://hooks.slack.com/services/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"
        sh "curl -X POST -H 'Content-type: application/json' --data '{\"text\":\" :heavy_check_mark: Passing! More info at ${env.BUILD_URL}\"}' https://hooks.slack.com/services/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"
    }
    failure {
        echo "Build Failed!"
        sh "curl -X POST -H 'Content-type: application/json' --data '{\"text\":\"The job ${env.JOB_NAME} with build number ${env.BUILD_NUMBER} failed\"}' https://hooks.slack.com/services/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"
        sh "curl -X POST -H 'Content-type: application/json' --data '{\"text\":\" :warning: It failed, more info at ${env.BUILD_URL}\"}' https://hooks.slack.com/services/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"
    }
}
```
#### Spring Boot Profile - src/main/resources/config/application-prod.yml
The datasource URL, username, and password were changed to access the Group 13 server with the correct credentials.

The server port was also changed to the port designated for Group 13.
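The edited fields follow the usual Spring Boot shape; in sketch form (all values below are placeholders, not the real credentials):

```yaml
# application-prod.yml sketch; angle-bracket values are placeholders
spring:
  datasource:
    url: jdbc:mysql://<group13-host>:<group13-db-port>/neptunebank
    username: <group13-user>
    password: <group13-password>
server:
  port: <group13-app-port>
```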

#### src/main/docker/mysql.yml
**docker-compose**
The MySQL compose file was updated to use the Group 13 database and credentials, and to bind to an unused port.
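In sketch form, the edited service definition resembles the following (the image tag, credentials, and port are placeholders):

```yaml
# mysql.yml sketch; angle-bracket values are placeholders
services:
  neptunebank-mysql:
    image: mysql:8.0
    environment:
      - MYSQL_USER=<group13-user>
      - MYSQL_PASSWORD=<group13-password>
      - MYSQL_DATABASE=neptunebank
    ports:
      - <group13-db-port>:3306
```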

#### src/main/docker/app.yml
**docker-compose**
The Spring datasource URL was updated to point at the Group 13 database.
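The change amounts to one environment entry in the app service; the host and port below are placeholders:

```yaml
# app.yml sketch; angle-bracket values are placeholders
services:
  neptunebank-app:
    environment:
      - SPRING_DATASOURCE_URL=jdbc:mysql://<group13-host>:<group13-db-port>/neptunebank
```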

#### sonar-project.properties
This file sets the properties used to access SonarQube. The host URL, project key, and login token were updated to reflect the Group 13 credentials.
```
sonar.projectName=NeptuneBankApp
sonar.projectVersion=1.0.0
sonar.host.url=https://sonarqube.seng.uvic.ca
sonar.projectKey=seng426-2020-group13
sonar.login=10f8b766a9b4917021817a8fe03582a43db4bc6b
```
### SonarQube
SonarQube is a tool for statically inspecting code quality, reporting bugs, vulnerabilities, code smells, test coverage, and code duplication. SonarQube was added to the Jenkins CI pipeline to report this information automatically during development.
The following figure shows the code quality overview that SonarQube provides.

The following figure gives an overview of the issues found by SonarQube.

The following image depicts the vulnerabilities found, categorized according to the Open Web Application Security Project (OWASP) Top 10.

### Build History
Example of history of Jenkins builds:

Example of build trends:

### Notifications


Notifications are handled with a Slack webhook: the post stage in Jenkins sends a message via curl based on the status of the build.
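The payload construction can be sketched in plain shell; the job name and build number below are placeholders for the values Jenkins interpolates at run time, and the webhook URL is elided:

```shell
JOB_NAME="neptune-ci"      # placeholder for ${env.JOB_NAME}
BUILD_NUMBER="42"          # placeholder for ${env.BUILD_NUMBER}
# JSON body matching the message format used in the Jenkinsfile
PAYLOAD="{\"text\":\"The job ${JOB_NAME} with build number ${BUILD_NUMBER} completed\"}"
echo "$PAYLOAD"
# The pipeline then POSTs this payload to the Slack webhook:
# curl -X POST -H 'Content-type: application/json' --data "$PAYLOAD" "$WEBHOOK_URL"
```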
### Utility of CI
CI improves code quality by ensuring that every contribution, no matter how small or seemingly inconsequential, works properly as part of the larger codebase. By taking the tasks of merging, testing, and reviewing out of developers' hands and running them automatically, CI speeds up the development process and introduces a consistent standard of quality, since all commits are tested to the same degree. CI also enables better monitoring of code outside of individual commits, allowing better upkeep than manual testing could provide. For example, if the code has errors that only surface over time or across multiple runs, regularly scheduled CI runs can find them and notify the team through the relevant notification channels.
### CI Improvements
Some improvements to the CI structure are still possible, since our implementation is quite basic at this point. First, results could be presented better by generating more build artifacts, such as HTML output of the test results at the end of the Jenkins run.
Another possible change to the CI flow is giving each branch its own pipeline; currently we must specify which branch Jenkins uses, which in our case is `dev`. A final change to the overall CI process is adding more plugins. Part of the usefulness of Jenkins is its plugin ecosystem, which could simplify many parts of the current flow; for example, plugins exist for Slack and GitLab that would have made implementing those parts much faster.
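For instance, with the Jenkins Slack plugin configured, the hand-rolled curl calls could be replaced by a single step. This is a sketch, assuming the plugin is installed and a workspace token is set up; the channel name is illustrative:

```groovy
post {
    failure {
        // slackSend is provided by the Jenkins Slack plugin
        slackSend channel: '#group13-ci', color: 'danger',
                  message: "Build ${env.BUILD_NUMBER} failed: ${env.BUILD_URL}"
    }
}
```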
### GitHub/GitLab Flow Model
[Github](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/github-flow)
GitHub's flow model is a simplified Git workflow focused on providing a single deployed version of the code. Work happens on `feature` branches created from `master`, each aimed at a specific capability. To merge back into `master`, a branch goes through a pull request, where other team members review the code to ensure it meets quality standards. CI can be integrated both on pull requests and on the `master` branch: code is automatically checked for correctness before being added to `master` (preventing erroneous code from landing), and `master` itself is checked to ensure no issues arise during the regular running of the code.
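The flow can be walked through with plain git commands in a throwaway local repository; the branch and file names below are illustrative, and the local merge stands in for the reviewed, CI-checked pull request:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q && git checkout -q -b master         # single long-lived deployable branch
git config user.email dev@example.com && git config user.name "Example Dev"
echo app > app.txt && git add app.txt && git commit -qm "initial commit on master"
git checkout -q -b feature/transfer-limits       # feature branch for one capability
echo limits > limits.txt && git add limits.txt && git commit -qm "add transfer limits"
# The branch would now be pushed and opened as a pull request; after review
# and a passing CI run, it is merged back into master:
git checkout -q master
git merge -q --no-ff feature/transfer-limits -m "Merge feature/transfer-limits"
```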
[GitLab](https://about.gitlab.com/blog/2014/09/29/gitlab-flow/)
GitLab's flow model is somewhat more complex than GitHub's, focusing on deployable environment branches rather than a single deployable `master` branch. For example, alongside `master` there may also be `dev` and `alpha` branches used before merging into `master`. This allows for a better deployment method when outside forces dictate when deployment happens, such as phone app stores. Under GitLab flow, all commits are tested, and no commits are made directly to the `master` branch.