# 04.5-Security and DevOps Lesson 5: CI/CD

###### tags: `Udacity`

# 01 CI/CD Introduction

**What is DevOps?**

DevOps is the combination of industry best practices and tools that improves an organization's ability to:

- Increase the speed of software delivery
- Increase the speed of software evolution
- Improve the reliability of the software
- Scale using automation
- Improve collaboration among teams

**In other words, these tools enable a company to apply industry best practices in software development.**

In the DevOps model, development and operations teams are merged into a single team. These DevOps teams use a few tools and best practices to accomplish their goals efficiently. Some of these best practices are:

- Continuous Integration / Continuous Delivery (CI/CD)
- Microservices
- Infrastructure as Code (IaC)
- Configuration Management and Policy as Code
- Monitoring and Logging
- Communication and Collaboration

> For the history and evolution of Infrastructure as Code, you can check this [reference](https://medium.com/kkstream/infrastructure-as-code-%E5%A6%82%E4%BD%95%E6%94%B9%E5%96%84%E6%88%91%E5%80%91%E7%9A%84%E7%94%9F%E6%B4%BB%E5%93%81%E8%B3%AA-ee11e9d67b71); I think it explains it very well.

**What is CI/CD?**

CI/CD stands for Continuous Integration/Continuous Delivery, and it is essentially a consistent and automated way for a DevOps team to build, package, and test applications.

- Continuous Integration means newly developed code changes of a project are regularly built, tested, and merged into a shared repository such as Git.
- Continuous Delivery is the process of automating the release of the merged and validated code to a repository, and finally releasing a production-ready build to the production environment.
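To make the Continuous Integration half concrete, here is a minimal Git sketch of a short-lived feature branch being merged back into the shared branch. The repository, branch, and file names are made up for illustration; a CI server such as Jenkins would then pick up the merge commit and build and test it automatically.

```shell
# Illustrative only: simulate the "merge to a shared repository" step of CI.
cd /tmp && rm -rf ci-demo
git init -q ci-demo && cd ci-demo
git config user.email "dev@example.com"   # local identity for the demo commits
git config user.name "Dev"
git checkout -qb main                     # name the shared branch explicitly
echo "v1" > app.txt
git add app.txt
git commit -qm "initial commit"
git checkout -qb feature/login            # short-lived feature branch
echo "login feature" >> app.txt
git commit -qam "add login feature"
git checkout -q main                      # switch back to the shared branch
git merge -q feature/login                # integrate the change
git log --oneline                         # both commits are now on main
```

In a real project, the merge would typically happen through a pull request, and the CI pipeline would run on every push to the shared branch.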
> A CI/CD pipeline under the DevOps model, showing quick delivery and evolution of software

![](https://i.imgur.com/Ddr09Y9.png)

> For an explanation of CI/CD, you can [refer here](https://samkuo.me/post/2013/10/continuous-integration-deployment-delivery/); it is very clear.

**Course Outline**

In this lesson, you will learn about the various technologies that aid the process of CI/CD. The objectives of this lesson are to:

- Create a Git repository on GitHub
- Introduce Docker, containers, and images
- Briefly cover creating instances on AWS
- Configure a Jenkins server to automate the CI/CD pipeline

Each of the above-mentioned technologies plays an essential role in a different phase of CI/CD. We will use all of them to set up a CI/CD pipeline on an AWS instance.

In this lesson, we will use a few third-party tools, such as AWS, Docker, and Jenkins. You will have to bear with us and learn these tools from their official documentation, though we will demonstrate their usage in our videos.

{%youtube 7xIcv9ORvWs%}

# 02 Git

**Git**

Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.

{%youtube j4v7SkHFlko%}

In this course, we assume that you're already pretty familiar with both Git and GitHub, but if you feel shaky on these skills (or just want to learn more), you can check out this free course:

- [Version Control with Git](https://www.udacity.com/course/version-control-with-git--ud123)

**Git Branching**

{%youtube kYiI4sTGyUw%}

# 03 Docker

**Docker**

**Containers**

OS-level virtualization allows us to run multiple isolated user-space instances in parallel. A "container" (or Docker container) is the isolated user-space instance that has the application code, the required dependencies, and the necessary runtime environment to run the application. Containers can run on heterogeneous platforms.
**Benefits of Containers**

Docker images make it easier for developers to create, deploy, and run applications on different hardware and platforms, quickly and easily. Docker has become an essential tool in the CI/CD pipeline, as it gives software developers a consistent and automated way to build, package, and test applications. Containers share a single kernel and can share application libraries, so they incur lower system overhead than Virtual Machines. Refer to the [Docker documentation](https://docs.docker.com/) for more information.

{%youtube E2QjRCyCo8c%}

**Installing and Using Docker**

We will deploy our application on a Virtual Machine (VM) on the Amazon Web Services (AWS) cloud. The VM we will use is a Linux machine with the Docker package already available on it. The following commands install Docker on a Linux machine:

```shell
# download and install Docker
sudo yum install docker

# add the current user to the docker group
sudo usermod -a -G docker $USER

# start the Docker service
sudo service docker start
```

Alternatively, if you wish to install Docker locally, you can refer to the official ["Download and Install"](https://docs.docker.com/desktop/) section.

The following are basic commands used with Docker:

- `docker build .` will run the Dockerfile to create an image. A Dockerfile is a text file that contains commands as a step-by-step recipe on how to build up your image. In our case, we will not use a Dockerfile, because we will use the pre-created `jenkinsci/blueocean` image to instantiate a container. For more details about Dockerfiles, refer to the [Build and run your image page](https://docs.docker.com/get-started/part2/).
- `docker images` will print all the available images
- `docker run {IMAGE_ID}` will run a container with the image
- `docker exec -it {CONTAINER_ID} sh` will attach a shell to a running container
- `docker ps` will print all the running containers
- `docker kill {CONTAINER_ID}` will terminate the container

There are many more commands that are beyond the scope of this exercise; however, you can refer to this [Reference manual](https://docs.docker.com/engine/reference/commandline/docker/) anytime. We will see the instructor using a few of these commands in the final consolidated demonstration on the "Bringing it all Together" page.

**Key Terms - Docker**

This is additional learning so that you stay aware of key Docker terminology.

![](https://i.imgur.com/6TfHE4q.png)

# 04 AWS

**Amazon Web Services (AWS) - Introduction**

For our eCommerce application, we recommend deploying your code on cloud infrastructure, because cloud infrastructure (servers, storage, networks, supporting applications, and services) is auto-scalable, broadly accessible, and metered on a "charge-per-use" basis. Amazon Web Services (AWS) is a cloud service provider. At this point, we may think of a "cloud" as a geographically distributed set of data centers that host Virtual Machines (VMs) from different users.

AWS has divided the world into several geographical regions. Each region has many availability zones, which in turn comprise one or more data centers. Each data center has hundreds of servers, each of which dynamically hosts thousands of VMs.

{%youtube 3PK5Nv4Fe5w%}

**AWS Setup Instructions**

1. Create an AWS Account - Open a free-tier AWS account (if you don't already have one) following the instructions in the [Amazon Web Service Help Center](https://aws.amazon.com/premiumsupport/knowledge-center/create-and-activate-aws-account/)
2. Learn to Launch, and Connect to, your Elastic Compute Cloud (EC2) Instances - EC2 instances are virtual machines (VMs) with a user-defined configuration.
Try launching a sample EC2 instance by following the instructions given in the official documentation. You can launch an instance in 7 simple steps:

- Choose an Amazon Machine Image (AMI) - An AMI is a template used to create a VM. An AMI contains the pre-installed operating system, application server, and applications required to launch your instance. In this lesson, we will use the Amazon Linux AMI 2018.03 while launching the instance.
- Choose an Instance Type - An instance type offers a particular combination of CPUs, memory (GB), storage (GB), network performance, and IPv6 support. AWS offers a variety of instance types, broadly grouped into 5 categories. You can choose any type supported by a free-tier account.
- Configure Instance Details - Provide the instance count and configuration details, such as network, subnet, behavior, monitoring, etc.
- Add Storage - You can choose to attach either an SSD or a Standard Magnetic drive to your instance.
- Add Tags - A tag serves as a label that you can attach to multiple AWS resources, such as volumes, instances, or both.
- Configure Security Group - Attach a set of firewall rules to your instance(s) that controls the incoming traffic to your instance(s).
- Review - Review your instance launch details before the launch.

**Additional Knowledge - AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy**

As part of our project, we will be using Jenkins Pipeline, but we want you to be aware of alternative pipeline options. AWS offers a CI/CD pipeline service named CodePipeline. This service can be used to model, visualize, and automate the steps required to release software. AWS CodePipeline enables us to increase the speed and quality of our development, and it runs a set of quality checks to ensure consistency. AWS CodeBuild and AWS CodeDeploy are the two most relevant companion services for DevOps.
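As an optional aside, the seven console steps above can also be expressed as a single AWS CLI call. This is only a sketch: the AMI ID, key pair name, and security group ID below are placeholders, not values from this lesson, and the command requires configured AWS credentials to run.

```shell
# Sketch of launching an EC2 instance from the command line (not run in this
# lesson). All IDs below are placeholders for your own values.
aws ec2 run-instances \
  --image-id ami-xxxxxxxx \
  --instance-type t2.micro \
  --key-name MyKeyPair \
  --security-group-ids sg-xxxxxxxx \
  --count 1 \
  --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=eCommerce-demo}]'
```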
[AWS](https://aws.amazon.com/) offers a plethora of other services, and you can see an optional implementation guide to [Set Up a CI/CD Pipeline on AWS](https://aws.amazon.com/getting-started/projects/set-up-ci-cd-pipeline/?trk=gs_card). We will not be following the exact same steps to deploy our eCommerce application; however, this extra knowledge helps visualize the deployment of large-scale enterprise applications.

# 05 Connect to a Linux/Ubuntu based EC2 Instance using SSH/PuTTY

You can connect to your Linux/Ubuntu EC2 instance using the authentication key generated by AWS. The steps for Mac/Linux/Windows users are given below.

**Linux/Mac users**

Unix/Linux/Mac users can log into an EC2 instance using a Secure Shell (SSH) client. The steps are:

1. Open an SSH client.
2. Locate the private key .pem file on your local machine, and restrict its permissions so that only you can read it (SSH requires this), using the command `chmod 400 <path of private key .pem file locally>`
3. The default username for Ubuntu VMs is `ubuntu`, and for Amazon Linux it is `ec2-user`. The list of default usernames is available [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/connection-prereqs.html). Go to your AWS dashboard, and note the public IP address or public DNS of your EC2 instance. See the snapshots below to determine the public IP and DNS of your EC2 instance.
![](https://i.imgur.com/Uim8ppB.png)
![](https://i.imgur.com/JVifvT9.png)
4. Connect to your instance using its public IP address, with `ssh <username>@<IP address> -i <path of private key .pem file locally>` in your terminal. Alternatively, you can use the public DNS, as in `ssh -i <path of private key .pem file locally> <username>@<public DNS>`. A successful login will show you a prompt as in the snapshot below:
![](https://i.imgur.com/fNxLJQM.png)

**Windows users**

Windows users can log in using PuTTY. The steps are:

1. Download and install the PuTTY and PuTTYgen utilities on your machine.
Go [here](https://www.chiark.greenend.org.uk/~sgtatham/putty/latest.html) to download the utilities.
![](https://i.imgur.com/2jjrdGx.png)
2. You will need your PuTTY Private Key (.ppk) file locally. For this, use the PuTTYgen utility to convert the .pem file to .ppk format.
- Run PuTTYgen on your machine.
- Click Load, as shown in the snapshot below.
- Browse and load the .pem key into PuTTYgen, and then click on "Save private key" to save the key in .ppk format without a passphrase.
![](https://i.imgur.com/2GeHMZT.png)
![](https://i.imgur.com/Y6U3xjv.png)
3. Run PuTTY to connect to the EC2 instance.
4. Get the public DNS / IP address of your EC2 instance (host) from your AWS EC2 dashboard. Then, enter the public DNS / IP address of your EC2 instance (host) into PuTTY. See the snapshot below.
![](https://i.imgur.com/pYYmJAw.png)
5. Click on Connection → Data to enter the default username: `ubuntu` for Ubuntu, or `ec2-user` for Amazon Linux based OS. The list of default usernames is available [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/connection-prereqs.html)
6. Click on Connection → SSH → Auth to upload the .ppk file. See the snapshot below:
![](https://i.imgur.com/vNir2yC.png)
7. If everything is set up correctly, a terminal window will open up and log you into your Linux/Ubuntu AMI based EC2 instance.

# 06 Monitoring your AWS costs and credits

**Monitoring your AWS costs**

All AWS services are pay-as-you-go, so we urge our students to closely monitor their usage costs and check that they have adequate credits available to complete their project/task. Follow the instructions below to do that:

Step 1. Log into your [AWS account](https://console.aws.amazon.com/).

Step 2.
Examine your costs. Go to https://console.aws.amazon.com/billing/ and you should see the billing dashboard showing your costs.

![](https://i.imgur.com/ihvIvwS.png)

Step 3 (optional). Check the value of your credits. Click on "Credits" in the left side menu, and the following screen will show your available credits.

![](https://i.imgur.com/2YVrmmg.png)
![](https://i.imgur.com/bN9Spdw.png)

# 07 Jenkins

**Jenkins - Introduction**

Jenkins is an open-source automation server written in the Java programming language. Jenkins helps automate aspects of building, testing, and delivering or deploying software. There are many other frameworks available in the market, as listed [here](https://en.wikipedia.org/wiki/Comparison_of_continuous_integration_software).

In our case, we need to integrate Jenkins with AWS to trigger the CI/CD pipeline automatically whenever code changes are pushed to our GitHub repository. This automation requires us to link our Git repository to Jenkins. More information about Jenkins is available [here](https://www.jenkins.io/zh/doc/#doc/pipeline/tour/getting-started#).

Refer to the diagram below to understand where Jenkins fits into the overall system.

![](https://i.imgur.com/5joWPhQ.png)

> A CI/CD pipeline showing the role of the Jenkins automation server across three different environments - Development, Staging, and Deployment. Ansible is useful mainly in large-scale applications.

{%youtube FINqfPE5C0E%}

**Jenkins Pipeline**

Jenkins Pipeline is a suite of plugins that assists in achieving continuous delivery. Jenkins is highly modular and supports a multitude of plugins; plugins extend Jenkins with additional features to support various requirements.

**Prerequisite to Install Jenkins**

You can install Jenkins using native system packages, using Docker, or on any machine with a Java Runtime Environment (JRE) available on it.
But in our case, Jenkins will be installed automatically on an AWS EC2 instance when we instantiate a container from the pre-created image [jenkinsci/blueocean](https://hub.docker.com/r/jenkinsci/blueocean/) by running the `docker run` command in our demo. Once the container is instantiated, we will access the EC2 instance from the browser to configure the Jenkins Pipeline. You can refer to the "Logging into Jenkins (GUI) installed on EC2 Instance" section below, which we will demonstrate in the video on the next page, "Bringing it all together".

**Additional Knowledge - What is a `Jenkinsfile`?**

A Jenkins Pipeline is defined using a text file called a `Jenkinsfile`. This text file is stored in the application's source control repository, e.g., GitHub. Storing the `Jenkinsfile` in a source control repository makes it possible to review and audit it collaboratively. A `Jenkinsfile` can be written using either of two types of syntax - Declarative and Scripted. The following is an example Declarative Pipeline:

```groovy
// Jenkinsfile (Declarative Pipeline)
pipeline {
    agent {
        docker { image 'maven:3.3.3' }
    }
    stages {
        stage('build') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}
```

**Additional Knowledge - Install Jenkins on a Linux Instance**

The following are the steps to install Jenkins, as given in the official reference manual for installing Jenkins on Linux machines, if you wish to do it locally.

```shell
# Step 1 - Update existing packages
sudo apt-get update

# Step 2 - Install Java
sudo apt install -y default-jdk

# Step 3 - Download the Jenkins package.
# You can go to http://pkg.jenkins.io/debian/ to see the available commands

# First, add the repository key to your system
wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -

# Step 4 - Add the following entry in your /etc/apt/sources.list:
sudo sh -c 'echo deb https://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'

# Step 5 - Update your local package index
sudo apt-get update

# Step 6 - Install Jenkins
sudo apt-get install -y jenkins

# Step 7 - Start the Jenkins server
sudo systemctl start jenkins

# Step 8 - Enable the service to load during boot, and check its status
sudo systemctl enable jenkins
sudo systemctl status jenkins
```

The `sudo` command allows us to run a command as root. The `apt` utility is for installing software: it is a package manager and performs dependency resolution to install the supporting libraries for the end user. A package manager facilitates easy installation, upgrading, and management of software on a computer. The `-y` option for `apt` automatically selects the "yes" option to install the software and prevents prompting of the end user. You will observe that the step which adds `jenkins.list` registers an additional software repository source from which our Linux computer will be able to install software. This is referred to as an Apt package repository.

- `sudo` means running the command as root, since root is the administrator account with the highest privileges on a Linux system
- `apt` is a tool dedicated to installing software ([reference](https://dreamtails.pixnet.net/blog/post/24821908#:~:text=%E6%AD%A5%E9%A9%9F%EF%BC%9A(%E5%89%8D%E6%8F%90%E6%98%AF%E6%82%A8%E7%9A%84,%E5%99%A8%E7%9A%84%E5%A5%97%E4%BB%B6%E6%AA%94%E6%A1%88%E6%B8%85%E5%96%AE%E3%80%82))
- `-y` lets `apt` automatically choose the "yes" option when installing software, so it does not keep showing prompts

**Logging into Jenkins (GUI) installed on EC2 Instance**

These steps will help you follow the consolidated video on the next page.

- Go to the AWS dashboard and copy the public IP address of your Linux EC2 instance.
- Paste the public IP address into your browser, appended with the `:8080` port.
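Optionally, before switching to the browser, you can check from any terminal that Jenkins is answering on port 8080. This is only a sketch; replace `<public-IP>` with your instance's actual public IP address, and make sure your security group allows inbound traffic on port 8080.

```shell
# Prints the HTTP status code Jenkins returns; any response at all means
# the container is up and reachable on port 8080.
curl -s -o /dev/null -w '%{http_code}\n' http://<public-IP>:8080
```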
The first time you visit, it will open the Jenkins GUI as shown in the snapshot below:

> Jenkins saves the initial administrator password in a file on the server

![](https://i.imgur.com/sNRsy9y.png)

- On the terminal where you are connected to the Linux EC2 instance and have entered a shell into Jenkins using `docker exec -it jenkins bash`, view the content of the file using the command `sudo cat <path copied in the previous step>`. It will show the default administrator password. Copy and use this password in the GUI (browser) to log in for the first time.
- After a successful login, you may choose to install the default plugins, though we will learn to use specific plugins for our needs in the next lesson. See the snapshot below.
![](https://i.imgur.com/KKkDh0W.png)
- If you choose to install the suggested plugins, the following plugins will get installed. See the snapshot below:
![](https://i.imgur.com/Bt2zokI.png)
- Set up the user credentials. See the snapshot below:
![](https://i.imgur.com/YfUCERA.png)
- Next, it will show you a success message and take you to the Jenkins dashboard.
![](https://i.imgur.com/5bPZac9.png)
![](https://i.imgur.com/QPLxi42.png)

**Recommended Read**

Familiarize yourself with the Jenkins Pipeline implementation by following along with the guided tutorials below, in order to have a better understanding of the demonstrations in the upcoming videos.

- [Creating your first Pipeline](https://www.jenkins.io/doc/pipeline/tour/hello-world/)
- [Build a Java app with Maven](https://www.jenkins.io/doc/tutorials/build-a-java-app-with-maven/#stopping-and-restarting-jenkins)

# 08 Bringing it all Together

**Bringing it all together**

The following steps are demonstrated in the video that follows:

1. Launch an AWS EC2 instance built from the `Amazon Linux AMI 2018.03` AMI. This EC2 instance will host Docker and the Jenkins container.
2. Connect to the EC2 instance using SSH (for Linux/Mac users) or PuTTY (for Windows users). In the EC2 instance, do the following:
    1. Install Docker for Linux
    2.
Add the current user to the `docker` group
    3. Reboot the EC2 instance
3. Connect to the EC2 instance again, as done in the previous step, and:
    1. Start the Docker service
    2. Use the pre-created `jenkinsci/blueocean` image to instantiate a container. This container has Jenkins already installed on it.
    3. Work in a shell inside the Jenkins container to install Maven and generate an SSH key pair (private/public)
4. Log into the Jenkins GUI, on the EC2 instance, from your local computer's browser.
    1. Configure the Jenkins global credentials
5. Go to your GitHub repository.
    1. Add the public key, by going to Settings → Deploy Keys in your repo.
6. In your Jenkins GUI, create a new job to trigger a build whenever there is a change in your GitHub repo. This step requires you to:
    1. Link your GitHub repo URL to Jenkins.
    2. Add the path of the `pom.xml` available in your GitHub repo to the build configuration of Jenkins.
    3. Also, add the goals in the build configuration.
    4. Finally, click on "Build Now" in the Jenkins GUI.

{%youtube fNB2EFivAwY%}

The commands used in the video above are:

```shell
# update the existing packages
sudo yum update

# download and install Docker
sudo yum install docker

# add the current user to the docker group
sudo usermod -a -G docker $USER

# reboot the instance
sudo reboot
```

Close the connection to the EC2 instance, and connect back again using SSH/PuTTY. Run the following commands to start the Docker service and instantiate a container from the pre-created `jenkinsci/blueocean` image with the `docker run` command:

```shell
# start the Docker service
sudo service docker start

# Run the jenkinsci/blueocean image as a container
docker run --rm -u root -d --name jenkins -p 8080:8080 -v jenkins-data:/var/jenkins_home -v /var/run/docker.sock:/var/run/docker.sock jenkinsci/blueocean
```

If you ever need to, you can stop the Docker service anytime using `sudo service docker stop`.
Next, work in a shell inside the Jenkins container using the following commands:

```shell
# open a shell into the Jenkins container
docker exec -it jenkins bash

# install Maven
apk add maven

# generate an SSH key pair
ssh-keygen -t rsa

# view the private key
cat /root/.ssh/id_rsa
```

The next step is to log in to Jenkins using the GUI (browser), as mentioned on the previous page under the "Logging into Jenkins (GUI) installed on EC2 Instance" section. You'd have to go to the AWS dashboard to copy the public IP address of your Linux EC2 instance, and then paste the public IP address into your browser, appended with the `:8080` port.

```shell
# view the initial admin password used to unlock Jenkins in the GUI
# (run inside the Jenkins container shell)
cat /var/jenkins_home/secrets/initialAdminPassword
```

Copy the admin password from the Jenkins container shell (on the EC2 instance) and paste it into the Jenkins GUI (browser). Later in the demo, you will need to run the following command as well:

```shell
# view the public key stored in the Jenkins container
cat /root/.ssh/id_rsa.pub
```

{%youtube -4gG1POAdZA%}

If you're interested in going further and having your build deployed to an environment, you can find a great description of how to do so in this [white paper](https://d1.awsstatic.com/whitepapers/AWS_Blue_Green_Deployments.pdf) from AWS.

**Conclusion**

In this lesson, you have learned about the various technologies that support the process of CI/CD. We have covered the following objectives:

- Create a Git repository on GitHub
- Introduce Docker, containers, and images
- Briefly cover creating instances on AWS
- Configure the Jenkins server to automate the CI/CD pipeline

We have used all of them to set up a CI/CD pipeline on an AWS instance.

# 09 Course Recap

**Course Recap**

Congratulations on getting to the end of the course!
In making it this far, you've covered a lot of new skills, including:

- How to secure enterprise applications using proper hashing and salting
- How to use JWTs for authorization and authentication
- The importance of different types of testing
- How to use unit testing via JUnit
- The importance and usefulness of mocking with Mockito
- How to log effectively
- How to use logging information to create dashboards and visualizations in Splunk to analyze, debug, and diagnose your application
- And finally, how to tie this all together by creating a CI/CD pipeline with Jenkins, and then deploying the result using AWS and Docker

These are key skills for modern Java application development. Next, you'll have the opportunity to demonstrate these new skills by completing the eCommerce application and submitting it as your final project in the course.