# Create a pipeline to sync upstream to downstream
### Overview

* What tool do we use? - **PAC** (Pipelines-as-Code)
* How easy are the pipelines to maintain/manage? - see [.tekton](https://github.com/praveen4g0/upstream/tree/main/.tekton)
* Where do we run our workloads?
* Do we need to manage infra to run our workloads? - **No** (we have a self-managed (PSI) cluster with OpenShift Pipelines installed)
### So what are we going to build today?
* We are going to build an e2e sync pipeline.
* When a push event happens on upstream (GitHub) **-->** we sync it to downstream (GitLab, behind the VPN)
### Prerequisites
* We need a system that can listen to events from both the public internet and the private VPN; I have chosen [smee.io](https://smee.io/).
* Make sure `openshift-pipelines` is installed on your (self-managed) infra. It also installs `pipelines-as-code`, but a very old version that does not support GitLab provider integration.
* So we are going to remove the default one the operator installs and then install the upstream stable version.
```shell
# OpenShift
kubectl patch tektonconfig config --type="merge" -p '{"spec": {"addon":{"enablePipelinesAsCode": false}}}'
kubectl apply -f https://raw.githubusercontent.com/openshift-pipelines/pipelines-as-code/stable/release.yaml
```
* Make sure the [gosmee client](https://gist.githubusercontent.com/praveen4g0/54dcdf6ad88684a908cebe4a15503ff4/raw/b925d37cc4a8ab05d29aae42729cb9f4fbcae4e0/gosmee-client.yaml) is installed on the PSI cluster
* Point the gosmee client at the pipelines-as-code-controller service:
```yaml
containers:
- name: gosmee-client
  image: 'ghcr.io/chmouel/gosmee:latest'
  args:
    - 'https://smee.io/1bwpqwUgaMkPYbRZ'
    - $(SVC)
  env:
    - name: SVC
      value: http://pipelines-as-code-controller.pipelines-as-code.svc.cluster.local:8080
```
* Make sure `tkn-pac` is installed
* Make sure we are on the right cluster
* We are going to extend [pac-demo](https://gist.github.com/praveen4g0/8ee4b74109e242bb70be25ce7de39cb0) now.
* And, optionally, we need an uploader service like [this](http://uploader-devtools-gitops-services.apps.ocp-c1.prod.psi.redhat.com/) in order to store test/build artifacts persistently.
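Pipelines-as-Code also needs a `Repository` custom resource so the controller knows which namespace the PipelineRuns for a repo should run in. A minimal sketch, assuming a hypothetical `sync-pipelines` namespace (the URL matches the upstream repo linked above; for the GitLab webhook flow a `git_provider` section referencing a token secret would also be needed):

```yaml
# Hypothetical PAC Repository CR; the namespace name is an assumption.
apiVersion: pipelinesascode.tekton.dev/v1alpha1
kind: Repository
metadata:
  name: upstream
  namespace: sync-pipelines
spec:
  url: https://github.com/praveen4g0/upstream
```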
---
### Pull-request pipeline

When a user pushes changes to the `dev` branch, it triggers the pull-request pipeline, which does the following:
1. `fetch-repository` **-->** fetches the latest commits pushed to the `dev` branch.
2. `run-lint` **-->** runs go-lint checks.
3. `build-and-push` **-->** builds the application and pushes the image to a registry (internal/external).
4. `image-scan` **-->** scans the built image for vulnerabilities using [RHACS](https://docs.google.com/presentation/d/1fKq5uqJnUO4q_hzaqoWBZxrGIoEIIdgdIxjCpRksrh4/edit#slide=id.gdf6fd387a8_0_0) (installed on the infra where we run our workloads).
5. `finally` **-->** sends a Slack notification.
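The step ordering above can be sketched in the PipelineRun spec with `runAfter`. A minimal sketch, assuming Tekton Hub task names for the first three steps (`git-clone`, `golangci-lint`, `buildah`); the scan and notification task names are placeholders:

```yaml
# Illustrative task graph for the pull-request pipeline; task names are assumptions.
pipelineSpec:
  tasks:
    - name: fetch-repository
      taskRef:
        name: git-clone
    - name: run-lint
      runAfter: [fetch-repository]
      taskRef:
        name: golangci-lint
    - name: build-and-push
      runAfter: [run-lint]
      taskRef:
        name: buildah
    - name: image-scan
      runAfter: [build-and-push]
      taskRef:
        name: rhacs-image-check   # placeholder name
  finally:
    - name: notify-slack          # placeholder name
      taskRef:
        name: send-to-webhook-slack
```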
---
### Push pipeline

* When a user merges `dev` changes into the `main` branch of upstream, it triggers the push pipeline. This syncs the upstream changes merged to `main` with the downstream-specific patches maintained [here](https://gitlab.cee.redhat.com/pthangad/downstream), applies the downstream patches, and automatically raises a pull request against the `release-next` branch.
* We create branches named `release-next-ci-$SHA`, which trigger a separate downstream pull-request pipeline on every change made in upstream `main`; the branch is deleted automatically once the pipeline succeeds:
```shell
git push -o merge_request.create \
-o merge_request.target=$(params.branch) \
-o merge_request.remove_source_branch=true \
-o merge_request.title="[$(params.branch)] Update upstream sources [${sha}]" \
-o merge_request.label="bot" \
-o merge_request.merge_when_pipeline_succeeds \
${OPENSHIFT_REMOTE} $(params.branch)-ci-${sha}
```
* If we run the sync pipeline again and the expected branch already exists remotely, we skip the sync (ignoring duplicates or empty commits on upstream `main`).
1. `sync-upstream-to-downstream` **-->** syncs upstream `main` to the downstream `release-next` branch by applying the downstream-specific patches maintained in the downstream `main` branch.
2. `finally` **-->** sends a Slack notification.
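The duplicate-branch guard can be sketched with `git ls-remote`; the helper name and the usage snippet below are illustrative, not the pipeline's actual task:

```shell
#!/usr/bin/env bash
# Hypothetical guard: returns success when the branch already exists on the
# given remote, so the sync step can exit early (duplicate or empty commit).
branch_exists() {
  local remote=$1 branch=$2
  # --exit-code makes git return non-zero when no matching head is found
  git ls-remote --exit-code --heads "$remote" "$branch" >/dev/null 2>&1
}

# Illustrative use inside the sync step:
# if branch_exists "${OPENSHIFT_REMOTE}" "release-next-ci-${sha}"; then
#   echo "Branch already exists downstream, skipping sync"
#   exit 0
# fi
```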
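The `finally` notification can be sketched as a small shell step posting to a Slack incoming webhook; `SLACK_WEBHOOK_URL` and the message format are assumptions, not the actual task definition:

```shell
#!/usr/bin/env bash
# Hypothetical Slack notification step; the webhook URL is assumed to be
# injected from a secret as SLACK_WEBHOOK_URL.
slack_payload() {
  local status=$1 pipeline=$2
  printf '{"text": "Pipeline %s finished with status: %s"}' "$pipeline" "$status"
}

# Illustrative use in the finally task (PIPELINE_STATUS is a placeholder
# for whatever status variable the task exposes):
# curl -s -X POST -H 'Content-type: application/json' \
#   --data "$(slack_payload "${PIPELINE_STATUS}" sync-upstream-to-downstream)" \
#   "${SLACK_WEBHOOK_URL}"
```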
### Watch Slack for notifications [here](https://coreos.slack.com/archives/C0423RS62JX)
---
### You should know
* OpenShift Pipelines [here](https://docs.openshift.com/container-platform/4.8/cicd/pipelines/understanding-openshift-pipelines.html)
* Pipelines-as-Code [here](https://pipelinesascode.com/)
* Red Hat Advanced Cluster Security for Kubernetes [here](https://www.redhat.com/en/technologies/cloud-computing/openshift/advanced-cluster-security-kubernetes)
---
### Q&A