# Ads docs design on Airflow DAG

###### tags: `Airflow` `Cluster`

## DAG: `ads_gen_latest_docs`

- `train.analyze.analyze.FlowGenAdDocPiece`
- schedule: weekly? (to be decided)
- output:

![](https://i.imgur.com/5QQYsL9.png)

```
# Content of result.json
{
    "sil_score": ...,
    "prediction": [0, -1, ...],
    "df_result": { ... }
}
```

![](https://i.imgur.com/zq4bP7P.png)

## DAG: `ads_cont_eval`

- `train.analyze.analyze.FlowDoPPLOnClusteredDocs`
- schedule: biweekly? (to be decided)
- **Training runs every Friday ...**
- output:

![](https://i.imgur.com/sI0g5ny.png)

Content:

![](https://i.imgur.com/ZPAXE2A.png)

## Build the image locally

There are two ways to build the Docker image:

1. Run `./{project}/gitlab.sh`, which mocks the GitLab CI/CD workflow to build the Docker image.
2. Run `make build_train`. Compared to no. 1, this is not the regular way to build the image, because it may pick up local files that are not in Git.

The command below is what no. 1 runs under the hood. Note that the production `--env` flags are kept as comments outside the command: a `#` line in the middle of a backslash-continued command breaks the continuation, so they cannot stay inline.

```shell
# Production values (swap in as needed):
# --env PRD_MODEL_ID=20210618
# --env GCP_PROJECT_ID=ainotam-production-1
# --env PREDICT_BOTH_IMAGE_NAME=gpt2-chinese-predict-both
# --env GCP_SERVICE_KEY=/tmp/service_account.json
# --env GCP_BQ_SERVICE_KEY=/tmp/service_account_bq.json
# --env PROJECT_SERVICE_ACCOUNT="gpt2-chinese-production@ainotam-production-1.iam.gserviceaccount.com"

gitlab-runner exec docker staging-build-train \
    --docker-privileged \
    --docker-volumes '/home/jupyter/gpt2-chinese/tmp:/tmp' \
    --env DEV_MODEL_ID=20210722 \
    --env GCP_DEV_PROJECT_ID=ai4ad-dev \
    --env GCP_DEV_SERVICE_KEY=/tmp/service_account.json \
    --env PROJECT_DEV_SERVICE_ACCOUNT="gpt2-chinese-stg@ai4ad-dev.iam.gserviceaccount.com" \
    --env TRAIN_IMAGE_NAME=gpt2-chinese-train \
    --env CI_DEBUG_TRACE=true \
    --env CI_COMMIT_SHORT_SHA="$(git rev-parse --short HEAD)" \
    --env CI_COMMIT_TAG="manual" \
    --pre-clone-script ""
```

## Run Airflow with Docker

Run this command in the `loupe-airflow` folder:

`docker-compose -f {project}/docker-compose.local.yml up -d`

## Repo & branch

- GPT2-Chinese: `feature/model_info`
- loupe-airflow: `feature/model_eval`
![](https://i.imgur.com/QnyRn7P.png)
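As a reference, the `result.json` shown under `ads_gen_latest_docs` can be sanity-checked with a short stdlib-only script. This is only a sketch: the field names follow the output above, but the values are made up, `summarize` is a hypothetical helper, and the label `-1` is assumed to mark noise/unclustered docs.

```python
import json

# Illustrative result.json payload; real values come from the DAG output.
sample = json.loads("""
{
  "sil_score": 0.42,
  "prediction": [0, 0, 1, -1, 1, -1],
  "df_result": {}
}
""")

def summarize(result):
    """Summarize a clustering result: doc count, cluster count, noise count."""
    labels = result["prediction"]
    clusters = {label for label in labels if label != -1}
    return {
        "sil_score": result["sil_score"],
        "n_docs": len(labels),
        "n_clusters": len(clusters),
        "n_noise": labels.count(-1),
    }

print(summarize(sample))
```

A check like this is a cheap way to spot an empty or degenerate clustering run (e.g. everything labeled `-1`) before inspecting `df_result`.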