# Deployment description for the Rules engine (backend)
###### tags: `Evrone`, `Rules engine`, `Jiseki Health`
#### Note: all instructions should be executed from the root of [this](https://bitbucket.org/jisekihealth/rules_back_end/src/master/) repository
## Environment variables
In all cases you should set the following environment variables (an example env file follows the list):
* **DOMAIN_NAME** - Backend domain name
* **DJANGO_SECRET_KEY** - Django secret key used for cryptographic signing (sessions, tokens, etc.). It must be kept private
* **DJANGO_SETTINGS_MODULE** - Path to Django settings module. It should be equal to `server.settings`
* **POSTGRES_DB** - Name of PostgreSQL DB for Django app
* **POSTGRES_USER** - PostgreSQL user with full privileges on the Django app database
* **POSTGRES_PASSWORD** - Password for the PostgreSQL user
* **DJANGO_DATABASE_HOST** - PostgreSQL host for connection from Django app
* **SENTRY_DSN** - Sentry key for tracking exceptions
* **AIRFLOW_HOME** - Home directory for Airflow. It should be equal to `/code/airflow`
* **AIRFLOW_DB_HOST** - Host for the Airflow DB (metastore). Ideally, it should be the same host as the PostgreSQL DB
* **AIRFLOW_DB_PORT** - Airflow DB Port
* **AIRFLOW_DB_NAME** - Airflow DB Name
* **AIRFLOW_DB_USER** - Airflow DB user with full privileges
* **AIRFLOW_DB_PASSWORD** - Airflow DB Password
* **C_FORCE_ROOT** - It should be set to `True`: Airflow pickles some task data, and Celery refuses to run as root with the pickle serializer unless this flag is set
* **REDIS_HOST** - Redis host. Redis is used as the message broker for Celery (it can be replaced with RabbitMQ or Amazon SQS)
* **REDIS_PORT** - Redis port
* **REDIS_DB** - Redis DB number (index)
* **SPEC_URL** - URL for API specification. It should be equal to `swagger.yaml`
* **AWS_ACCESS_KEY_ID** - AWS Access Key ID for the Amazon services used (AWS DynamoDB, Amazon SQS)
* **AWS_SECRET_ACCESS_KEY** - AWS Secret Key
* **AWS_DEFAULT_REGION** - AWS default region
* **AWS_RULE_EXPIRATION_TABLE_NAME** - Name of table for storing user expiration period (in AWS DynamoDB)
* **AWS_ACTION_EXECUTION_TABLE_NAME** - Name of table for storing action execution entries (in AWS DynamoDB)
* **LOCAL_CONNECTION_TO_DYNAMODB** - It should be equal to `False` in production
* **MEMBERSHIP_API_URL** - Membership API URL
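For reference, a minimal env file passed to `docker run --env-file` might look like the sketch below. It is an illustration only: every value, host name and table name is a placeholder you must replace with your own.

```bash
# Example env file for --env-file (placeholder values only - replace with your own)
DOMAIN_NAME=rules.example.com
DJANGO_SECRET_KEY=<long-random-string>
DJANGO_SETTINGS_MODULE=server.settings
POSTGRES_DB=rules
POSTGRES_USER=rules
POSTGRES_PASSWORD=<password>
DJANGO_DATABASE_HOST=<postgres-host>
SENTRY_DSN=https://<key>@sentry.io/<project-id>
AIRFLOW_HOME=/code/airflow
AIRFLOW_DB_HOST=<postgres-host>
AIRFLOW_DB_PORT=5432
AIRFLOW_DB_NAME=airflow
AIRFLOW_DB_USER=airflow
AIRFLOW_DB_PASSWORD=<password>
C_FORCE_ROOT=True
REDIS_HOST=<redis-host>
REDIS_PORT=6379
REDIS_DB=0
SPEC_URL=swagger.yaml
AWS_ACCESS_KEY_ID=<aws-access-key-id>
AWS_SECRET_ACCESS_KEY=<aws-secret-access-key>
AWS_DEFAULT_REGION=us-east-1
AWS_RULE_EXPIRATION_TABLE_NAME=<rule-expiration-table>
AWS_ACTION_EXECUTION_TABLE_NAME=<action-execution-table>
LOCAL_CONNECTION_TO_DYNAMODB=False
MEMBERSHIP_API_URL=<membership-api-url>
```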
## Single-Node Cluster
For a single-node cluster, you need machines for **PostgreSQL**, the **Web-app**, **Airflow** and **AWS DynamoDB**:
* **PostgreSQL** - any recent version
* **Web-app**:
    * *Build*: `docker build --build-arg DJANGO_ENV=production -t web-app -f ./docker/django/Dockerfile .`
    * *Run*: `docker run --env-file <PATH_TO_ENV_FILE> web-app run_web_app` - it will run the **Django web application** on a **uvicorn server** in **WSGI mode**, listening on the UNIX socket `/socket/rule.socket`
* **Airflow**:
    * You should set the following parameters in `airflow/airflow.cfg`:
        * *line 69*: `executor = LocalExecutor`
        * *line 74*: `sql_alchemy_conn = postgresql+psycopg2://${AIRFLOW_DB_USER}:${AIRFLOW_DB_PASSWORD}@${AIRFLOW_DB_HOST}:${AIRFLOW_DB_PORT}/${AIRFLOW_DB_NAME}` (change it if you use another DB)
    * *Build*: `docker build --build-arg DJANGO_ENV=production -t web-app -f ./docker/django/Dockerfile .` (the same image as the Web-app)
    * *Run*: `docker run --env-file <PATH_TO_ENV_FILE> web-app run_airflow_single_node` - it will run the **Airflow scheduler**, **Airflow webserver** and **Airflow worker** on one machine
* **AWS DynamoDB**:
    * You can set it up locally or use the AWS cloud (a local option is shown in the bring-up sketch after this list)
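Putting the single-node pieces together, the bring-up might look like the sketch below. It only combines the build and run commands listed above; the env-file path, the `-d` detach flag and the `amazon/dynamodb-local` image for a local DynamoDB are illustrative assumptions, not part of the repo's scripts.

```bash
# Build the application image once; the same image serves the Web-app and Airflow
docker build --build-arg DJANGO_ENV=production -t web-app -f ./docker/django/Dockerfile .

# Optional: local DynamoDB instead of the AWS cloud (illustrative; remember
# LOCAL_CONNECTION_TO_DYNAMODB must stay False in production)
docker run -d -p 8000:8000 amazon/dynamodb-local

# Django web application (uvicorn, UNIX socket /socket/rule.socket)
docker run -d --env-file ./production.env web-app run_web_app

# Airflow scheduler + webserver + worker on the same machine (LocalExecutor)
docker run -d --env-file ./production.env web-app run_airflow_single_node
```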
Finally, you should have something like this:

## Multi-Node Cluster
For a multi-node cluster, you need machines for **PostgreSQL**, the **Web-app**, the **Airflow Master Node**, **Airflow Workers**, a **Message broker** (Redis, RabbitMQ or Amazon SQS) and **AWS DynamoDB**:
* **PostgreSQL** - any recent version
* **Web-app**:
    * *Build*: `docker build --build-arg DJANGO_ENV=production -t web-app -f ./docker/django/Dockerfile .`
    * *Run*: `docker run --env-file <PATH_TO_ENV_FILE> web-app run_web_app` - it will run the **Django web application** on a **uvicorn server** in **WSGI mode**, listening on the UNIX socket `/socket/rule.socket`
* **Airflow**:
    * You should set the following parameters in `airflow/airflow.cfg`:
        * *line 69*: `executor = CeleryExecutor`
        * *line 74*: `sql_alchemy_conn = postgresql+psycopg2://${AIRFLOW_DB_USER}:${AIRFLOW_DB_PASSWORD}@${AIRFLOW_DB_HOST}:${AIRFLOW_DB_PORT}/${AIRFLOW_DB_NAME}` (change it if you use another DB)
        * *line 499*: `broker_url = redis://${REDIS_HOST}:${REDIS_PORT}/${REDIS_DB}` (change it if you use another broker, e.g. Amazon SQS)
        * *line 507*: `result_backend = db+postgresql://${AIRFLOW_DB_USER}:${AIRFLOW_DB_PASSWORD}@${AIRFLOW_DB_HOST}:${AIRFLOW_DB_PORT}/${AIRFLOW_DB_NAME}`
    * *Build*: `docker build --build-arg DJANGO_ENV=production -t web-app -f ./docker/django/Dockerfile .` (the same image as the Web-app)
    * *Run*:
        * *Master node*: `docker run --env-file <PATH_TO_ENV_FILE> web-app run_airflow_master_node` - it will run the **Airflow scheduler** and **Airflow webserver**
        * *Worker*: `docker run --env-file <PATH_TO_ENV_FILE> web-app run_airflow_worker` - it will run an **Airflow (Celery) worker**
* **Message broker** - you can use Redis, RabbitMQ or Amazon SQS (depending on your choice, adjust `broker_url` at line 499 of `airflow/airflow.cfg`); the bring-up sketch after this list assumes Redis
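Assuming Redis as the broker and the same image built on every machine, a multi-node bring-up might look like the sketch below; the env-file path and the split of commands across hosts are illustrative.

```bash
# On every machine: build (or pull from your registry) the same application image
docker build --build-arg DJANGO_ENV=production -t web-app -f ./docker/django/Dockerfile .

# Web-app machine
docker run -d --env-file ./production.env web-app run_web_app

# Airflow master node: scheduler + webserver (CeleryExecutor)
docker run -d --env-file ./production.env web-app run_airflow_master_node

# Each Airflow worker machine: a Celery worker consuming tasks via Redis
docker run -d --env-file ./production.env web-app run_airflow_worker
```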
Finally, you should have something like this:
