# Setting up a cloud VM for Overlay's risk codes

This guide walks through setting up a VM instance to run Overlay's risk-related code. As a prerequisite, access to a VM instance with e.g. 2 vCPUs and 8 GB RAM is needed. We assume a [Google Cloud Compute](https://cloud.google.com/compute/) account is available and that the user has created an Ubuntu 20.04 VM instance, as seen in the screenshots below.

![](https://i.imgur.com/bM2NMID.png)

![](https://i.imgur.com/Fe0mG7n.png)

To access the VM, an SSH keypair can be generated via `ssh-keygen -t rsa -b 4096`. The public key should then be copied into the VM's security settings.

![](https://i.imgur.com/mQxW7eZ.png)

After creating and initializing the instance, connect via a terminal using `ssh user@IPADDRESS`, where IPADDRESS is the VM's IP address and the username `user` was chosen during SSH keypair generation.

### Python and poetry installation

Upon successful connection, the first step is to install Python 3.9 and related libraries:

``` bash
sudo apt install software-properties-common
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.9 python3.9-venv python3.9-dev cython
```

Then we need to install poetry:

``` bash
curl -sSL https://install.python-poetry.org | python3.9 -
```

Make sure to add poetry to your PATH: `export PATH="/home/user/.local/bin:$PATH"`

### Ganache installation

Next, we install nvm and use it to install the latest Node.js:

``` bash
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
```

Then start a new shell, e.g. via `bash`, and install Node.js using:

``` bash
nvm install node
```

Ganache can now be installed:

``` bash
npm install -g ganache
```

### InfluxDB installation

We also need to install InfluxDB:

``` bash
wget -qO- https://repos.influxdata.com/influxdb.key | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/influxdb.gpg > /dev/null
export DISTRIB_ID=$(lsb_release -si); export DISTRIB_CODENAME=$(lsb_release -sc)
echo "deb [signed-by=/etc/apt/trusted.gpg.d/influxdb.gpg] https://repos.influxdata.com/${DISTRIB_ID,,} ${DISTRIB_CODENAME} stable" | sudo tee /etc/apt/sources.list.d/influxdb.list > /dev/null
sudo apt-get update && sudo apt-get install influxdb2
```

Then the InfluxDB service can be started:

``` bash
sudo service influxdb start
```

and the database can be set up via `influx setup`, as seen in the screenshot below (make sure to use "ethereum" for the organization and "ovl_sushi" for the bucket name).

![](https://i.imgur.com/QvpJIhN.png)

To connect to the database, a token is required, which can be retrieved using the command `influx auth list` (the token will be an 88-character string ending with "=="). Using it, set the following environment variables (fill in the token value):

``` bash
export INFLUXDB_TOKEN=
export INFLUXDB_ORG=ethereum
export INFLUXDB_URL=http://localhost:8086/
```

### Etherscan and Infura API keys

We need to set two more environment variables for the [Etherscan API](https://etherscan.io/myapikey) and the [Infura API](https://infura.io/). These can be retrieved as seen in the screenshots below.

![](https://i.imgur.com/mV7qauv.png)

![](https://i.imgur.com/fAUwhJQ.png)

Replace the values with your own API keys:

``` bash
export ETHERSCAN_TOKEN=
export WEB3_INFURA_PROJECT_ID=
```
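Before moving on, it can help to confirm that all of the environment variables above are set and that InfluxDB is reachable. The snippet below is an illustrative sketch, not part of the overlay-risk repo; it assumes the `influxdb-client` Python package is installed (`pip install influxdb-client`).

``` python
# Sanity check (illustrative only): verify env vars and InfluxDB connectivity.
import os

from influxdb_client import InfluxDBClient  # assumes `pip install influxdb-client`

required = [
    "INFLUXDB_TOKEN",
    "INFLUXDB_ORG",
    "INFLUXDB_URL",
    "ETHERSCAN_TOKEN",
    "WEB3_INFURA_PROJECT_ID",
]
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")

# ping() returns True if the local InfluxDB instance responds
client = InfluxDBClient(
    url=os.environ["INFLUXDB_URL"],
    token=os.environ["INFLUXDB_TOKEN"],
    org=os.environ["INFLUXDB_ORG"],
)
print("InfluxDB reachable:", client.ping())
```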
### Overlay risk codes installation

Finally, we clone the overlay-risk repo and install it with poetry:

``` bash
git clone https://github.com/overlay-market/overlay-risk.git
cd overlay-risk
poetry install
```

Upon successful installation, we can test that everything is working by running one of the scripts:

``` bash
poetry shell
brownie run influx_sushi --network mainnet
```

### Crons

To save on gas costs for this risk analysis, there are cron schedulers that run Brownie scripts every 10 minutes, fetching cumulative price values from SushiSwap and uploading them to InfluxDB as an easy-to-access historical time series. To set up the cron for e.g. fetching from SushiSwap, simply run from the base directory:

``` bash
poetry shell
python scripts/cron/schedule_sushi.py
```

This will run every 10 minutes, storing new cumulative price data for all quotes in `scripts/constants/quotes.json`.
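The actual scheduling logic lives in `scripts/cron/schedule_sushi.py`. Purely as an illustration of the pattern (not the repo's implementation), a minimal 10-minute scheduler could look like the sketch below; it assumes the third-party `schedule` package is available.

``` python
# Illustrative sketch only -- see scripts/cron/schedule_sushi.py for the real logic.
# Assumes the `schedule` package: pip install schedule
import subprocess
import time

import schedule


def fetch_sushi_prices():
    # Run the same Brownie script used in the manual test above
    subprocess.run(
        ["brownie", "run", "influx_sushi", "--network", "mainnet"],
        check=False,
    )


# Fetch cumulative price data every 10 minutes
schedule.every(10).minutes.do(fetch_sushi_prices)

while True:
    schedule.run_pending()
    time.sleep(1)
```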