# Value Attribution Monorepo

## **Notice**

`pnpm` must be used for all package management -- **DO NOT USE NPM**

# Getting Started

The entire monorepo can be managed and run with `pnpm` and `lerna` to facilitate easy package commands and development monitoring.

If you do not currently have pnpm installed, run `npm i -g pnpm`

Lerna does not frequently need to be run directly, as there are passthrough commands at the root package that call lerna for you.

[More About Lerna](https://lerna.js.org/)

## Local Development

### Required Hardhat & Polygon Runtimes

[Running Hardhat](#running-hardhat) is required for the `polygon-cf` package to function properly if you are using or interfacing with it through the `api`.

### Required Docker Runtimes

If you are running the entire architecture or are working on/with `api`, `redis-cf`, or `sql-cf`, you will need to run the containers within `docker` for each service respectively as follows.

Open two terminals and run the following commands:

- `cd docker/psql` followed by `docker compose up`
- `cd docker/redis` followed by `docker compose up`

When bringing down the containers from the directory, you must append `--volumes` to the psql one (i.e. `docker compose down --volumes` from `docker/psql`) to reset the database, unless you wish for it to remain persistent.

[Additional Docker Information](docker/README.md)

### Local Packages

To get all packages started locally, run the following commands:

#### **First Run**

The first run needs all dependencies installed and the environment prepared. These tasks can be accomplished with the following commands:

- `pnpm i` - Installs dependencies for all monorepo packages
- `pnpm prepenv` - Sets up the default `.env` files

#### **Subsequent Runs**

After dependencies are installed and the environment is set up, subsequent runs only require the following commands for development.

_In two separate terminals run:_

- `pnpm buildw` - Watch all packages for builds<sup>*1</sup> (Must be run first)
- `pnpm startw` - Start and watch the latest builds of runnable packages<sup>*2</sup>

##### *1 TypeScript errors and any compilation errors will show up in the terminal running `buildw`

##### *2 Runtime errors will show up in the terminal running `startw`

_Note for first-time builds_

Occasionally on a first run after pulling a clean repo, local dependencies such as `vam-utils` will be missed by other compilers because no `dist` output exists until the first build completes. If this happens, allow `buildw` to finish, stop it, and re-run it -- the runtime should not be affected.

## Running Hardhat

When using the polygon cloud function locally, you must start a local hardhat node and deploy the respective contracts to it as follows.

In two separate terminals run:

- `pnpm hh-node` - Starts the hardhat node
- `pnpm hh-deploy` - Deploys the necessary contracts to the running hardhat node
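As a rough sketch of what these passthroughs do (assuming they wrap the stock Hardhat CLI -- check the root `package.json` scripts for the actual definitions):

```bash
# Terminal 1: start a local JSON-RPC node (Hardhat defaults to http://127.0.0.1:8545)
npx hardhat node

# Terminal 2: run a deploy script against the local node
# (the script path and network name are illustrative assumptions)
npx hardhat run scripts/deploy.ts --network localhost
```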
### Additional helpful commands

- `pnpm run ci` - Removes all ts/rollup caches and node_modules, then reinstalls all deps
- `pnpm clean-ts` - Just removes the ts/rollup caches

# Repository Hierarchy

> Packages contain additional READMEs to explain their purpose and usage.

- [API Docs](packages/api/README.md)
- [SQL Cloud Function Docs](packages/sql-cf/README.md)
- [Redis Cloud Function Docs](packages/redis-cf/README.md)
- [Polygon Cloud Function Docs](packages/polygon-cf/README.md)
- [IPFS Cloud Function Docs](packages/ipfs-cf/README.md)
- [Storage Cloud Function Docs](packages/storage-cf/README.md)
- [Docker Docs](packages/cloud-functions/docker/README.md)

#### **/packages** -- contains the following packages:

- `api` - User-facing API to facilitate uploads, reads, etc.
  - Interfaces with cloud functions
- `ipfs-cf` - **DEPRECATED** -- To be switched out for `storage-cf`
- `polygon-cf` - Polygon cloud function for uploading/registering data to the blockchain
- `redis-cf` - Redis cloud function for interfacing with the Redis service
- `sql-cf` - PostgreSQL cloud function for interfacing with the DB
- `storage-cf` - To replace ipfs as the storage mechanism for data
- `vam-local-secrets` - Contains local dummy secrets for development
- `vam-utils` - Contains shared utility functions for all packages

#### **/docker** -- contains docker composition files for Docker runtimes

- `psql` - PostgreSQL docker-compose configuration
- `psql/db` - Initialization SQL for the psql docker
- `redis` - Redis docker-compose configuration
- `dockerScripts` - Scripts for quickly interacting with the Docker containers

#### **/tests** -- Contains Full Architecture Tests

- Most likely to be reorganized with [Issue36](https://github.com/valence-eng/value-attribution-mono/issues/36)

#### **/scripts** -- Helper scripts for various tasks

- TBD: Needs to be refactored and pared down to the remaining useful scripts since the switch to lerna

#### **/.husky** -- Husky configuration folder

- (Husky is currently disabled during rapid development)

#### **/** -- contains the following configuration files:

- **/rollup.config.js** -- Root rollup config; all package `rollup.config.js` files should build from this
- **/tsconfig.json** -- Root typescript config; all package `tsconfig.json` files should build from this
- **/nx.json** -- `nx` task config
- **/pnpm-workspace.yaml** -- `pnpm` workspace config -- *should not need to be updated*
- **/lerna.json** -- `lerna` workspace config -- *should not need to be updated*
- **/commitlint.config.cjs** -- `commitlint` config -- *should not need to be updated*
- **/.prettierrc.json** -- `prettier` config -- *should not be updated without consulting the team*

## Smart Contracts

`smart-contracts/` contains all the smart contracts and their respective tooling for development, test, and deployment. The smart contracts are written in Solidity and are compiled using Hardhat. For more information, please see the [README](smart-contracts/README.md) in the `smart-contracts` directory.

### Requirements

#### Local

- Node >= 18.0.0
- Redis >= 6.0.0
- PostgreSQL >= 16.0

#### Remote

- Node >= 16.0.0
- Redis >= 6.0.0
- cloud-sql-proxy >= 1.27.0
- gcloud >= 425.0.0
- Google Cloud Platform Account
- Google Cloud IAM Permissions

#### Production

- gcloud >= 425.0.0
- Google Cloud Platform Account
- Google Cloud IAM Permissions

### Deployment

Deployment of packages is handled by [gcloud](https://cloud.google.com/sdk/docs/install#linux)

Once you have installed gcloud you need to log in to your Google Cloud account:

`gcloud auth login`

Next, you need to set the project you want to deploy to:

`gcloud config set project datty-atty`
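Before deploying anything, it can be worth sanity-checking that the CLI is authenticated and pointed at the intended project. The standard gcloud commands for that are:

```bash
# Show authenticated accounts and which one is active
gcloud auth list

# Print the currently configured project (should be datty-atty)
gcloud config get-value project
```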
#### Running & Development

Supplemental tooling is in place to reduce the number of terminals and the starting/stopping of services.

##### Running the database and redis locally

You can run the docker containers with the database (postgres) and redis locally with the following command from the root project directory:

```bash
pnpm run docker-up
```

If you want to delete the volumes and start fresh, you can run the following command from the root project directory:

```bash
pnpm run docker-down
```

##### Running Locally

Once the required supplemental or local services are running, you can use the services command to bring up an environment that watches for local code changes and redeploys cloud functions locally for development.

From the root project directory:

`pnpm run services -- "local=[api,sql,redis,ipfs,polygon]"`

You can run only the services you need for development, e.g. `pnpm run services -- "local=[api,sql]"` to bring up just the API and SQL cloud functions.

##### Deployment

To deploy a cloud function you must first build it. To build a cloud function, run `pnpm run build` in the package directory.

You can then deploy the function with:

`gcloud functions deploy {function-name} --gen2 --runtime nodejs16 --trigger-http --allow-unauthenticated --entry-point {function-name}`

To add environment variables, add `--set-env-vars {env-var-name}={env-var-value}` to the deploy command. You can add multiple environment variables with a comma-separated list: `--set-env-vars {env-var-name}={env-var-value},{env-var-name}={env-var-value}`
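As a worked example, deploying the `sql-cf` function with two environment variables might look like the following (the entry point and the variable names/values are illustrative assumptions, not the project's actual configuration):

```bash
# Build the function first, from within packages/sql-cf
pnpm run build

# Deploy as a 2nd-gen, HTTP-triggered function
# (DB_HOST/DB_PORT are placeholder env vars, not real settings)
gcloud functions deploy sql-cf \
  --gen2 \
  --runtime nodejs16 \
  --trigger-http \
  --allow-unauthenticated \
  --entry-point sql-cf \
  --set-env-vars DB_HOST=127.0.0.1,DB_PORT=5432
```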
#### Setting up the database on cloud

In order to update the database schema on cloud, you must always run the following command from the root project directory before starting the app:

```bash
pnpm run migrate
```

## TODO

- [ ] Typed Data
- [ ] Data Validation
- [ ] Data Sanitization
- [ ] Secure connection between API and Cloud Functions
- [ ] Add Private IP / VPC to Cloud Functions, Cloud Run API, Cloud SQL, and Cloud Memorystore
- [x] Consistent Network Requests and Responses
- [ ] Documentation
  - [ ] API
  - [ ] Redis Cloud Function
  - [ ] IPFS Cloud Function
  - [ ] SQL Cloud Function
  - [ ] Polygon Cloud Function
- [ ] Tests
  - [ ] API
  - [ ] Redis Cloud Function
  - [ ] IPFS Cloud Function
  - [ ] SQL Cloud Function
  - [ ] Polygon Cloud Function
- [ ] Scripts
  - [ ] Spin up local dev environment
  - [ ] Spin up remote dev environment
  - [ ] Spin up prod connected environment
- [ ] Deployments
  - [ ] Deploy to remote dev
  - [ ] Deploy to prod

### API

- [ ] Environments
  - [x] Local
  - [ ] Remote Dev
  - [ ] Prod
- [ ] Dockerfile to run in Google Cloud Run
- [x] Crash Recovery
- [ ] Rate Limiting
- [x] API Key Authentication
- [ ] Routes
  - [ ] /Upload
  - [ ] /Retrieve
  - [ ] /Register
- [ ] Cloud Function Connectors
  - [x] Redis
  - [x] IPFS
  - [x] SQL
  - [ ] Polygon
- [ ] Event Driven Task Queue
  - [x] Redis
  - [x] IPFS
  - [x] SQL
  - [ ] Polygon
- [ ] Reject Duplicate Uploads

### Cloud Functions

#### Redis

- [ ] Environments
  - [x] Local
  - [x] Remote Dev
  - [ ] Prod
- [x] Do CRUD operations
  - [x] Create
  - [x] Read
  - [x] Update
  - [x] Delete
- [x] Manage Queues (Redis Lists)
  - [x] Add to Queue
  - [x] Remove from Queue
  - [x] Get Queue Length
  - [x] Get Queue Head
  - [x] Get element position in Queue

#### IPFS

- [ ] Environments
  - [ ] Local
  - [x] Remote Dev
  - [ ] Prod
- [ ] Pin Data to IPFS
- [ ] Retrieve Data from IPFS

#### SQL

- [ ] Environments
  - [x] Local
  - [x] Remote Dev
  - [ ] Prod
- [x] Create Account Database schema
- [x] Create API Key Database schema
- [x] Create Data Registry Database schema
- [ ] Send queries to the database
- [ ] Validate Data
- [ ] Sanitize Data

#### Polygon

- [ ] Environments
  - [x] Local
  - [ ] Remote Dev
  - [ ] Prod
- [x] Create Registry Smart Contract
- [ ] Send data to smart contract
- [ ] Retrieve data from smart contract

## Appendix

### Why Lerna

Lerna solves two of the biggest problems of JavaScript and TypeScript monorepos:

* Lerna runs a command against any number of projects, and it does it in the most efficient way, in the right order, and with the possibility to distribute that on multiple machines.
* Lerna manages your publishing process, from version management to publishing to NPM, and it provides a variety of options to make sure any workflow can be accommodated.

For a more comprehensive explanation of Lerna, read the [documentation](https://lerna.js.org/docs/introduction)
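As a small, hypothetical illustration of that first point (in this repo, the root passthrough scripts in `package.json` are the preferred way to invoke these):

```bash
# Run each package's build script, in dependency order
npx lerna run build

# Limit a task to a single package with --scope
npx lerna run test --scope=vam-utils
```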