## Bots: The Next Generation (June 5-16?)
- transformers: crank out as many as possible in 2-3 hours
- error-reporting, documentation, streamlining, squashing issues
- documentation: how they work, how to spin them up, how to create your own, etc.
- make bots smarter: react to market state and trade intelligently, etc.
- bug-squashing, figuring out crashes, reducing them, checking and improving precision
- generate crash reports: parse logs and report errors; what's the best format? (see the sketch after this list)
- auto-send them in some way: Discord, email, hand-delivered
- data science: tell stories in notebooks
- don't rush this; rather, deliver a more polished product externally
- tell 1-2 stories about stuff we're building anyway (smarter bots?)
- forecasting time-series: benchmark tradfi methods in generic testbed (ElfAI Gym)
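A minimal sketch of the crash-report flow described above, assuming bot logs are plain text with an ERROR/CRITICAL level token on failing lines; the log path and webhook URL are placeholders:
```python
import json
import re
from pathlib import Path

import requests

LOG_PATH = Path("logs/bot.log")  # placeholder log location
WEBHOOK_URL = "https://discord.example/webhook"  # placeholder delivery channel

ERROR_LINE = re.compile(r"\b(ERROR|CRITICAL)\b")

def build_crash_report(log_path: Path) -> dict:
    """Pull error lines out of a bot log into a small JSON-friendly report."""
    errors = [
        line.strip()
        for line in log_path.read_text().splitlines()
        if ERROR_LINE.search(line)
    ]
    return {"log": str(log_path), "num_errors": len(errors), "last_errors": errors[-10:]}

def send_report(report: dict) -> None:
    """Auto-send: POST the report to a Discord-style webhook."""
    requests.post(WEBHOOK_URL, json={"content": json.dumps(report, indent=2)}, timeout=10)

if __name__ == "__main__":
    send_report(build_crash_report(LOG_PATH))
```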
## UPDATED PLAN:
* Parity – we need to update elfpy to reach parity with the smart contracts
    * merging the fixedpoint stuff (this week)
    * finishing some baseline cross-platform tests (the smart contract as of March vs. elfpy at “old” parity)
    * update the smart contract to the current version
    * update elfpy so that the cross-platform tests pass again
    * get_max is the most important – the bots use it a lot (see the test sketch below)
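A minimal sketch of what one of those cross-platform tests could look like in pytest. Both `elfpy_get_max_long` and `contract_get_max_long` are hypothetical stand-ins (sharing a toy formula so the sketch runs), not the real elfpy or contract APIs:
```python
import pytest

# hypothetical stand-ins: the real test would call elfpy on one side and the
# deployed smart contract on the other; a toy formula keeps the sketch runnable
def elfpy_get_max_long(share_reserves: float, bond_reserves: float) -> float:
    return min(share_reserves, bond_reserves) * 0.9

def contract_get_max_long(share_reserves: float, bond_reserves: float) -> float:
    return min(share_reserves, bond_reserves) * 0.9

@pytest.mark.parametrize("share_reserves,bond_reserves", [(1e3, 2e3), (1e6, 5e5)])
def test_get_max_parity(share_reserves: float, bond_reserves: float) -> None:
    python_result = elfpy_get_max_long(share_reserves, bond_reserves)
    solidity_result = contract_get_max_long(share_reserves, bond_reserves)
    # the tolerance is where "checking and improving precision" comes in
    assert python_result == pytest.approx(solidity_result, rel=1e-6)
```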
* Data analysis service
    * NOW:
        * PnL & OHLCV contract data grabbed from AWS (see fuzz bots)
    * FUTURE:
        * separate service in Docker
        * holds everything that queries the blockchain for informational purposes
        * runs stand-alone from bot execution
        * generates plots that can be viewed
* Fuzz bots
    * DONE: Docker pipeline
        * cloud compute + Docker container
        * anvil spins up a genesis chain
        * migration scripts – deploy contracts, initialize them, etc.
        * artifacts service hosts the deployed contract addresses so other services can use them
* Elfpy compose app
    * Get the elf-sims image working.
    * Update infra docker-compose.yml to use the bot service definition posted above.
    * Refactor the bots so that they can use settings from the environment (RPC_URL, PRIVATE_KEY, MNEMONIC, etc.).
    * Refactor the bots, or add a line to the service's command, to poll http://artifacts/addresses.json: wait for a 200 status, then use the provided addresses in the bot (see the sketch below).
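A minimal sketch of that startup sequence: settings come from the environment, and the bot blocks until the artifacts service answers with a 200. The shape of addresses.json in the comment is an assumption:
```python
import os
import time

import requests

ARTIFACTS_URL = "http://artifacts/addresses.json"  # the artifacts service above

# settings pulled from the environment, per the refactor described above
RPC_URL = os.environ["RPC_URL"]
PRIVATE_KEY = os.environ.get("PRIVATE_KEY", "")
MNEMONIC = os.environ.get("MNEMONIC", "")

def wait_for_addresses(url: str = ARTIFACTS_URL, poll_seconds: float = 2.0) -> dict:
    """Poll until the artifacts service returns 200, then hand back the addresses."""
    while True:
        try:
            response = requests.get(url, timeout=5)
            if response.status_code == 200:
                return response.json()  # assumed shape, e.g. {"hyperdrive": "0x..."}
        except requests.ConnectionError:
            pass  # artifacts container not up yet
        time.sleep(poll_seconds)

addresses = wait_for_addresses()
```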
* Alex deploys fuzz bots
    * Steps:
        1) deploy
        2) improve error handling
        3) post-processing & investigation
    * organize data from the event logs
        * a relational database (or whatever the data team thinks is best) that we can hit
        * pull data from contract events (see the sketch after this list)
        * we want to hit an API and get data back in a Python-friendly format
        * Jupyter notebooks preferred over a dashboard
        * we need to change & inspect what is going on; lowest friction is best
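A minimal sketch of pulling contract events into that Python-friendly shape, assuming web3.py pointed at the anvil node from the Docker pipeline; the contract address is a placeholder, and decoding against the ABI would be the next step:
```python
import pandas as pd
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))  # anvil from the Docker pipeline
HYPERDRIVE_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder

# raw eth_getLogs query for every event the contract has emitted
logs = w3.eth.get_logs({
    "address": HYPERDRIVE_ADDRESS,
    "fromBlock": 0,
    "toBlock": "latest",
})

# flatten into a DataFrame so a notebook can slice it with low friction
df = pd.DataFrame([
    {
        "block": log["blockNumber"],
        "tx_hash": log["transactionHash"].hex(),
        "topic0": log["topics"][0].hex() if log["topics"] else None,
        "data": log["data"],
    }
    for log in logs
])
print(df.head())
```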
# Previous
## Data Stuff
Big picture:
1. get some output ASAP
2. iterate 50x to get it right
## The Demo
- Cash's website has a link that says "OHLCV"
- That page has regularly updating data
    - the data comes from Jacob
- End-to-end complete
    - weaving together Jacob's backend and the FE
- Delivery method:
    1. simplest thing: a .png on AWS, or streamlit
    2. Cash's website
- Which plots:
    1. volume (raw)
        - put the signs in the right place
        - add it up
        - plot it
        - conversion between base and PT (likely not needed)
        - see the OHLCV sketch below
    2. PnL (no mock)
        - call the HyperdriveMath library (on-chain? local?)
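A minimal sketch of building OHLCV bars (plus volume) for that page from trade-level data, assuming a pandas DataFrame with timestamp, price, and signed amount columns; the column names and sign convention are assumptions:
```python
import pandas as pd

# assumed shape: one row per trade, buys positive / sells negative
trades = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-04-21 10:00", "2023-04-21 10:15", "2023-04-21 11:05"]),
    "price": [0.98, 0.97, 0.99],
    "amount": [100.0, -40.0, 60.0],
}).set_index("timestamp")

# hourly bars: resample price for OHLC, sum absolute amounts for volume
ohlcv = trades["price"].resample("1H").ohlc()
ohlcv["volume"] = trades["amount"].abs().resample("1H").sum()
print(ohlcv.dropna(subset=["open"]))
```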
**v0: unintegrated**
```
run n trades on Python
rerun the same n trades on Solidity
```
**v1: integrated on local**: everlasting_bots
```
for trade in [1,]
    do_python
    do_devnet
```
**v2: run on testnet**: testy_bots
```
for trade in [1,]
    do_python(agents)
    do_devnet(trade)
```
**v3: integrated testnet**:
```
for trade in [1,n]
    do_python(agents, market_state)
        -> test it with Louies and Fridas
    do_testnet(trade)
    do_python(update_agent)
while listen(blocks):
    market_state <- Solidity
    do_python(calc_things: PNL_nomock, PNL_mock)
        -> export to CSV -> send to Cash
    do_python(plots)
```
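A minimal sketch of the `listen(blocks)` step from the loop above, assuming simple web3.py polling against a placeholder RPC endpoint; the per-block work (pull market_state, compute PNL_nomock/PNL_mock, plot) is stubbed out:
```python
import time

from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))  # placeholder RPC endpoint

def on_new_block(block_number: int) -> None:
    """Stub for: market_state <- Solidity, then calc PNL_nomock / PNL_mock, then plots."""
    print(f"processing block {block_number}")

last_seen = w3.eth.block_number
while True:
    head = w3.eth.block_number
    for block_number in range(last_seen + 1, head + 1):
        on_new_block(block_number)
    last_seen = head
    time.sleep(1)  # simple polling; websocket subscriptions are the alternative
```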
**v4: integrated**: future_shit
```
for trade in [1,]
    do_python: trade_details -> market/market_state
    do_testnet: trade_details -> ape_trade
    throw warnings on any precision differences
```
### Priority list:
1. merge testnet bots PR
2. update ape PR
3. respond to plantuml guy on github
4. calculate bot PNL (see the sketch after this list)
5. calculate OHLCV
6. do v3: integrated testnet
7. calculate trade aggregation (in a Python/elfpy-compatible way)
8. more charts
- OHLCV historical charts
- PNL per address
- LP profitability
- ecosystem impact
- interest rates over time
9. fix devnet discrepancies (devnet = local Foundry fork)
10. stress-test the workflow with high trade volume (10 trades per block?)
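A minimal sketch for the bot-PNL items above (4 and 8), assuming a trade tape with signed bond amounts and base flows per address, plus a spot price to mark open positions; the column names and the pricing step are assumptions, not the real elfpy calculation:
```python
import pandas as pd

# assumed trade tape: positive bonds = long opened; base_flow = base paid (-) / received (+)
trades = pd.DataFrame({
    "address": ["0xA", "0xA", "0xB"],
    "bonds": [100.0, -100.0, 50.0],
    "base_flow": [-98.0, 99.0, -49.0],
})
spot_price = 0.98  # placeholder: would come from the contract / elfpy market_state

per_address = trades.groupby("address").agg(
    net_bonds=("bonds", "sum"),
    realized_base=("base_flow", "sum"),
)
# mark remaining bond exposure at spot for a rough no-mock style PnL
per_address["pnl"] = per_address["realized_base"] + per_address["net_bonds"] * spot_price
print(per_address)
```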
### Mihai
tie up loose ends
- Ape PR
- py2puml PR
look into CTC
redpill Alex on BOTS BOTS BOTS
benchmark historical queries across different approaches (see the harness sketch after this list):
1. ape
2. CTC
3. trueblocks
4. different RPC providers: Infura, Alchemy, etc.
5. remote vs. local RPC
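A minimal sketch of a timing harness for that comparison, assuming each approach is wrapped in a zero-argument callable that fetches the same block range; the endpoint is a placeholder, and ape/CTC/trueblocks fetchers would slot in alongside `fetch_via_rpc`:
```python
import time
from typing import Callable

from web3 import Web3

def fetch_via_rpc(endpoint: str, from_block: int, to_block: int) -> int:
    """One approach: raw eth_getLogs through a given RPC provider."""
    w3 = Web3(Web3.HTTPProvider(endpoint))
    logs = w3.eth.get_logs({"fromBlock": from_block, "toBlock": to_block})
    return len(logs)

def benchmark(name: str, fetch: Callable[[], int], repeats: int = 3) -> None:
    """Time repeated runs of one approach; every approach gets the same treatment."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        count = fetch()
        timings.append(time.perf_counter() - start)
    print(f"{name}: best {min(timings):.2f}s of {repeats} runs ({count} logs)")

# placeholder endpoints; swap in Infura/Alchemy keys and a local node to compare
benchmark("local RPC", lambda: fetch_via_rpc("http://localhost:8545", 0, 10_000))
```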
### Jacob
- Create volume output and a data structure that matches this pattern: https://tradingview.github.io/lightweight-charts/docs#setting-the-data-to-a-series (see the sketch after this list)
- Automate the remaining transformations for OHLCV data.
- Identify unique addresses that have interacted with the Hyperdrive contract.
- Create aggregate PnL data for each contract.
- Charts for non-FE data. Maybe Databricks, Excel, or another non-centralized solution?
- Trueblocks index fix.
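A minimal sketch of emitting volume in the time/value record shape that lightweight-charts' setData expects, reusing a trades DataFrame like the OHLCV sketch above (column names are assumptions):
```python
import json

import pandas as pd

trades = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-04-20", "2023-04-20", "2023-04-21"]),
    "amount": [100.0, -40.0, 60.0],
}).set_index("timestamp")

# daily volume shaped as [{"time": "yyyy-mm-dd", "value": ...}] for series.setData(...)
daily = trades["amount"].abs().resample("1D").sum()
volume_series = [
    {"time": ts.strftime("%Y-%m-%d"), "value": float(value)} for ts, value in daily.items()
]
print(json.dumps(volume_series, indent=2))
```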
## Captain's Log
##### 2023.04.18
1. ~~fix bots crashing~~
2. ~~check their randomness~~
3. ~~[prepare to] land testnet bots PR~~
4. ~~pump out [a single] chart~~

##### 2023.04.19
1. ~~prep testnet bots PR~~
##### 2023.04.20
1. ~~prep testnet bots PR~~
##### 2023.04.21
1. ~~land testnet bots PR~~
2. review what data we have
3. create some charts
    - he has JSONs
    - spit out spot prices (no mock PnL)
## Path to v3 demo
#### Executive summary:
2 product types: "parallel conveyor belts" & "integrated cogs"
Focus -- integrated cogs:
* bots view elfpy market_state constructed from testnet data
* bots return trades from this state
* trades are executed on testnet using Apeworx
* bots update personal wallets from trade receipt
* cloud node & database scrape the EVM for trade data
* format trade data into an endpoint
* parse trade data & construct `TradeSimVariables` sim state
* use existing elfpy code as much as possible
* from the sim state, produce formatted data outputs for the frontend (or streamlit) to make real-time plots (e.g. `pnl_data.json` or `trade_volume.json`)
* deliver URLs that point to time-varying plots (see the upload sketch below)
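A minimal sketch of those last two bullets, assuming the formatted output lands in S3 so the frontend gets a stable URL to poll; the bucket name is a placeholder:
```python
import json

import boto3

s3 = boto3.client("s3")
BUCKET = "elfpy-demo-data"  # placeholder bucket
KEY = "pnl_data.json"       # matches the example output name above

def publish(payload: list) -> str:
    """Upload the formatted output and return the URL the frontend polls."""
    s3.put_object(
        Bucket=BUCKET,
        Key=KEY,
        Body=json.dumps(payload).encode(),
        ContentType="application/json",
    )
    return f"https://{BUCKET}.s3.amazonaws.com/{KEY}"

print(publish([{"time": "2023-04-21", "value": 1.23}]))
```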
#### What we've done:
1. run random bots on solidity alone (`bots_on_solidity.ipynb`)
* full sim executed & returns trade tape
* trade tape is parsed & executed on solidity
2. run smart bots on solidity and python (`everlasting_bots.py`)
* bots are executing trades directly on local network
* bots view sim state built from solidity market state
3. run random bots on testnet (`testy-bots.py`)
#### Left to do:
1. integrate testnet bots with python
* combine testy-bots.py with everlasting_bots.py so that you have smart bots and random bots running trades on testnet & updating their wallets.
* read testnet data into market_state (done)
* pass market_state to agents
* test with Louies and Fridas
* update agent state after every trade
2. build cloud data processing flow
    * need API python scripts that use elfpy to get computed variables
3. read testnet data into market_state
4. use elfpy for calculations: PNL_nomock and PNL_mock
5. generate plots (from python or in the front-end; see the sketch below)
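A minimal sketch of the Python-side plotting option, with a placeholder PnL series standing in for the elfpy outputs; saving a .png matches the simplest delivery method from the demo plan:
```python
import matplotlib.pyplot as plt
import pandas as pd

# placeholder PnL series; the real values come from the elfpy calculations above
pnl = pd.Series(
    [0.0, 0.5, 0.3, 1.1],
    index=pd.to_datetime(["2023-04-18", "2023-04-19", "2023-04-20", "2023-04-21"]),
)

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(pnl.index, pnl.values, marker="o")
ax.set_title("PnL over time (no mock)")
ax.set_xlabel("date")
ax.set_ylabel("PnL (base)")
fig.autofmt_xdate()  # tilt the date labels so they don't overlap
fig.savefig("pnl.png")
```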
**next version: integrated testnet**: the same loop as the v3 pseudocode above.
Next demo:
what we have:
* trade volume on the protocol
    * time
    * total trades in that time
    * -> could convert this to OHLCV
* AWS URI pointing to a JSON file that has a trade-by-trade breakdown with:
    * address
    * trade type ("open long")
    * trade amount, in bonds (absolute value)
* output formats (see the conversion sketch below)
    * date, open, high, low, close
    * date, contract type, PnL
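A minimal sketch of reshaping that trade-by-trade JSON into the two output formats, assuming each record also carries a price and a date; those extra fields, and the placeholder PnL aggregation, are assumptions rather than the real calculation:
```python
import pandas as pd

# assumed record shape from the AWS JSON (price and date fields are assumptions)
records = [
    {"address": "0xA", "trade_type": "open long", "trade_amount": 100.0,
     "price": 0.98, "date": "2023-04-21"},
    {"address": "0xB", "trade_type": "open short", "trade_amount": 40.0,
     "price": 0.97, "date": "2023-04-21"},
]
df = pd.DataFrame(records)
df["date"] = pd.to_datetime(df["date"])

# format 1: date, open, high, low, close
ohlc = df.set_index("date")["price"].resample("1D").ohlc().reset_index()

# format 2: date, contract type, pnl (placeholder aggregation, not the real PnL)
pnl = df.groupby(["date", "trade_type"], as_index=False)["trade_amount"].sum()
pnl = pnl.rename(columns={"trade_type": "contract type", "trade_amount": "pnl"})

print(ohlc)
print(pnl)
```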