# v2 Infrastructure & Decentralization Sync
Sam's doc: https://hackmd.io/o7cVYOO4SxiZ03afDDufqA
## Understanding the Stack
- The frontend is a monolithic React app, but let's dig more into the data layer
### DAOhaus Data Overview
- Lots of data sources:
  - RPCs from several networks
  - DAOhaus API
    - Series of AWS Lambda Functions
  - Block Explorer API
  - ABIs from contracts for contract calls
  - TheGraph
    - Several subgraphs integrated throughout the app
  - Boost APIs
    - Third-party apps
  - Ceramic
    - Ceramic data for profiles integrates with Basic Profile
  - Other APIs
- The DAOhaus API gets the data from these sources and combines it all together
- The API is both read and write
#### data.daohaus.app
- Non-contract data is stored in a db that is served to the frontend
  - Updating metadata writes to this db
- Data is cached in the API
  - Minion vault data is cached via jobs that run in Lambdas so the frontend can grab it directly
  - Hydrated with USD values
  - Done across networks
- Endpoints that wrap secrets
  - e.g. IPFS upload to hide Pinata keys, plus keys for other boosts such as Discord/Discourse (rough sketch below)
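
For context, one of these secret-wrapping endpoints looks roughly like the sketch below: the frontend posts JSON, the Lambda holds the Pinata keys, and only the resulting IPFS hash comes back. This is a minimal sketch rather than the actual DAOhaus Lambda; the environment variable names and handler shape are assumptions.

```typescript
// Minimal sketch of a Lambda-style endpoint that wraps the Pinata keys so the
// frontend never sees them. Illustrative only -- not the actual DAOhaus code.
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";
import fetch from "node-fetch";

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  const content = JSON.parse(event.body ?? "{}");

  // Keys live in the Lambda environment (assumed variable names), not in the React app
  const res = await fetch("https://api.pinata.cloud/pinning/pinJSONToIPFS", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      pinata_api_key: process.env.PINATA_API_KEY ?? "",
      pinata_secret_api_key: process.env.PINATA_API_SECRET ?? "",
    },
    body: JSON.stringify({ pinataContent: content }),
  });

  const { IpfsHash } = (await res.json()) as { IpfsHash: string };

  // The frontend only ever receives the resulting IPFS hash
  return { statusCode: 200, body: JSON.stringify({ hash: IpfsHash }) };
};
```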
#### TheGraph
- One Subgraph per network indexing the Moloch contracts and other contracts (example query sketched below)
- Stats Subgraph
  - Provides data over time on Treasury Vault data
  - Main use is treasury / time-series data
- ENS Subgraph
  - Reverse lookup on Mainnet (not maintained, but we use it)
- NFT Subgraphs
  - Across networks; populated when folks tribute NFTs
- Boost Subgraphs
  - Third-party Boosts: Superfluid, Snapshot, POAP
  - A Boost Subgraph that we maintain that indexes factories for our boosts like *Wrap and Zap*
- Shaman Subgraph
  - Only used in Yeeter so far
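
As a rough illustration of how the app consumes these, here is what a query against one of the per-network Moloch subgraphs could look like; the subgraph endpoint and entity fields are assumptions for the sketch, not the exact schema.

```typescript
// Rough sketch of querying a per-network Moloch subgraph from the frontend.
// The subgraph endpoint and fields here are illustrative, not the exact schema.
const SUBGRAPH_URL =
  "https://api.thegraph.com/subgraphs/name/odyssy-automaton/daohaus"; // assumed mainnet endpoint

const DAO_QUERY = `
  query dao($dao: String!) {
    moloches(where: { id: $dao }) {
      id
      version
      totalShares
    }
  }
`;

export const fetchDao = async (daoAddress: string) => {
  const res = await fetch(SUBGRAPH_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: DAO_QUERY, variables: { dao: daoAddress } }),
  });
  const { data } = await res.json();
  return data?.moloches?.[0];
};
```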
#### Ceramic
- Ceramic Basic Profile data for DAOhaus user profiles
### API Data Sources and Storage
(see Sam's image)
- JSON data stored in s3 buckets
- Airtable Database
- The app hits these endpoints; the API pulls data from these sources and puts it into a nice wrapper that the frontend can hit
- DAO Overview data
  - Hits a single endpoint that hits Airtable, gets the data along with other data that's hydrated, and sends it back to the frontend
  - Caching is used to help avoid rate limits (see the caching sketch below)
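
The caching layer in front of Airtable is essentially the pattern below: check a cached copy first, and only hit the Airtable REST endpoint when the copy is stale. The base ID, table name, and in-memory store are placeholders; the real jobs persist their cache elsewhere (e.g. s3).

```typescript
// Sketch of the "cache in front of Airtable" pattern used to dodge rate limits.
// Base ID, table name, and the in-memory store are placeholders for illustration.
const AIRTABLE_URL = "https://api.airtable.com/v0/APP_BASE_ID/daoMeta"; // hypothetical base/table

let cache: { records: unknown[]; fetchedAt: number } | null = null;
const TTL_MS = 60_000; // refresh roughly every minute

export const getDaoMeta = async (): Promise<unknown[]> => {
  if (cache && Date.now() - cache.fetchedAt < TTL_MS) {
    return cache.records; // serve the cached copy instead of hitting Airtable again
  }
  const res = await fetch(AIRTABLE_URL, {
    headers: { Authorization: `Bearer ${process.env.AIRTABLE_API_KEY}` },
  });
  const { records } = await res.json();
  cache = { records, fetchedAt: Date.now() };
  return records;
};
```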
## Moving Forward
- Goal is to move away from this structure in favor of more decentralized approaches
- Airtable and s3 buckets are a top priority to transition
- The Boosts that your DAO has installed are stored in Airtable, behind the API
### Airtable
- Airtable holds data related to DAOs that's not on chain but needs to be managed by the DAO itself (such as metadata like the name and which Boosts are active)
- It can manage proposal playlists as well
- Basically, data that isn't in a contract that we need to store
- The *Poster Contract* is the replacement for Airtable
  - Jord's Poster Boost uses this
  - You post content and a tag; the contract emits an event that we can follow in our subgraph, where it can be indexed and used (rough sketch after this list)
    - Shows up in the `MetaData` entity as `rawContent`
  - Work in progress is to standardize the schema so that there will be an onchain database
    - We can start defining a schema and tags, figure out how to index it, and then move the v2 app to start using the Poster Contract
  - The subgraph can check whether `msg.sender` is a DAO member, and then the frontend will know the post is valid
    - A post can also come from a minion proposal (`msg.sender` is a minion), which indicates that the content is ratified by the DAO
  - We may need a way to backfill everything first, but this could be a good starting point to move toward getting rid of Airtable
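
To make the Poster flow concrete, here is a rough sketch of writing DAO metadata through an EIP-3722-style Poster contract with ethers. The contract address is left as a parameter, and the content shape and tag are placeholders until the schema work above lands.

```typescript
// Rough sketch of writing DAO metadata through a Poster-style contract.
// The content shape and tag are hypothetical -- the real schema is the
// work-in-progress described above.
import { ethers } from "ethers";

// EIP-3722-style Poster interface: post(content, tag) emits a NewPost event
const POSTER_ABI = ["function post(string content, string tag)"];

export const postDaoMetadata = async (
  signer: ethers.Signer,
  posterAddress: string, // deployed Poster address on the DAO's network
  daoAddress: string
) => {
  const poster = new ethers.Contract(posterAddress, POSTER_ABI, signer);

  // Hypothetical content shape; the subgraph mapping would parse this JSON
  const content = JSON.stringify({
    daoId: daoAddress,
    table: "daoMeta",
    name: "Example DAO",
    description: "Updated via Poster instead of Airtable",
  });

  // Hypothetical tag -- whatever tag scheme we standardize on
  const tx = await poster.post(content, "daohaus.metadata");
  return tx.wait();
};
```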
### s3 Buckets
- There is a less clear path toward decentralization here; we haven't identified a good way to replace the Lambdas that gather data and serve it to the frontend, for a few reasons:
  - They wrap secret/API keys
  - There is so much going on that fetching all of this from the frontend isn't performant
- Reluctant to go too far on this for v3 -- there are other explorations into web3 alternatives to s3 buckets
  - Skynet allows us to drop JSON files that we can access
  - A starting point could be trying to replicate some of the s3 functionality -- this would also let us test this out for v3
  - Instead of dropping directly to s3, we could start integrating Skynet (rough sketch below)
- There is still no great web3 compute layer solution, but some options show promise
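
A minimal sketch of what replicating the "drop a JSON file" part of the s3 flow could look like against Skynet, assuming the skynet-js SkyDB helpers (`setJSON`/`getJSON`); the exact SDK surface should be verified against whatever version we install.

```typescript
// Minimal sketch of dropping JSON into Skynet instead of an s3 bucket, assuming
// the skynet-js SkyDB helpers (setJSON/getJSON). Verify against the installed
// SDK version -- this is an illustration, not a planned implementation.
import { SkynetClient, genKeyPairFromSeed } from "skynet-js";

const client = new SkynetClient("https://siasky.net");
const { privateKey, publicKey } = genKeyPairFromSeed(
  process.env.SKYNET_SEED ?? "" // assumed env var holding the write seed
);

// Analogous to putObject on an s3 bucket, keyed by a data key instead of a file name
export const writeCache = async (dataKey: string, json: Record<string, unknown>) => {
  await client.db.setJSON(privateKey, dataKey, json);
};

// Anyone with the public key can read the cached JSON back
export const readCache = async (dataKey: string) => {
  const { data } = await client.db.getJSON(publicKey, dataKey);
  return data;
};
```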
### Ceramic
- We may need to start a DAOhaus node
- In v2 we're using it mainly for Basic Profile data, but in v3 we'll be using it for preference data
- In v2 we wrap the API via a form, but we don't need to do this since folks can update via self.id (rough read sketch below)
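
Reading a profile straight from Ceramic without our form wrapper could look roughly like this, assuming the read-only `@self.id/core` client; the method names and network alias are assumptions to verify against the SDK version in use.

```typescript
// Sketch of reading a user's Basic Profile with the read-only Self.ID client,
// assuming the @self.id/core API surface (verify against the installed version).
import { Core } from "@self.id/core";

const core = new Core({ ceramic: "mainnet-gateway" });

export const fetchBasicProfile = async (address: string) => {
  // Resolve the CAIP-10 account to a DID, then read the basicProfile record
  const did = await core.getAccountDID(`${address.toLowerCase()}@eip155:1`);
  return core.get("basicProfile", did);
};
```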
## v2 and v3 Crossover
- https://hackmd.io/wpI8-jVcQ_261_xvO6RgHA
- https://hackmd.io/QDALCjN_SeGq86-DsDkRkw
- Workstreams we start to explore things in v2 will carry over to v3 (such as the Skynet experiment)
- Find out where the unanswered questions are and use v2 as an exploration and testing ground for some of these ideas
## Possible Next Steps
- **Poster**:
  - Transitioning Airtable to use Poster could be a good first step, but the upgrade path is a bit less clear
- **Skynet**:
  - Rewrite one of the caching jobs to go to Skynet instead of the s3 bucket
### s3 Bucket Jobs
- **DAO Guildbank Token List**
  - Data about tokens that DAOs hold
  - Price data pulled from Coingecko and cached
  - Gets every token that every Guildbank holds (from the subgraph), then makes calls to Coingecko to get current price data
  - Looks up token lists to get the name, logo, etc.
  - Returns this data
- **DAO Minion Vault Data**
  - Jobs that pull each minion and query for ERC20, ERC721, and ERC1155 balances
  - Gets historical data on these to power the data graphs
  - Hits the cached token data for the metadata (image, name) and the prices
- Let's experiment with storing in Skynet (sync up with Keating) -- rough sketch of the token job below
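
A sketch of what retargeting the Guildbank token-list job at Skynet could look like: price the tokens via CoinGecko's public token-price endpoint and drop the combined JSON with the SkyDB-style helper sketched in the Skynet section above. The module path, data key, and payload shape are placeholders.

```typescript
// Sketch of the token-list caching job pointed at Skynet: price every token the
// guildbanks hold via CoinGecko and drop the result as JSON. The subgraph query
// that collects the token addresses is omitted; writeCache is the SkyDB-style
// helper sketched earlier (hypothetical module path).
import { writeCache } from "./skynet-cache";

const COINGECKO_URL =
  "https://api.coingecko.com/api/v3/simple/token_price/ethereum";

export const cacheTokenPrices = async (tokenAddresses: string[]) => {
  const params = new URLSearchParams({
    contract_addresses: tokenAddresses.join(","),
    vs_currencies: "usd",
  });

  // CoinGecko responds with { [tokenAddress]: { usd: number } }
  const res = await fetch(`${COINGECKO_URL}?${params}`);
  const prices: Record<string, { usd: number }> = await res.json();

  const payload = {
    updatedAt: Date.now(),
    tokens: tokenAddresses.map((address) => ({
      address,
      usdPrice: prices[address.toLowerCase()]?.usd ?? 0,
    })),
  };

  // Same write the job currently does to s3, pointed at Skynet instead
  await writeCache("guildbank-token-prices", payload);
};
```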
### Airtable
- What would it take to replicate the DAO data?
  - First step is documenting the data stored in Airtable and where it gets written from
  - What would be the work to replace each of these?
- Identify the upgrade path
  - If a v2 DAO is using Airtable, can the DAO hit an update button to create the Poster metadata?
  - Could we create a main address that makes a giant data dump to the subgraph?
    - This would probably be ideal if we could do it
### Airtable Metadata
- `daoMeta`
  - DAO metadata (a rough TypeScript sketch of this record shape follows this section)
  - `name`
  - `network`
  - `description`
  - Summoning a DAO for the first time creates this record, and it is then updated when updating the Settings
  - Checks that you're a member, and then everything is stored by `contractAddress`
  - `purpose`
  - `slug`
    - Automatically created
  - `avatarImage`
    - When editing metadata you can set the DAO avatar -> IPFS -> stores the hash
  - `tags`
    - Comma-separated list
  - `links`
    - JSON object of social/community links
  - `version`
    - Version of the contract being used
  - `hide`
    - Helper for hiding a DAO from the explore view as a safety hatch
  - `createdAt`
  - `activations`
    - Relationship table; these are like Boosts
  - `customTheme` -> `customThemeConfig`
    - If editing the DAO theme (colors), it is stored in this field
  - `serviceUrl`
    - String for ServiceDAO urls
  - `proposalConfig`
    - JSON object; not used in v3 but still needed in v2
  - `auditLog`
    - Poster should give us this out of the box
    - Any updates, who made them, and when they were made
    - Field keeps track of all of this data (relationship to the audit log table)
  - `longDescription`
    - Used for the Yeeter
- `auditLog`
  - Relational table
  - Who updated, and at what time
  - Poster gives us some of this; we can pull the most recent
- `allies`
  - Getting deprecated -- this is everything related to UberHaus
  - We will likely be deprecating this entire section of the app (Allies), so we won't need to support this
- `boosts`
  - Table
  - `id`, `name`, and `key` are the same now
  - Makes sure that Boosts are valid
  - Gives some control over which Boosts are added, but this didn't matter much since we were the only ones adding Boosts
- `activations`
  - Table connecting the DAOs and the Boosts via Lookups
  - `boostMetadata`
    - JSON added and used in the frontend if specific Boosts need it
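
Pulling the fields above together, the `daoMeta` record roughly translates to a shape like the sketch below; the types are inferred from these notes rather than the actual Airtable base, so treat them as guesses.

```typescript
// Rough TypeScript shape of the Airtable daoMeta record, inferred from the notes
// above rather than the actual base -- field types are best guesses.
interface DaoMetaRecord {
  contractAddress: string; // records are keyed by the DAO contract address
  name: string;
  network: string;
  description: string;
  purpose: string;
  slug: string; // automatically created
  avatarImage?: string; // IPFS hash of the uploaded avatar
  tags?: string; // comma-separated list
  links?: Record<string, string>; // JSON object of social/community links
  version: string; // version of the contract being used
  hide?: boolean; // safety hatch to hide the DAO from the explore view
  createdAt: string;
  activations?: string[]; // linked records connecting the DAO to Boosts
  customThemeConfig?: Record<string, unknown>; // theme colors, if customized
  serviceUrl?: string; // ServiceDAO url
  proposalConfig?: Record<string, unknown>; // JSON config, v2 only
  auditLog?: string[]; // linked records in the audit log table
  longDescription?: string; // used for the Yeeter
  allies?: string[]; // UberHaus-related, slated for deprecation
}
```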
## Initial Schema Ideas
- Currently there is a single `MetaData` entity; the mapping looks for posts that match `metaData` and writes them here
- Could have a generic `Post` entity
  - `metaDataConfig`
  - `boostConfig`
  - The Post entity could have a type on it -> a DAO query gives:
    - the latest data of type Boost, MetaData, etc.
  - We need to think through how we want to structure this
- Poster:
  - `content` and `tag`
  - `content`
    - Directs what happens in the subgraph mapping
    - Structured to tell us whether something is metadata and what sort of check is needed to make sure that it's a valid post
  - Types of valid posts:
    - Minion and Member
  - MetaData
    - Mapping checks `msg.sender` to see if it's a member of the DAO
- Let's define these tags and the validation strategies
- What does this raw content JSON need to look like? What is the schema? (a hypothetical example is sketched below)
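
As a straw-man answer to those last two questions, the raw content JSON and the validation the mapping would run could look like the sketch below. The tags, field names, and lookup helpers are placeholders for the schema work described above, and the real mapping would be AssemblyScript in the subgraph; this is plain TypeScript pseudologic.

```typescript
// Straw-man shape for the Poster raw content JSON plus the membership/minion
// check the mapping would run. All names here are placeholders for the schema
// still to be defined; the real mapping would live in the subgraph.
type DaohausPost = {
  daoId: string; // which DAO the post is about
  table: "daoMeta" | "boostConfig"; // which "onchain table" it targets
  payload: Record<string, unknown>; // the actual metadata or boost config
};

type PostValidation = "member" | "minion" | "invalid";

export const validatePost = (
  sender: string,
  post: DaohausPost,
  isMember: (dao: string, addr: string) => boolean, // assumed lookup helpers
  isMinion: (dao: string, addr: string) => boolean
): PostValidation => {
  // A minion sender means the content went through a proposal and is DAO-ratified
  if (isMinion(post.daoId, sender)) return "minion";
  // A member sender is valid, but only member-attested rather than ratified
  if (isMember(post.daoId, sender)) return "member";
  return "invalid";
};
```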
## Allies
- To deprecate this, we could start by removing the frontend code for it
  - Opportunity for code cleanup: remove it and make sure we didn't break anything
- There is an s3 bucket that dumps all DAO metadata, since there are rate limits in Airtable
  - The API requests this every minute or so and then structures it
- *Remove Allies Data From API* -- [potential ticket](https://github.com/HausDAO/daohaus-app/issues/1814)
- *Remove Allies From Frontend* -- [potential ticket](https://github.com/HausDAO/daohaus-app/issues/1812) and [ticket](https://github.com/HausDAO/daohaus-app/issues/1813)