# ITIP - Index Coop Analytics Operations for FrontEnd Integrations
## Abstract
Index Coop's Engineering team is looking to Product and Analytics to help simplify the data sourcing and infrastructure for front-end integrations.
Currently, data sources are spread across several providers, and it is hard for engineering collaborators to integrate data for new products on the front end.
Consolidating data into a single API would boost the productivity of the engineering team and the distribution capability of IC products.
## Motivation
**Key problems we are looking to solve for the Index Coop App team:**
+ **Personnel** : Lack of clarity around DRIs for various data sourcing requirements.
+ **Data sourcing** : Multiple sources of data adding time for various integrations for the App team.
+ **External Dependencies** : Multiple external dependencies like Coingecko and Dune Analytics leading to broken feeds on the App.
+ **Integration timelines** : Data sourcing complexity leading to added time to market for IC products.
+ **Data Latency** : Data points sourced via Dune API have latency ranging from 5 mins to 2 hours.
+ **Upcoming Roadmap Needs** : icLAB is a self-serve UI for anyone to create Set Tokens permissionlessly. This needs more robust data infrastructure, which can index data and auto-update based on the Set Tokens icLAB users create. Minimum data requirements for the icLAB app would be:
+ Token Composition, NAV, Component Prices
+ Supply, TVL Over time
+ APY/Returns Over time
+ Revenue, Fee settings
+ Auction/Rebalancing Data Parameters
+ ..
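To make the first requirement concrete: a Set token's NAV is the sum of each component's per-token units multiplied by that component's price. A minimal TypeScript sketch (the component data below is hypothetical, not an actual IC product):

```typescript
// NAV per token = sum over components of (units held per token * component price).
// The component list below is hypothetical, for illustration only.
interface Component {
  symbol: string;
  unitsPerToken: number; // units of the component backing one Set token
  priceUsd: number;      // current component price in USD
}

function navPerToken(components: Component[]): number {
  return components.reduce((nav, c) => nav + c.unitsPerToken * c.priceUsd, 0);
}

const example: Component[] = [
  { symbol: "WETH", unitsPerToken: 0.5, priceUsd: 2000 },
  { symbol: "WBTC", unitsPerToken: 0.25, priceUsd: 40000 },
];

console.log(navPerToken(example)); // 0.5*2000 + 0.25*40000 = 11000
```

Whatever infrastructure we pick has to keep the component list and prices fresh enough for this calculation to be trustworthy in real time.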
## Background Information
### Index Coop App
The Index Coop App enables users to swap, mint, sell, and redeem Index tokens. It also shows user balances and the performance of Index tokens.
The Index Coop App relies on various data sources to display information to users, including Dune Analytics, Coingecko etc.
### Coingecko API
The Coingecko API can be pinged [for pricing and other information](https://www.coingecko.com/api/documentation) around Index Coop tokens. However, every new token's price data first needs to be supported by Coingecko, which requires regular trades and arbitrage activity to function properly. Coingecko fetches this price from CEX and DEX trading activity.
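For reference, a lookup against Coingecko's public `/simple/price` endpoint looks roughly like the sketch below; the token id is an assumption (Coingecko assigns its own ids, e.g. `index-cooperative` for INDEX):

```typescript
// Sketch of building a Coingecko /simple/price request URL.
// The token id used below is an assumption for illustration.
const COINGECKO_BASE = "https://api.coingecko.com/api/v3";

function simplePriceUrl(ids: string[], vsCurrency = "usd"): string {
  const params = new URLSearchParams({
    ids: ids.join(","),
    vs_currencies: vsCurrency,
  });
  return `${COINGECKO_BASE}/simple/price?${params.toString()}`;
}

// Usage (network call omitted from this sketch):
//   const res = await fetch(simplePriceUrl(["index-cooperative"]));
//   const data = await res.json(); // shape: { "<id>": { usd: <price> } }
console.log(simplePriceUrl(["index-cooperative"]));
```

The point of the sketch is that the feed is only as good as the upstream CEX/DEX activity Coingecko aggregates, which is exactly the weakness described above.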
More recently, Index Coop has moved away from providing liquidity for its own Index tokens, and with no other market participants willing to provide liquidity for new Index tokens, the Coingecko feeds built on top of DEX price data have become more unreliable.
### Dune API
The Analytics team has been able to put together NAV-feed Dune APIs for Coingecko as well as for use on the IC App.
The IC App also relies on the Dune API for APY and other performance data (7D/30D/90D APR/APY) to show product performance on the IC App.
Dune dashboards are generally the only extensively used product-tracking tool within Index Coop, so the queries that the Dune APIs are built on top of will continue to be maintained for a product's entire lifetime.
The development cost for the Analytics team to maintain these API-specific queries is currently marginal.
However, the Dune API costs credits for every execution of a query as well as every result-data request. To help reduce costs, the App team has had to develop a caching mechanism to store and manage execution results.
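The caching idea can be sketched as follows; the names and TTL are illustrative, not the App's actual implementation:

```typescript
// Sketch of a credit-saving cache around Dune query execution: repeat
// requests within the TTL are served from memory instead of re-executing.
type Executor = (queryId: number) => Promise<unknown>;

interface CacheEntry {
  result: unknown;
  fetchedAt: number;
}

function makeCachedExecutor(execute: Executor, ttlMs: number, now = () => Date.now()) {
  const cache = new Map<number, CacheEntry>();
  return async (queryId: number): Promise<unknown> => {
    const hit = cache.get(queryId);
    if (hit && now() - hit.fetchedAt < ttlMs) return hit.result; // cached, no credits spent
    const result = await execute(queryId); // this call costs Dune credits
    cache.set(queryId, { result, fetchedAt: now() });
    return result;
  };
}
```

In practice the TTL would be tuned to each dataset's latency (see the table below), since caching longer than the data is fresh adds no staleness, while caching shorter just burns credits.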
Index Coop relies on various raw tx data, decoded project data, and data abstraction to write the final queries and visual data. These datasets come with varying latencies.
The table below shows the latencies of various data sources on Dune (at the time of writing).
| Example Data Set | Type | Latency |
| -------- | -------- | -------- |
| ethereum.txs | Raw Tx Data | ~5 mins |
| Set Protocol | Decoded Project Data | ~15 mins |
| Dex Trades | Abstractions | ~40-50 mins |
| Prices | User submitted/via APIs | ~10 mins |
Note that price data on Dune is known to be incomplete/have missing data.
### Subgraphs
[Subgraphs](https://thegraph.com) index data from the EVM blockchains and organise them into data trees that can be queried and used on DAPP frontends. Most well known Defi protocols have publicly available subgraphs that anyone can use to query data using GraphQL, and integrate with frontends. Subgraphs are also useful for plugging into data sources like DefiLlama.
Subgraphs need to be deployed by Data Engineers with prior knowledge of the protocol, and can take some time to get up and running correctly.
Index Coop team has never deployed or maintained a Subgraph.
**Some background** - Some attempts were made when IC had a flexible contributor system, but we saw that those contributors needed a lot of active input from the other Analytics contributors as well as the engineering team, or at least one person who understands Index Coop smart contracts, products, and analytics well. JD Cook wanted to build a subgraph with other analytics contributors, but when he departed, the project wasn't taken up or handed over to the full-time Analytics contributors, as it didn't seem to be a high priority at the time.
**Costs** - The Graph lays out the cost of querying [here](https://thegraph.com/docs/en/network/benefits/). We can assume we would be a low-volume user, which would cost ~$15 for 30,000 queries per month, and would scale as we grow.
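To show what the front-end integration would look like, below is a sketch of querying a (hypothetical) Index Coop subgraph with GraphQL; the entity and field names are assumptions, since the actual schema would only be defined when the subgraph is built:

```typescript
// Sketch of a GraphQL query against a hypothetical Index Coop subgraph.
// Entity and field names are assumed for illustration.
const TOKEN_QUERY = `
  {
    tokens(first: 10) {
      id
      symbol
      decimals
      totalSupply
      components {
        symbol
        units
      }
    }
  }
`;

// Usage (network call omitted): POST the query to the subgraph endpoint.
//   await fetch(SUBGRAPH_URL, {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify({ query: TOKEN_QUERY }),
//   });
console.log(TOKEN_QUERY.length > 0);
```

Because the subgraph serves already-indexed data, the front end gets one endpoint and one query language instead of juggling Dune credits, Coingecko ids, and caching layers.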
### IC Analytics SDK
The Analytics SDK is an attempt to consolidate all of the data used by the front end. (*Add issues with the Analytics SDK here*)
## Open Questions
Pose any open questions you may still have about potential solutions here. We want to be sure that they have been resolved before moving ahead with talk about the implementation. This section should be living and breathing throughout this process.
+ Should the Analytics team take over the caching setup work to make it easier for the Front End team to execute?
+ Is the AnalyticsSDK going to be maintained going forward?
## Feasibility Analysis
There is no single solution to the Data issues and requirements, but we can use a combination of a few solutions below.
### Dune API + removing the Coingecko dependency
Implement reliable NAV feeds for all IC products on Dune API and use those for prices on front end instead of Coingecko Prices.
+ Pros
+ More consistent, reliable data
+ Cons
+ Varying Latency for various datasets
+ Caching and querying setup to be done by the Front end team
+ Needs Analytics Intervention for new Products
+ Can support icLAB but with unknown latency
#### Cost
This can be thought of as the lowest-lift solution, as it is close to what we already have, but it doesn't solve for icLAB and other dynamic data needs.
#### Risks
* Technical Debt
### Subgraph setup
Build and deploy a self-maintaining subgraph for Index Protocol
+ Pros
    + Simpler to integrate on the front end (no additional caching needed)
    + Makes distribution easier to other data consumers like Coingecko, DefiLlama, etc.
    + Can support icLAB out of the box (think of new Uniswap pools and V3 positions, all supported by Uniswap frontends)
+ Cons
    + Time to build would be high (1-2 months)
    + Ongoing maintenance/upkeep as Index Protocol changes
    + May still need to use a reliable API for token prices
#### Costs
* 1-1.5 Analytics/Engineering team contributors for writing the mappings and deployment
* $15-50 GRT Per Month
#### Risks
* No major risks, most Defi Protocols maintain a subgraph
### IC Analytics Backend + Data Infrastructure
Build and deploy our backend and database to store and serve data.
+ Pros
+ Higher flexibility and control
+ Cons
    + Time to build and perfect would be very high (3-4 months)
#### Costs to Deploy $$$$$$
##### Personnel
* Backend/Data Engineer
* Devops
##### Infrastructure
* AWS
* ...
* ...
#### Risks
* Downtime
* Unknown Technical Risks
## Recommended Solution
Our recommendation is to build a Subgraph that indexes all Index-token-related data in real time, supports upcoming projects like icLAB, and supplies the Index Coop App with the data needed for visualisations of yield, TVL, supply, etc.
A Subgraph's data store updates in reaction to certain handlers, which we need to define in the subgraph manifest.
Read more about available handlers on the subgraph documentation [here](https://thegraph.com/docs/en/developing/creating-a-subgraph/).
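To make the handler model concrete, the sketch below mimics in plain TypeScript (outside the actual graph-node runtime, where mappings are written in AssemblyScript) how a Transfer event handler would keep a Token entity's supply current; the entity and field names are illustrative:

```typescript
// In-memory analogue of a subgraph event handler: it reacts to a chain
// event and updates an entity in the store. Real mappings run in the
// graph-node runtime; names here are illustrative only.
interface TokenEntity {
  address: string;
  totalSupply: bigint;
}

const ZERO_ADDRESS = "0x0000000000000000000000000000000000000000";
const store = new Map<string, TokenEntity>();

// An ERC-20 Transfer from the zero address is a mint; to the zero address, a burn.
function handleTransfer(token: string, from: string, to: string, value: bigint): void {
  const entity = store.get(token) ?? { address: token, totalSupply: 0n };
  if (from === ZERO_ADDRESS) entity.totalSupply += value; // mint
  if (to === ZERO_ADDRESS) entity.totalSupply -= value;   // burn
  store.set(token, entity);
}
```

Event handlers like this cover the low-frequency fields below (address, supply, fees), while block handlers would recompute time-dependent values such as market price and APY.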
Some details on possible entities (i.e. data objects) and their properties, with the handlers to be used -
| Property | Parent Entity | Update handler |
|--- |--- |--- |
| Address | Token | Events |
| Name | Token | Events |
| Symbol | Token | Events |
| Decimals | Token | Events |
| Current Supply| Token | Call/Event |
| Market Cap | Token | Block Handler |
| Market Price| Token | Block Handler |
| Price Change 24h| Performance| Block Handler |
| Trade Volume | Token | Events |
| APY/APR | Performance | Block Handler |
| Streaming Fees | Token | Events |
| Components | Components | Events |
| Flashmint Quotes| -- | -- |
| KPI Dashboard| Performance | Block |
| Performance Dashboard | Performance | Block |
Mapping new subgraph entities to the currently used data sources in the table below -
| DataPoint/Metric | Current Source | Update Frequency | Subgraph entity |
|--- |--- |--- |---|
| Address | Token Contract | Low | Token |
| Name | Token Contract | Low | Token |
| Symbol | Token Contract | Low | Token |
| Decimals | Token Contract | Low | Token |
| Current Supply | Token Contract | High | Token |
| Market Cap | Coingecko | High | Token |
| Market Price | Coingecko OR Dune | High | Token |
| Price Change 24h | Coingecko | Med | Token |
| Trade Volume | Coingecko | High | Token |
| APY/APR | Dune API or manual via Coingecko | High | Performance |
| Streaming Fees | Token Contract | Low | Token |
| Components | API + Coingecko | Low | Components |
| Flashmint Quotes | FlashmintSDK + 0x | High | -- |
| KPI Dashboard | Token Contract | High | Performance |
| Performance Dashboard | Dune API + Caching | High | Performance |
Please note that front-end code for data integrations would need to be refactored, though it would be much simpler to integrate new products once the project is completed.
## Timeline
| Action | End Date | Description |
|--- |--- |--- |
| Checkpoint 1 - Recommended Solution | 15/12 | Finalize and agree on a recommended solution in collaboration with Engineering, Product, Ops, and Growth teams. |
| Checkpoint 2 - Build Requirements | TBD | Establish and agree on detailed product requirements with the Product and Engineering teams. |
| Checkpoint 3 - Implementation | TBD | Complete the development and internal testing of the data infrastructure. |
| External Review | TBD | Review the new data feeds against existing sources with a selected partner. |
| Deployment | TBD | Deploy the finalized solution and migrate the App's data integrations. |
## References
[Initial working document by Chavis](https://docs.google.com/document/d/1ZADkUPDua9RZal53GSUbviTo8LcxTxLZwYa-kU36Cx0/edit)