---
tags: grants
---
[Our Applications](https://hackmd.io/gDDWHpPeQaGEzv2T-dW_Kg?edit)
# Apply Sovereign Tech Fund / May 2023
* info https://hackmd.io/Q6v2NwEfQbCilal5EGRlFg
* dat vision https://hackmd.io/KX9vtkrGQwuJL8VhbO-wow
---
## Brainstorming
- dat-ecosystem
- thinking of launching an interoperability initiative: dev tooling that helps all developers in the ecosystem comply with the interoperability standards
- is well positioned to create such a standard and to involve all the projects, ensuring the standard that emerges works for everyone
- has to create a flow to get this standard adopted (funding + standard docs)
- we need UX guidance + a plan to use the Simply Secure icons to make sure the UX is intuitive and gets adopted
- datashell will start as an ecosystem project but will later become an independent project maintained by its own team
- datashell:
- tooling for creating interoperability
- wallet, data vault, sandbox, kernel, ...
- security: ocapn, self-authenticated data structures
- links:
- Karissa blog post:
- interoperability talk & conference https://www.youtube.com/watch?v=hzIU5X7g7PI&list=PL7sG5SCUNyeYx8wnfMOUpsh7rM_g0w_cu&index=4
---
## Application name
Dat ecosystem
* [ ] I acknowledge: My application will be dismissed if it does not fit within STF’s scope of work or if it is incomplete.
* [ ] I acknowledge: Funding is denied if other grants are applied for from other public bodies for the same purpose (exclusion of duplicate funding).
* [ ] I acknowledge: I am legally able to sign contracts for this project or represent an organization that can.
* [ ] I acknowledge: All code and documentation to be supported must be licensed such that it may be freely reusable, changeable and redistributable
## Project title
May be identical to the application name
Dat ecosystem interoperability initiative
## Describe your project in a sentence.
Dat ecosystem is a constellation of open source projects working on a p2p, local-first stack to achieve data sovereignty.
@TODO: move down
The Dat ecosystem initiative works towards an open standard for GDPR-compliant data vaults: it lets developers make their apps interoperable while empowering users with full control over their data, seamless data portability, and the ability to self-authenticate across all ecosystem applications.
## Describe your project in more depth. Why is it critical?
300 words
!!
@TODO: write about dat ecosystem, what is its mission, vision, current work => use this on webpage & manifesto
1. open interoperability initiative working group was created by dat-ecosystem
* goal is security & interoperability between projects in the ecosystem
* problem: every project is an island built on the same stack
* user data is stored in a custom way defined by each project
* users own their data but there is no data portability or standard way of importing/exporting data
* dat in the past: one project building the dat protocol + dat toolsets
* => today: the protocols and toolsets are split among many independent projects that together form the dat ecosystem
* solution: an open standard for GDPR-compliant data vaults
* vault standard:
* dat-ecosystem projects can use data-vault standard in their apps
* the vault standard will define some basic APIs, and possibly protocols, for apps to interact with the vault and access user data (see the API sketch after this list)
* vault:
* allows users to see all data and set permissions about who can do what with their data
* stores all data used by apps
* gives users full control over their data
* enables easy backup and data portability: per-app data permissions allow new apps to access existing data and revoke access from previously used apps
* allows users to self-authenticate across the different apps a user chooses to use
* developers can build data-vault apps, and users choose which data-vault app they want to use
* implementation: this work needs to be coordinated with the projects
* invite project members to join the discussion to make the standard compatible with their needs
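To make the vault-standard bullet concrete, here is a rough TypeScript sketch of the kind of API surface such a standard could define. Every name and signature below is an illustrative assumption, not a decided design; the real API has to come out of the working-group process described above.

```ts
// Hypothetical data-vault API surface (all names are assumptions for this
// sketch). The central idea: apps never touch storage directly -- every
// read/write goes through a user-visible, revocable grant.

type Scope = string                          // e.g. 'contacts', 'chat/history'
type Permission = 'read' | 'write' | 'share'

interface Grant {
  id: string
  appId: string
  scope: Scope
  perms: Permission[]
}

interface DataVault {
  // the app asks the vault for access; the vault shows the user a consent
  // prompt and, if accepted, returns a revocable grant
  requestAccess (appId: string, scope: Scope, perms: Permission[]): Promise<Grant>

  // all app data lives in the vault, so it stays portable across apps
  read (grant: Grant, key: string): Promise<Uint8Array | null>
  write (grant: Grant, key: string, value: Uint8Array): Promise<void>

  // users inspect and revoke grants from whichever vault app they prefer
  listGrants (): Promise<Grant[]>
  revoke (grantId: string): Promise<void>

  // backup & portability: export a scope in a standard container format
  exportScope (scope: Scope): Promise<Blob>
}
```

A new app can then reuse existing data simply by requesting a grant for an existing scope, and a user leaving an app revokes its grant — which is exactly the portability and GDPR-consent story the standard is after.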
---
- goal: secure interoperability between the projects in the dat ecosystem (supply-chain security)
- context:
- dat in the past:
- one project working on dat protocol and dat toolsets
- dat today:
- many projects building on top of dat stack (interdependent modules/supply chain)
- problems:
- user accounts
- get auto-generated inside the app
- not portable across apps (same person, new account per app)
- each app has its own 'login' (a keypair per app; see the key-derivation sketch after this list)
- UX is not yet figured out (needs to be user-friendly & secure)
- data
- is not portable between apps of the same kind
- UX is not yet figured out (needs to be user-friendly & secure)
- cross app and cross modules security attacks
- downloaded apps are not sandboxed from each other (app sandboxing)
- keys or other sensitive info can be stolen/hijacked; an app can interfere with other apps
- app stores act as auditors to protect users, but users download apps from many different sites
- users are also often hesitant to download new apps (a problem for p2p adoption)
- modules source code verification and auditing (module sandboxing)
- code dependencies in the app are not audited
- malicious modules can interfere with other modules and/or the app
- users can't verify whether an app is malicious or be sure it doesn't steal their data
- solution:
- standard for data vault
- standard for sandboxing
- sandboxing apps
- standard for source code verification and auditing (see the signing sketch after this list)
- standard for data vault UX
- steps:
- ecosystem projects' needs and current solution analysis
- standards draft
- feedback and review from the projects
- implementation
- testing
- prototype of the **dat environment** that implements the sandboxing & source code verification and auditing standards
- supporting projects with the adoption of the data vault & data vault UX standards (+ the sandboxing & source code verification and auditing standards)
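To make the account/keypair problem above concrete, here is a minimal sketch (assuming libsodium, which is already listed as a dependency below) of how a vault could derive one signing keypair per app from a single root identity key. This is one possible approach, not a decided design; names like `deriveAppKeypair` are invented for illustration.

```ts
import sodium from 'libsodium-wrappers'

// Derive a deterministic per-app Ed25519 keypair from the vault's root key:
// the same rootKey + appId always yields the same keypair, and each app only
// ever sees its own keypair -- one user identity, isolated app 'logins'.
async function deriveAppKeypair (rootKey: Uint8Array, appId: string) {
  await sodium.ready
  // keyed BLAKE2b over the app id serves as the per-app KDF (32-byte seed)
  const seed = sodium.crypto_generichash(32, sodium.from_string(appId), rootKey)
  return sodium.crypto_sign_seed_keypair(seed)
}

async function demo () {
  await sodium.ready
  const rootKey = sodium.crypto_generichash_keygen() // held by the vault only
  const chat  = await deriveAppKeypair(rootKey, 'example-chat-app')
  const drive = await deriveAppKeypair(rootKey, 'example-drive-app')

  // the chat app signs with its own key; anyone can verify against its pubkey
  const msg = sodium.from_string('hello')
  const sig = sodium.crypto_sign_detached(msg, chat.privateKey)
  console.log(sodium.crypto_sign_verify_detached(sig, msg, chat.publicKey))     // true
  // distinct apps get distinct keys, all rooted in one user identity
  console.log(sodium.to_hex(chat.publicKey) !== sodium.to_hex(drive.publicKey)) // true
}

demo()
```

Revocation and rotation need extra machinery (e.g. a per-app counter mixed into the derivation), deliberately left out here; the point is only that portable identity and per-app key isolation are not mutually exclusive.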
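Similarly, for the source code verification and auditing standard, the core primitive could be as simple as "sign the hash of the module source, verify before loading". A hedged sketch, again with libsodium and invented function names:

```ts
import sodium from 'libsodium-wrappers'

// An auditor (or the module author) signs the hash of a module's source;
// the dat environment refuses to load code whose hash/signature don't match.
async function auditSign (source: Uint8Array, auditorSecretKey: Uint8Array) {
  await sodium.ready
  const hash = sodium.crypto_generichash(32, source)
  return sodium.crypto_sign_detached(hash, auditorSecretKey)
}

async function verifyBeforeLoad (source: Uint8Array, sig: Uint8Array, auditorPublicKey: Uint8Array) {
  await sodium.ready
  const hash = sodium.crypto_generichash(32, source)
  return sodium.crypto_sign_verify_detached(sig, hash, auditorPublicKey)
}

async function demo () {
  await sodium.ready
  const auditor = sodium.crypto_sign_keypair()
  const module = sodium.from_string('module.exports = () => 42')
  const sig = await auditSign(module, auditor.privateKey)
  console.log(await verifyBeforeLoad(module, sig, auditor.publicKey))   // true
  const tampered = sodium.from_string('module.exports = () => evil()')
  console.log(await verifyBeforeLoad(tampered, sig, auditor.publicKey)) // false
}

demo()
```

Everything around this primitive (who the auditors are, where signatures are published, how the environment pins auditor keys) is exactly what the standard would have to specify.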
## Link to project repository
## Link to project website
## Please provide a brief overview of your project’s dependencies, including your own dependencies and projects that rely on your technology.
300 words
!!
### Project's dependencies
- main dependencies: Web APIs (service and shared workers, proxies, message channels, IndexedDB), libsodium, SSC or a terminal daemon (see the worker sketch below)
- peer dependencies (extend the capabilities of the dat environment): hypercore, earthstar, dotsama
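As an illustration of how the Web APIs listed above fit together (the file name `vault-worker.js` and the message shape are assumptions for this sketch, not an existing implementation): the vault could run in a SharedWorker so one instance serves every open app, apps reach it only over MessagePorts, and persistent data lives in IndexedDB inside the worker.

```ts
// --- app side: runs in each (sandboxed) app page ---
const vault = new SharedWorker('vault-worker.js')  // hypothetical worker file
vault.port.onmessage = (e) => console.log('vault replied:', e.data)
vault.port.postMessage({ type: 'read', scope: 'contacts', key: 'alice' })

// --- vault side: contents of vault-worker.js (a SharedWorker) ---
// `onconnect` fires once per connecting app; each app gets its own
// MessagePort, which is what lets the vault enforce per-app grants.
// (Persistent data would live in IndexedDB inside this worker.)
//
// onconnect = (event) => {
//   const [port] = event.ports
//   port.onmessage = (e) => {
//     const { type, scope, key } = e.data
//     // check this app's grant for `scope` before touching any data ...
//     port.postMessage({ type, scope, key, value: null, ok: true })
//   }
// }
```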
### Projects relying on this technology
- dat ecosystem apps and tools + all new projects built on dat protocols
## Which target groups does your project address (who are its users?) and how do they benefit from the funding (directly and indirectly)?
300 words
!!
- dat ecosystem projects
- indirectly:
- getting standards
- directly:
- getting funds and support for implementing new standards
- becoming securely interoperable with other apps
- giving users full & secure control and portability of their data
## How was the work on the project made possible so far (structurally, financially, including volunteer work)? If applicable, list others sources of funding that you applied for and/or received.
300 words
- 10 years of dat:
- Sloan Foundation, Knight Foundation, individual donations, volunteers
- during the crypto boom, funding for dat largely dried up
- some projects went into crypto, closed-sourced parts of their apps, and took funding from crypto
- during the last 4 years we only had 20k in funding from CS&S, so all work was done by volunteers for no or minimal compensation
- we want to reverse that
- related feeds (2020) discussion + prototype of the dat environment (datashell)
## What do you plan to implement with the support from STF?
**900 words**
!!
Please describe your objectives and the corresponding activities. State clearly how they contribute to the improvement or maintenance of the technology, especially with regard to security and resilience.
- standards
- dat environment
## How many days do you estimate for these activities?
1 day is 8 hours of work per person. You can only enter a number in this field.
## What is the amount of funding you are requesting, approximately?
* [ ] 150.000-250.000€
* [ ] 250.000-500.000€
* [ ] 500.000-800.000€
* [ ] >800.000€
## If you can estimate a more concrete funding amount, please enter it here (numbers only) (optional)
## In what timeframe will you perform the activities?
Please make an estimate between 3-18 months (enter number only)
## Who (maintainer, contributor, organization) would be most qualified to implement this work/receive the support and why?
300 words
## Your name/handle
## Link to your profile (optional)
(e.g. your GitHub profile)
## What is your role in this project?
- [ ] Maintainer
- [ ] Contributor
- [ ] Fundraiser
- [ ] Fiscal host
- [ ] Other
## Country and state of residence
This information is not relevant for selection but for administrative and evaluation purposes
---
Online conference discussing topics related to feeds, interoperability, and gaps between different environments, with an introduction to hypercores and possible discussion points.
Detailed Summary for [The Dat Universe - Interoperability discussion](https://www.youtube.com/watch?v=hzIU5X7g7PI) by [Merlin](https://merlin.foyer.work/)
[00:00](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=0) Discussion on feeds, interoperability, and gaps in different environments
- Interoperability on network level and between apps
- Different approaches to handling multiple feeds in projects
[06:17](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=377) Different projects use different approaches to deal with hypercores.
- Multi-feed model starts with a shared key and uses protocol extensions to exchange feed keys.
- Cabal uses multi-feed and kappa core to index feeds and create secondary indexes.
[11:40](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=700) Discussing the need for standardization of related feeds for different projects.
- Identifying use cases for standardization, such as hosting solutions and friend-to-friend mechanisms.
- Exploring the possibility of changing to a new standard and the need for moderation features.
[17:30](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=1050) Combining existing mechanisms to create a standard for software to follow
- Existing mechanisms like hyperdrive and cabal can be combined to create a standard
- Multi-feed core store compatibility can be integrated into multi-feed to make it easier to combine with hyperdrives
[23:00](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=1380) Core store is a less opinionated solution for wire protocol standardization.
- Multi-feed and Cabal ecosystem had incompatible wire protocols.
- Core store provides a good place to standardize wire protocol and data exchange.
[28:55](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=1735) Exploring using URLs for writing data on the Hypercore protocol
- Identifying the type of data structure being dealt with (Hyperdrive, Cabal chat, etc.) through header messages and extension messages
- Challenges in dealing with groups of Hypercores and the need for a standardized approach for pinning services
[35:00](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=2100) A generic manifest feed can be used for pinning services and SSPs.
- Manifest feed lists all related feeds and their relationships (see the sketch below).
- Data interpretation depends on individual app or data structure.
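For illustration only, here is one way such a manifest entry could be shaped (field names are invented for this sketch; the talk did not settle on a format):

```ts
// A manifest feed entry announces one related feed plus its relationship to
// the parent, so a pinning service can replicate everything by walking the
// manifest without understanding any app-specific data.
interface ManifestEntry {
  key: string        // hex-encoded public key of the related feed
  type: string       // data-structure hint, e.g. 'hyperdrive' or 'cabal-chat'
  relation: string   // role relative to the parent, e.g. 'content' or 'index'
}

// the pinning service replicates each `key`; interpreting the feed's
// contents remains the job of the individual app or data structure
const manifest: ManifestEntry[] = [
  { key: '0'.repeat(64), type: 'hyperdrive', relation: 'content' },
  { key: '1'.repeat(64), type: 'cabal-chat', relation: 'index' },
]
console.log(manifest.map(e => e.type))
```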
[40:35](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=2435) Discussing the need for a generic standard for pulling all feeds
- Interoperability can benefit some apps and shared foundations can benefit all apps
- Discussions about interoperability should involve project buy-in and reaching out to maintainers of major projects
[45:57](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=2757) Interoperability between projects is ongoing but takes time
- Existing projects need to agree on standards to avoid creating incompatible structures
- New projects should use existing standards to promote interoperability
[51:38](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=3098) Exploring the use of hyperspace daemon for smaller projects
- Using hyperspace daemon can reduce responsibility of replication and networking logic
- Running multi-feed models on top of hyperspace daemon is possible but packaging and stability issues need to be addressed
[56:41](https://www.youtube.com/watch?v=hzIU5X7g7PI&t=3401) No automatic fetching of related feeds implemented in daemon
- Standards around related feeds could be implemented in the daemon
- Further discussion can happen in comcoms or consortium