rrDHT Optimize Gossip with Chunks+Fingerprints
Assumptions
- Problem 1: We currently create a bloom filter with size proportional to the number of historical DhtOps and exchange it while gossiping. Ideally we would take up less space for older data that is unlikely to change.
- Problem 2: We create one of these bloom filters for each peer we gossip with, because each peer's DHT Arc has a unique overlap with ours. Ideally we would have fewer unique overlaps, so that we could re-use computation.
- If we align our DHT Arcs along predictable boundaries and split the Arcs into chunks, we can re-use gossip data for a given chunk across every connection whose overlap includes that chunk.
- If we compute hashes of historical DhtOps for various time ranges, we can gossip those hashes instead of a bloom filter, and in the frequent case where those hashes do not differ between peers, we can save resources.
- We want to optimize gossip to accomplish a few main goals:
- be easy to maintain (follow a single pattern in a single gossip loop)
- minimize overhead in terms of computation and bandwidth
- require the most reliable nodes to do the least work (meaning, a node that has been offline knows it is going to have to get updates when it comes back online, but nodes that stay online should only be computing and gossiping new data, never old data, with each other)
Overview
Chunking
- Each node represents its Arc for a given DNA as a Chunk Size and a Chunk Count
- We require Chunk Size to be rounded to a power of two number of DHT Addresses, rather than a continuous range, so that it changes infrequently. Since there are 2^32 total DHT addresses, nodes can represent their Chunk Size as simply an exponent between 0 and 32. For example, 32 = a chunk is the whole DHT, 31 = a chunk is half the DHT.
- We start each Chunk at an absolute multiple of the Chunk Size, rather than at an offset from the agent address, so that chunk boundaries match those of other agents (see the sketch below).
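A hedged sketch of this alignment arithmetic, assuming the 32-bit address space described above; `ChunkSize` and its methods are names invented for this sketch, not existing APIs:

```rust
// Sketch of power-of-two chunk alignment over a 32-bit DHT address space.
// `ChunkSize`, `len`, and `chunk_start` are hypothetical names.

/// Chunk Size expressed as an exponent: a chunk covers 2^exp addresses.
/// exp = 32 means one chunk spans the whole DHT; exp = 31, half of it.
#[derive(Clone, Copy, Debug)]
struct ChunkSize {
    exp: u8, // 0..=32
}

impl ChunkSize {
    /// Number of DHT addresses covered by one chunk.
    fn len(&self) -> u64 {
        1u64 << self.exp
    }

    /// Start of the chunk containing `addr`: an absolute multiple of the
    /// chunk size, so every agent computes the same boundaries.
    fn chunk_start(&self, addr: u32) -> u32 {
        if self.exp >= 32 {
            0 // a single chunk covering the whole DHT starts at address 0
        } else {
            addr & !((1u32 << self.exp) - 1)
        }
    }
}

fn main() {
    let size = ChunkSize { exp: 30 }; // 2^30 addresses: a quarter of the DHT
    assert_eq!(size.len(), 1 << 30);
    // Two nearby agents land in the same absolutely-aligned chunk, so the
    // gossip data computed for it can be shared across both connections:
    assert_eq!(size.chunk_start(0x4000_1234), size.chunk_start(0x4FFF_0000));
}
```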
Fingerprinting
- When gossiping about a given Chunk, we use a hash-based representation of DhtOps that has higher granularity for recent data and gradually lower time granularity for older, more stable data.
- This Gossip Fingerprint is a vector of historical hashes, each associated with a time-range, as well as a bloom/vacuum filter associated with the current time-range that is under construction.
- The `current_range` contains the hashes of DhtOps with timestamps from the last 5 minutes.
- At the end of each 5 minute period, we reset `current_range` to be empty, and we append an item to the history, with a hash determined by:
    - querying the database for all the DhtOps in that time range
    - concatenating their DhtOpHashes into a `Vec<u8>`
    - applying blake2b to the bytes
- Once there are three history items with the same TimeWindowSize, we collapse the older two of them into one history item with twice the TimeWindowSize and a GossipHash computed by XORing the two child GossipHashes (see the sketch after this list).
- Since TimeWindowSize is always a power of two times 5 minutes, we represent it as an exponent.
- For example, once the history holds three 5 minute windows (exponent 0), the older two collapse into one 10 minute window (exponent 1), which ends at least 5 minutes ago.
- This way, a time window N minutes long ends at least N - 5 minutes ago.
- We use a sparse tree representation for historical data so that time periods with no data have NULL values and don't need to be traversed more deeply to know they're empty.
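As a minimal sketch of this bookkeeping, assuming hypothetical `Window` and `GossipFingerprint` types, with the `blake2b_simd` crate standing in for whatever blake2b implementation is actually used:

```rust
// Sketch of the fingerprint rollover and collapse described above.
// All type and method names are hypothetical.
// Cargo.toml (assumed): blake2b_simd = "1"

type DhtOpHash = Vec<u8>;
type GossipHash = Vec<u8>;

#[derive(Clone, Debug)]
struct Window {
    /// TimeWindowSize as an exponent: the window spans 5 * 2^exp minutes.
    exp: u8,
    hash: GossipHash,
}

#[derive(Default)]
struct GossipFingerprint {
    /// Finalized time windows, oldest first.
    history: Vec<Window>,
    /// DhtOpHashes with timestamps in the current 5 minute range.
    current_range: Vec<DhtOpHash>,
}

impl GossipFingerprint {
    /// At the end of each 5 minute period: hash the current range into a
    /// new history item, empty the range, and collapse older windows.
    fn roll_over(&mut self) {
        // Concatenate the DhtOpHashes and apply blake2b to the bytes.
        let bytes: Vec<u8> = self.current_range.concat();
        let hash: GossipHash = blake2b_simd::blake2b(&bytes).as_bytes().to_vec();
        self.history.push(Window { exp: 0, hash });
        self.current_range.clear();
        self.collapse();
    }

    /// While three adjacent windows share a TimeWindowSize, merge the
    /// older two into one window of twice the size, XORing their hashes.
    fn collapse(&mut self) {
        loop {
            let triple = (0..self.history.len().saturating_sub(2)).find(|&i| {
                self.history[i].exp == self.history[i + 1].exp
                    && self.history[i].exp == self.history[i + 2].exp
            });
            let Some(i) = triple else { break };
            let a = self.history.remove(i);
            let b = self.history.remove(i); // the item formerly at i + 1
            let hash = a.hash.iter().zip(&b.hash).map(|(x, y)| x ^ y).collect();
            self.history.insert(i, Window { exp: a.exp + 1, hash });
        }
    }
}
```

Because the collapse rule leaves at most two windows of each size, the history stays logarithmic in the total time span it covers.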
We can optimize the structure of the fingerprint: the amount of time represented by a time window is 5 · 2^n minutes, where n is the number of times chunks have been combined.
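The sparse tree mentioned above might look like the following; again a hedged sketch with invented names, where empty periods are `None` and are never stored or traversed:

```rust
// Sketch of a sparse binary tree over time windows. Each leaf is a
// 5 minute window's GossipHash; each internal node covers twice the
// time of its children and stores the XOR of their hashes. Empty
// periods are `None`, so they need no storage and no traversal.
// All names here are hypothetical.

type GossipHash = Vec<u8>;

enum TimeTree {
    /// A 5 minute leaf window's hash.
    Leaf(GossipHash),
    /// A node covering twice its children's time span. Either child may
    /// be absent (no data in that half of the period).
    Node {
        left: Option<Box<TimeTree>>,  // older half
        right: Option<Box<TimeTree>>, // newer half
    },
}

impl TimeTree {
    /// The hash for this subtree: a leaf's hash, or the XOR of the
    /// children's hashes (an absent child contributes nothing).
    fn hash(&self) -> GossipHash {
        match self {
            TimeTree::Leaf(h) => h.clone(),
            TimeTree::Node { left, right } => {
                let l = left.as_ref().map(|t| t.hash()).unwrap_or_default();
                let r = right.as_ref().map(|t| t.hash()).unwrap_or_default();
                xor(&l, &r)
            }
        }
    }
}

/// XOR two hashes; tolerates an empty (absent) side.
fn xor(a: &[u8], b: &[u8]) -> GossipHash {
    if a.is_empty() { return b.to_vec(); }
    if b.is_empty() { return a.to_vec(); }
    a.iter().zip(b).map(|(x, y)| x ^ y).collect()
}
```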
Meeting, 2021-11-30
Can we compute the hashes of spacetime chunks lazily, so that any mismatch in the "currently used" chunks between agents can be reconciled through lazy computation?
- We should still have a way to target the same ideal chunk sizes if nodes see similar DHT state.
- It would be good if the chunks didn't change too frequently to avoid recomputing hashes.
Kinda out of scope for gossip at the moment but good to think about a bit.
There can be a threshold of time beyond which we just don't gossip chunks that fall in that region of time.
Followup, thinking about spacetime trees instead of time trees: https://hackmd.io/R6pV-Af2TFWmKQht8K5XYA
Followup to the followup, focusing on a tree in space of trees in time: https://hackmd.io/lsu-Q353S_mPz13E_sfwng