
The Road Not Taken

Now that the Ethereum Protocol Fellowship has kicked off, it's time to choose which project to focus on building (see A Fork in the Road for context).

Long story short: I've decided to continue working on Model Danksharding.

Danksharding is the medium- to long-term scaling solution that Ethereum will implement alongside rollups, where transaction computation is moved off-chain while settlement is broadcast on-chain.
For more information, check out Model Danksharding's readme.

Creating a model of Danksharding is a big task, but I believe that taking a "breadth-first" view of the project will help build a strong understanding of the underlying system. The idea is to oscillate between building out the major functionality of data availability sampling and peer-to-peer networking, stubbing out the complex parts until later rounds.
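
To make that concrete, here's a rough sketch (in Python, with entirely made-up names; none of this comes from the actual Model Danksharding code) of what a first breadth-first pass could look like: real interfaces for the DAS and P2P layers, with internals stubbed until a later round.

```python
# Hypothetical sketch of the "stub now, deepen later" approach.
# All class and method names are placeholders, not from the real repo.

from dataclasses import dataclass


@dataclass
class Sample:
    """A single (row, column) cell of the extended blob matrix."""
    row: int
    col: int
    data: bytes


class DataAvailabilitySampler:
    """First pass: real interface, stubbed internals."""

    def request_samples(self, n: int) -> list[Sample]:
        # Round 1 stub: pretend every sample is available.
        # Later rounds swap this for real erasure-coded availability checks.
        return [Sample(row=0, col=i, data=b"\x00" * 32) for i in range(n)]


class P2PNetwork:
    """First pass: in-memory stand-in for the gossip / req-resp layer."""

    def broadcast(self, topic: str, payload: bytes) -> None:
        # Round 1 stub: no networking yet, just record the call.
        print(f"[stub] would gossip {len(payload)} bytes on {topic}")
```

The value of the stubs is that the pipeline can be wired together and exercised end to end before any of the hard parts (erasure coding, commitments, real gossip) exist.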

The trick is to identify benchmarks for the layers of complexity.

There are other implementations in the works, but I:

  1. Want to implement the functionality and design myself to get a deep understanding of the architecture and technologies behind Danksharding.
  2. Believe that this project could be a good educational resource for those wanting to learn about Danksharding.

By the time implementations of Danksharding are at the forefront of client teams' minds, I plan to have a fundamental understanding of the system and be of real value to Ethereum.

This Week's Summary

This past week was spent researching Danksharding: specifically, the mathematics behind KZG polynomial commitments and Danksharding's overall architectural design. I also began writing an overview (within my project's readme) of the major aspects of Danksharding and how they all fit together.
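
As a reference for myself, a minimal sketch of the standard (textbook) KZG construction: for a polynomial $p(X) = \sum_i c_i X^i$, a trusted-setup secret $\tau$, and notation $[x]_1 = x \cdot G_1$, $[x]_2 = x \cdot G_2$, the commitment and opening proof at a point $z$ are

$$
C = [p(\tau)]_1 = \sum_i c_i \, [\tau^i]_1,
\qquad
q(X) = \frac{p(X) - p(z)}{X - z},
\qquad
\pi = [q(\tau)]_1
$$

and the verifier accepts if the pairing check

$$
e\big(\pi,\ [\tau]_2 - [z]_2\big) \;=\; e\big(C - [p(z)]_1,\ [1]_2\big)
$$

holds.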

I'm still working on the readme doc!

I also listened to the two EIP-4844 Implementers' calls (:

Questions:

  1. What are the best benchmarks for the project's increasing depth and complexity?
  2. What is the best way to "black box" certain parts of a topic, then come back and implement them later?
  3. How should I go about instantiating my own versions of Lagrange interpolation and KZG commitments? (See the sketch after this list.)
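
For question 3, a toy starting point might look something like this: Lagrange interpolation over a small prime field in Python. The modulus and values are illustrative only; a real implementation would work over the BLS12-381 scalar field.

```python
# Toy sketch (illustrative only): Lagrange interpolation over a small prime field.

P = 337  # small prime modulus for demonstration


def lagrange_eval(points: list[tuple[int, int]], x: int, p: int = P) -> int:
    """Evaluate the unique polynomial of degree < len(points) through `points` at `x`, mod p."""
    result = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i == j:
                continue
            num = num * (x - xj) % p
            den = den * (xi - xj) % p
        # Modular inverse via Fermat's little theorem (valid because p is prime).
        result = (result + yi * num * pow(den, p - 2, p)) % p
    return result


# Evaluations of p(X) = X^2 + 1 at X = 0, 1, 2:
pts = [(0, 1), (1, 2), (2, 5)]
assert lagrange_eval(pts, 3) == 10  # p(3) = 3^2 + 1 = 10
```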

Next Steps

  1. Continue researching:
    • Data Availability Sampling (Erasure Coding and Polynomial Commitments)
    • Peer-to-Peer Networking
  2. Update the KZG and P2P sections of the project's readme!
  3. Brainstorm benchmarks of increasing complexity. It might be time to reach out to a mentor (TBD).

Notes

Rolling documents of thoughts and resources can be found within my Notion docs.

To see how these concepts fit together, see Model Danksharding's readme!