<!-- Title Slide -->
# Milestone 3 Presentation

![Zero-Knowledge Proofs](https://i.imgur.com/v9lyGdV.jpg)

---

<!-- Introduction Slide -->
## Pushing Boundaries 🚀

We've come a long way in our quest to enhance zero-knowledge proofs for Weightless Neural Networks (WNNs). Let's dive into the exciting technical achievements.

---

<!-- Rust Prover Section -->
## Rust Prover: Enhancing Efficiency

Our journey began by optimizing the Rust prover, a key component of our cryptographic protocol.

Our target: **Reduce the number of constraints and increase overall efficiency.**

---

### Small Adjustments We Tried

We tried:

- Changing the hash function
- Compressing the storage
- Removing redundant layers

---

### Exploring Folding Schemes

- Investigated folding schemes such as [Sangria](https://geometry.xyz/notebook/sangria-a-folding-scheme-for-plonk) and [Origami](https://hackmd.io/@aardvark/rkHqa3NZ2).
- They have the potential to reduce constraints, but would need to be integrated into the Halo2 library.

---

### Innovative Lookup Compression Techniques

- Introduced a [custom compression scheme](https://github.com/zkp-gravity/optimisation-research/tree/main/lookup_compression).
- Achieved a 14-fold theoretical compression of the lookup tables.
- Our sights are set on a 30-fold improvement.

---

<!-- WNN Section -->
## WNN: Elevating Performance

We extended our research to improve data preprocessing and feature selection for Weightless Neural Networks (WNNs).

---

### Unleashing Data Augmentation

- Leveraged data augmentation to combat overfitting and improve generalization (a minimal sketch appears in the appendix).
- Caution is required for smaller models.
- Larger models benefit from the increased pattern variety.

---

### Model Reduction through Feature Selection

- Developed a feature selection algorithm.
- Reduced model size by up to 50% with only a modest drop in accuracy.

---

### Feature Selection for Precision

- Introduced a greedy selection algorithm (sketched in the appendix).
- Effective for larger, more complex datasets.

---

<!-- Future Directions Section -->
## Charting the Future

Our journey continues as we chart future directions for zero-knowledge proofs for WNNs.

---

### Improved Lookup Compression

- Focus on enhanced lookup techniques such as Lasso.
- Seamless integration with existing cryptographic libraries.
- Exploring novel compression techniques.

---

### Scaling Feature Selection

- Apply the feature selection algorithms to larger, more complex datasets.
- Evaluate performance and scalability beyond MNIST.

---

<!-- Conclusion Slide -->
## In Conclusion

Our journey has been filled with challenges and innovations, all focused on advancing zero-knowledge proofs for Weightless Neural Networks.

---

<!-- Explore Our Research Slide -->
## Explore Our Research

- [Research Repository](https://github.com/zkp-gravity/optimisation-research/tree/main)
- [Detailed Research Writeup](https://github.com/zkp-gravity/optimisation-research/blob/main/writeup.pdf)
- [Implementation of Lookup Compression](https://github.com/zkp-gravity/optimisation-research/tree/main/lookup_compression)

For a deeper dive into our technical achievements, explore our research repository, read our detailed writeup, and examine our lookup compression implementation.

---

<!-- Revisit Our Journey Slide -->
## Revisit Our Journey

To see where it all began, check out our [Initial Blog Post from the Hackathon](https://hackmd.io/@benjaminwilson/zero-gravity).

Thank you for joining us on this journey of exploration and innovation in privacy-preserving technologies.
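
---

<!-- Appendix Slide -->
### Appendix: Data Augmentation Sketch

A minimal sketch of the kind of data augmentation described earlier, assuming MNIST-style 28x28 grayscale images stored in a NumPy array. The shift range, the number of copies, and the use of `np.roll` are illustrative assumptions, not the project's exact code.

```python
import numpy as np

def augment_with_shifts(images, labels, max_shift=2, copies=1, seed=0):
    """Return the dataset extended with randomly shifted copies of each image."""
    rng = np.random.default_rng(seed)
    aug_images, aug_labels = [images], [labels]
    for _ in range(copies):
        # Pick a random horizontal/vertical shift per image.
        dx = rng.integers(-max_shift, max_shift + 1, size=len(images))
        dy = rng.integers(-max_shift, max_shift + 1, size=len(images))
        shifted = np.stack([
            np.roll(np.roll(img, int(x), axis=1), int(y), axis=0)
            for img, x, y in zip(images, dx, dy)
        ])
        aug_images.append(shifted)
        aug_labels.append(labels)
    return np.concatenate(aug_images), np.concatenate(aug_labels)

# Toy usage: double a dataset of 100 random "images".
images = np.random.rand(100, 28, 28)
labels = np.random.randint(0, 10, size=100)
big_images, big_labels = augment_with_shifts(images, labels, copies=1)
print(big_images.shape)  # (200, 28, 28)
```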
{"title":"Milestone 3 Presentation","description":"Zero-Knowledge Proofs","contributors":"[{\"id\":\"57638455-1234-47cb-a685-1ef1d280e798\",\"add\":3484,\"del\":0}]"}
    232 views