Cathie So

@cathie

Joined on Oct 2, 2022

  • **0. Introduction** (by drCathieSo.eth)
    I am thrilled to share that my ZKML project has been successfully completed with the invaluable support of the Ecosystem Support Program of Privacy & Scaling Explorations (Ethereum Foundation). The platform bridges the AI/ML and Web3 worlds, providing a privacy-preserving solution with immense potential to revolutionize both industries.
    This is a proof of concept (POC) of an end-to-end platform that lets machine learning developers seamlessly convert their TensorFlow Keras models into ZK-compatible versions. The all-in-one solution consists of three core components:
    - circomlib-ml: a comprehensive Circom library containing circuits that compute common TensorFlow Keras layers.
    - keras2circom: a user-friendly translator that converts ML models in Python into Circom circuits.
    - ZKaggle: a decentralized bounty platform for hosting, verifying, and paying out bounties, similar to Kaggle but with the added benefit of privacy preservation.
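Converting a Keras model into a ZK-compatible circuit hinges on one transformation: Circom circuits operate over a prime field, so floating-point weights must first be scaled to integers. The sketch below illustrates that fixed-point quantization step in plain Python; the function names and the `PRECISION` constant are illustrative assumptions, not keras2circom's actual API.

```python
# Minimal sketch of the fixed-point quantization a Keras-to-Circom
# pipeline must perform: circuits work over a prime field, so
# floating-point weights are scaled to integers first.
# All names here are illustrative, not keras2circom's actual API.

PRECISION = 10 ** 6  # scaling factor; a real pipeline tunes this per layer


def quantize(weights: list) -> list:
    """Scale float weights to integers usable as circuit constants."""
    return [round(w * PRECISION) for w in weights]


def dense_forward(x: list, w: list, b: int) -> int:
    """One neuron of a Dense layer, computed in integer arithmetic.

    The dot product picks up a factor of PRECISION**2, so the bias must
    be pre-scaled accordingly, and the circuit (or verifier) rescales
    the output afterwards.
    """
    return sum(xi * wi for xi, wi in zip(x, w)) + b


# Example: y = 0.5*x0 + 0.25*x1, with inputs quantized the same way
w = quantize([0.5, 0.25])
x = quantize([1.0, 2.0])
y = dense_forward(x, w, b=0)
print(y / PRECISION ** 2)  # recover the float result: 1.0
```

The rescaling step is exactly where ZK-compatibility constraints bite: every intermediate value must stay within the field, which is why layer-by-layer precision choices matter in a real translator.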
  • **Previous work from other groups**
    Zator: verified inference of a 512-layer neural network using recursive SNARKs 🐊
    Folding the model architecture vs. folding the inference:
    - Folding the model architecture
      - Pros: achieves as high a compression ratio per inference as desired
      - Cons: requires models to be specifically designed for folding, which may be impractical for the current Web2 industry
    - Folding the model inference
      - Pros: does not require modifying existing models
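The reason folding compresses repeated checks can be shown with a toy example: two satisfying witnesses for the same constraint combine, via a random linear combination, into one witness whose single check covers both. This sketch handles only a *linear* constraint; real Nova folds relaxed R1CS instances, where a cross-term absorbs the quadratic part, so this is a conceptual illustration, not Nova's scheme.

```python
import random

# Toy illustration of folding for a purely linear constraint A.z = 0 (mod P):
# two satisfying witnesses fold into one via a random linear combination.
# Nova folds *relaxed R1CS* (quadratic constraints with a cross-term);
# this sketch only conveys why folding compresses many checks into one.

P = 2 ** 61 - 1          # a prime standing in for the proof system's field
A = [3, 5, P - 8]        # constraint: 3*z0 + 5*z1 - 8*z2 = 0 (mod P)


def satisfies(z):
    return sum(a * zi for a, zi in zip(A, z)) % P == 0


def fold(z1, z2, r):
    """Fold two witnesses: z = z1 + r*z2, component-wise mod P."""
    return [(a + r * b) % P for a, b in zip(z1, z2)]


z1 = [1, 1, 1]           # 3 + 5 - 8 = 0
z2 = [2, 4, 7]           # 6 + 20 - 56 + 30 != 0? no: pick a valid one below
z2 = [2, 2, 2]           # 6 + 10 - 16 = 0
r = random.randrange(1, P)
z = fold(z1, z2, r)
print(satisfies(z))      # True: one check now covers both instances
```

By linearity, A·(z1 + r·z2) = A·z1 + r·A·z2 = 0, so the folded witness satisfies the constraint whenever both originals do; folding N layer evaluations this way is what lets recursive schemes like Nova verify deep networks with one final check.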
  • Click here to learn more about Team Novi.
    Proposed work schedule:
    1. Research and proposal (2 weeks)
    2. Implementation and benchmarking (5 weeks)
    3. Recommendations and feature requests for Nova (1 week)
    **1. Research and proposal**
    Previously, Nova has been used to fold models and batch inference. What else can we fold?
  • **ZK-friendly ML paradigms**
    Background: For ZK Hack Lisbon, I teamed up with a few ML and cryptography veterans to build Zero Gravity, where we proposed a novel approach to efficiently proving the correctness of machine learning model computations using Weightless Neural Networks (WNNs), which are combinatorial and do not rely on floating-point arithmetic. WNNs learn from input-output pairs using RAM cells as lookup tables, making them more amenable to zero-knowledge proofs. The project uses Bloom filters to make the RAM cells scalable and introduces a new non-cryptographic hash function, the "MishMash", to improve performance.
    Research directions/tasks:
    - Implement the Zero Gravity paradigm in Plonky2/Halo2
    - Identify and explore other high-accuracy weightless models; in particular, evaluate their "ZK-friendliness", i.e. their potential compatibility with ZKPs, taking into account their computational complexity and performance on real-world datasets (e.g. boolean circuits, binarized neural networks, truth table nets, ...)
    - Benchmarking (see below)
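The RAM-cell-plus-Bloom-filter idea above can be sketched in a few lines: instead of storing an exponentially large lookup table, each WNN RAM cell records the input patterns seen during training in a small bit array, and inference is a membership query. The class below is a minimal illustration; it uses SHA-256 as a stand-in hash (the project's actual "MishMash" is a custom non-cryptographic hash), and all names are hypothetical.

```python
import hashlib

# Sketch of a WNN RAM cell backed by a Bloom filter, in the spirit of
# Zero Gravity: training sets bits for observed binary input patterns,
# and querying tests membership. SHA-256 stands in for the project's
# "MishMash" hash; false positives are possible but rare at this size.

class BloomRAMCell:
    def __init__(self, n_bits: int = 256, n_hashes: int = 3):
        self.bits = [0] * n_bits
        self.n_hashes = n_hashes

    def _indices(self, pattern):
        # Derive n_hashes bit positions from the pattern via salted hashes.
        for i in range(self.n_hashes):
            h = hashlib.sha256(bytes(pattern) + bytes([i])).digest()
            yield int.from_bytes(h, "big") % len(self.bits)

    def train(self, pattern):
        """Record one binary input pattern (a training observation)."""
        for idx in self._indices(pattern):
            self.bits[idx] = 1

    def query(self, pattern) -> bool:
        """Membership test: True if the pattern was (probably) seen."""
        return all(self.bits[idx] for idx in self._indices(pattern))


cell = BloomRAMCell()
cell.train((1, 0, 1, 1))
print(cell.query((1, 0, 1, 1)))  # True
print(cell.query((0, 0, 0, 0)))  # False with high probability
```

Because the cell's state is just a bit array and queries are hash-and-lookup, the whole computation is combinatorial, which is precisely what makes this paradigm more amenable to arithmetization in a ZKP than floating-point matrix multiplication.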