# Proposal for Trinity V1
## Table of Contents
- [1. Introduction](#1-Introduction)
- [2. Motivation](#2-Motivation)
- [Problem Statement](#Problem-Statement)
- [Why Now? (Urgency)](#Why-Now-Urgency)
- [For Whom?](#For-Whom)
- [3. Proposed Approach](#3-Proposed-Approach)
- [Data Provenance for MPC](#Data-Provenance-for-MPC)
- [Laconic OT & Reduced Round Complexity](#Laconic-OT--Reduced-Round-Complexity)
- [MPC + ZK Integration](#MPC--ZK-Integration)
- [4. Development Roadmap](#4-Develpoment-Roadmap)
- [5. Areas of Research](#5-Areas-of-Research)
- [Appendix](#Appendix)
- [Appendix A – KZG Commitments in PLONK](#Appendix-A--KZG-Commitments-in-PLONK)
- [Appendix B - Useful Links and Resources](#Appendix-B---Useful-Links-and-Resources)

*Vivek's scheme illustration*
## 1. Introduction
**Trinity** is a protocol born out of Cursive’s ongoing work on P2P computation over private inputs. Existing approaches often face significant drawbacks—like MP-FHE, which demands high bandwidth, or traditional Garbled Circuits, which require multiple rounds and continuous liveness. The scheme tackles both round complexity and liveness assumptions in MPC. Furthermore, it offers a way to combine MPC with a data provenance framework that ensures the authenticity of inputs—without revealing private details—unlocking new possibilities for privacy-preserving P2P marketplaces (e.g., Private Markets), social applications, and gaming.
To achieve this, Trinity combines three primary components:
1. **Garbled Circuits** for MPC
2. **Extractable Witness Encryption for KZG Commitments for Efficient Laconic OT**, which reduces the round complexity of Garbled Circuit MPC
3. **PLONK** (powered by KZG as its PCS), enabling construction of a KZG commitment from a bit vector within a SNARK—thereby offering trustless guarantees about data provenance that feed into the Laconic OT process for key encapsulation. Learn more in [Appendix A](#Appendix-A--KZG-Commitments-in-PLONK).
By weaving these components together, Trinity facilitates a trustless environment that is both verifiable and privacy-preserving. A [preliminary version](https://github.com/cursive-team/trinity-v0) has been built by Vivek, the scheme’s author; however, some pieces remain either missing or tailored to a specific use case.
**Proposal Goal:** This document advocates for developing a comprehensive, open-source Trinity library that merges MPC with ZKPs in a trustless setting. The main objectives for Trinity v1 include:
* Integrating the ZK component into Trinity
* Unifying all features under a consolidated Rust and JavaScript library stack
* Building on-chain infrastructure for proof verification
* Enhancing developer experience (devx) and documentation
Trinity can become a powerful platform that reimagines Data Provenance ZKPs—allowing participants to use bridged data for greater privacy and more meaningful exchanges. This platform could also foster connections with researchers interested in advancing the verifiability of inputs in multi-party settings.
## 2. Motivation
### Problem Statement
- **Data trust in MPC**: Many existing MPC frameworks assume that participants’ inputs are correct, without requiring any cryptographic proof of origin or authenticity. This might be acceptable in certain “2PC-is-for-lovers” scenarios—where proving one’s “love” is impossible—but it becomes problematic in use cases like Yao’s millionaire problem, where untrusted data could undermine fairness or correctness. With data provenance, participants can prove revenue from a signed tax report, for example, making the result much more reliable.
- **Limited verifiability of input sources**:
While there is considerable research on verifying the correctness of MPC outputs (e.g., by running MPC inside a SNARK), less attention has been paid to ensuring the trustworthiness of input data itself. One approach is to verify signatures or authenticity inside the garbled circuit, but this tends to be costly and single-use. By contrast, ZKPs for input verifiability offer a more efficient and reusable solution.
- **Reducing communication overhead**:
Traditional MPC protocols can be cumbersome, requiring multiple communication rounds. By leveraging Laconic OT and proven data inputs, Trinity can compress the “commit-and-garble” steps into a single round.
### Why Now? (Urgency)
* **Few solutions exist for trustless data provenance in MPC**. Though frameworks integrating ZK proofs and MPC have emerged, they tend to focus on verifiability of the computation rather than verifiability of inputs.
* **Rise of data provenance technology**: With the increasing focus on authentication, wallet infrastructure, and credential marketplaces, a more “cypherpunk” perspective—enabling computation over privately owned data—fills a critical gap.
* **P2P use cases**: Private, trustless data sharing is of growing importance for marketplaces, job boards, rentals, gaming, and similar decentralized applications. Trinity aims to meet these use cases by ensuring correctness and privacy.
### For Whom?
While Cursive may be one of Trinity’s initial adopters, the real promise lies in providing a comprehensive library that lowers barriers for developers of all stripes. By offering a streamlined approach to privacy-preserving computation and data provenance, Trinity invites broader participation from teams looking to integrate ZKPs and MPC into their products. Ultimately, this flexibility not only accelerates adoption but also inspires new narratives in trustless, privacy-centric solutions—and encourages more developers to commit to building with this emerging stack. The ETHIndia Cursive bounty has sparked a lot of interest among developers in the community.
## 3. Proposed Approach
### Data Provenance for MPC
Trinity aims to integrate ZKPs, using [Halo2 PLONK with KZG exposed via the Noir DSL](https://github.com/Ethan-000/halo2_backend?tab=readme-ov-file), combined with decentralized proof verification, to validate the authenticity of input data. For example, a user might prove they are from a specific country by setting `bit_i = 1` based on a credential signed by a legitimate authority—without revealing other personal details. This KZG commitment can then be uploaded on-chain as a fixed data endpoint for MPC peers to reference.
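To make the bit-vector pattern concrete, here is a minimal Rust sketch of how an application might assemble the vector that the ZK circuit later commits to. The attribute names and dictionary layout are purely illustrative assumptions; in practice the index mapping would be fixed by the shared on-chain dictionary, and each bit would only be set after the corresponding credential's signature is verified in ZK.

```rust
use std::collections::HashMap;

// Hypothetical attribute-to-index dictionary; the real mapping would be
// fixed by the shared on-chain registry, not hard-coded like this.
fn attribute_index() -> HashMap<&'static str, usize> {
    HashMap::from([("country:FR", 0), ("over_18", 1), ("kyc_passed", 2)])
}

// Build the bit vector the ZK circuit commits to: bit_i = 1 iff the user
// holds a (signature-verified) credential for attribute i.
fn build_bitvector(verified: &[&str], len: usize) -> Vec<u8> {
    let idx = attribute_index();
    let mut bits = vec![0u8; len];
    for attr in verified {
        if let Some(&i) = idx.get(attr) {
            bits[i] = 1;
        }
    }
    bits
}

fn main() {
    // The user proved credentials for "country:FR" and "over_18" in ZK.
    let bits = build_bitvector(&["country:FR", "over_18"], 4);
    assert_eq!(bits, vec![1, 1, 0, 0]);
    println!("{:?}", bits);
}
```

The resulting vector is what gets placed in the PLONK advice column, so MPC peers can later reference its on-chain KZG commitment as a fixed data endpoint.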
### Laconic OT & Reduced Round Complexity
To address communication overhead and liveness requirements, Trinity applies insights from the [Extractable Witness Encryption for KZG Commitments for Efficient Laconic OT](https://eprint.iacr.org/2024/264) paper to Garbled Circuits:
- **One-Round Commit-and-Garble**
Collapses the commit phase and garbling process into a single round, minimizing round complexity and improving the overall UX. This is particularly useful in high-latency or unreliable network environments. It also simplifies the developer experience by removing the need to manage state across multiple rounds, making 2PC more practical in real-world applications. The paper achieves this by replacing the usual OT construction: instead of encrypting toward a public key, the sender encrypts toward a KZG opening.
- **KZG as a unifying layer between ZK and MPC**
Although the paper primarily leverages KZG commitments for efficient laconic OT, it can also serve as a bridging mechanism that brings together zero-knowledge proofs and multi-party computation. In this role, KZG commitments help ensure data verifiability.
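As a mental model only (not the paper's actual construction), the following Rust toy shows the shape of the single-round flow using a Pedersen-style commitment in a prime-field multiplicative group: the receiver sends one commitment to its choice bit, the sender replies with one pair of ciphertexts, and the commitment's opening randomness acts as the decryption key for exactly one label. The group, generators, and XOR-pad "key derivation" are simplified stand-ins; the real scheme witness-encrypts toward KZG openings over a pairing-friendly curve.

```rust
// Toy illustration (NOT secure, NOT the paper's scheme): encryption
// "toward an opening" of a Pedersen-style commitment c = g^b * h^r.
const P: u128 = 2_305_843_009_213_693_951; // 2^61 - 1, prime
const G: u128 = 3;
const H: u128 = 7;

fn mod_pow(mut b: u128, mut e: u128, m: u128) -> u128 {
    let mut r = 1u128;
    b %= m;
    while e > 0 {
        if e & 1 == 1 { r = r * b % m; }
        b = b * b % m;
        e >>= 1;
    }
    r
}

fn inv(a: u128) -> u128 { mod_pow(a, P - 2, P) } // Fermat inverse

// Receiver -> sender (one message): commit to choice bit b with randomness r.
fn commit(b: u128, r: u128) -> u128 { mod_pow(G, b, P) * mod_pow(H, r, P) % P }

// Sender -> receiver (one message): for each candidate bit v, derive a pad
// from c / g^v and encrypt label_v. Only the pad for v = b equals h^r.
fn encrypt(c: u128, labels: [u64; 2]) -> [u64; 2] {
    let mut cts = [0u64; 2];
    for v in 0..2u128 {
        let pad = c * inv(mod_pow(G, v, P)) % P;
        cts[v as usize] = labels[v as usize] ^ (pad as u64);
    }
    cts
}

// Receiver: recover label_b; the pad h^r is known only via the opening.
fn decrypt(cts: [u64; 2], b: u128, r: u128) -> u64 {
    cts[b as usize] ^ (mod_pow(H, r, P) as u64)
}

fn main() {
    let (b, r) = (1u128, 123_456_789u128);
    let c = commit(b, r);
    let cts = encrypt(c, [0xAAAA, 0xBBBB]);
    assert_eq!(decrypt(cts, b, r), 0xBBBB); // receiver learns only label_b
    println!("ok");
}
```

The design point to take away: because the sender can encrypt both candidate labels against a single short commitment, the receiver's commit message and the sender's garbled-circuit message are the only two transmissions needed.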

### MPC + ZK Integration
- **Hybrid Approach**
Combine standard 2PC circuit design with tools like [Boolify](https://github.com/voltrevo/boolify?tab=readme-ov-file), [Summon](https://github.com/voltrevo/summon), and [mpz](https://github.com/privacy-scaling-explorations/mpz?tab=readme-ov-file). Future iterations will explore additional MPC extensions.
- **Optional Usage**
Trinity allows users to specify whether they need both MPC and ZK or just MPC, removing strong assumptions about data provenance when not required.
It can work in three settings:
- ZK with a high trust assumption (e.g., import from signed data)
- ZK with no trust assumption
- No ZK
This flexibility provides a more streamlined MPC flow for scenarios where ZK commitments aren’t necessary.
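A sketch of how a unified API might expose these three settings as a single configuration type; the names below are illustrative assumptions, not the actual library surface.

```rust
/// Illustrative (hypothetical) configuration enum for the three settings.
#[allow(dead_code)]
enum ProvenanceMode {
    /// ZK proof over data signed by a trusted authority (high trust in signer).
    ZkSignedData { authority_pubkey: [u8; 32] },
    /// ZK proof with no external trust assumption on the data source.
    ZkTrustless,
    /// Plain MPC: inputs are taken as-is, with no provenance proof.
    NoZk,
}

/// Only the ZK modes need an on-chain KZG commitment to reference.
fn requires_commitment(mode: &ProvenanceMode) -> bool {
    !matches!(mode, ProvenanceMode::NoZk)
}

fn main() {
    assert!(requires_commitment(&ProvenanceMode::ZkTrustless));
    assert!(!requires_commitment(&ProvenanceMode::NoZk));
}
```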
## 4. Development Roadmap
**Time**: 3 months
**People**: 1 FTE
1. **Milestone 1**
* Objective: Plug in Halo2 for KZG commitments.
* Description:
* Integrate Halo2 to issue KZG commitments on data.
* Set up a minimal working example demonstrating data commitment and proof generation.
2. **Milestone 2**
* Objective: Strengthen integration with Noir and bitvector handling for better devx.
* Description:
* Develop Noir modules for easy setting of bitvectors in circuits.
* Map bitvector positions to the correct advice columns in the circuit.
* Ensure a one-to-one correspondence between KZG commitments from the ZK prover and the [EWE paper's implementation](https://github.com/rot256/research-we-kzg).
3. **Milestone 3**
* Objective: Build the infrastructure for on-chain commitment of dictionaries.
* Description:
* Develop a Solidity KZG verifier for recording KZG commitments.
* Build queries to be run in MPC application flow.
4. **Milestone 4**
* Objective: Consolidate entire solution into a single Rust library.
* Description:
* Merge MPC circuit compilation and ZK circuit generation into one cohesive workflow.
* Provide a unified API that allows users to opt into either (1) pure MPC, or (2) MPC + ZK.
* Ensure end-to-end examples to confirm correctness and performance.
5. **Milestone 5**
* Objective: End-to-end application serving as an example for developers looking to integrate the scheme.
* Description:
* Integration of the library into [mpc-framework](https://github.com/voltrevo/mpc-framework)
* A Next.js application that performs the entire E2E flow
## 5. Areas of Research
1. **ZK from Folding**
* Investigate folding schemes as a way to obtain a more memory-efficient ZK prover, and to leverage IVC as an incremental object: end users could maintain a single endpoint to their dictionary in a smart contract and add bits of data to it incrementally over time.
2. **Extend EWE for KZG to HyperKZG**
* Explore whether the same KEM construction, adapted to multilinear polynomials (HyperKZG), could improve the current protocol.
3. **Extend EWE to Post-Quantum**
* Explore alternatives to classical elliptic curves (e.g., isogeny-based, lattice-based) to future-proof the system against quantum attacks.
# Appendix
## Appendix A – KZG Commitments in PLONK
We want to generate a KZG commitment to a bit vector without revealing the underlying bits. A straightforward approach is to compute KZG.Commit within a SNARK, using the bit vector as a private input and returning the resulting commitment. However, running MSMs (multi-scalar multiplications) inside a SNARK is expensive. Since we are already using PLONK, we can leverage its built-in KZG commitment functionality, which the Prover handles natively.
In this approach:
1. We place the bit vector into an advice column in our PLONK circuit.
2. We impose a constraint (for example, by verifying a signature) to ensure that each entry in the advice column remains a valid bit (0 or 1).
3. The Prover’s native KZG commitment of this column is then computed as part of the proof.
4. We can extract this commitment from the proof, obtaining our desired KZG commit to the bit vector.

A small reference example can be found here: https://github.com/Meyanis95/halo2-kzgewe
Because writing Halo2 circuits directly can be challenging, we plan to use Noir with the PSE Halo2 backend. This way, developers can write their circuits in Noir, and we will expose a component that allows them to specify which bits in the circuit should appear in the resulting KZG commitment.
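The four steps above can be sketched with a deliberately simplified model. The Rust toy below replaces the real pairing-friendly curve with a prime-field multiplicative group (insecure, but the algebra is the same shape): the SRS is `[g, g^τ, g^(τ²), …]`, and committing to a bit vector is an MSM where the bits select which SRS terms to include, yielding `g^{P(τ)}` for `P(X) = Σ bᵢ Xⁱ`. `setup`, `commit`, and the generator are illustrative names, not the Halo2 API.

```rust
// Toy KZG-style commitment over a multiplicative group (NOT secure; real
// KZG uses pairing-friendly elliptic curves). Exponents can be reduced
// mod P-1 by Fermat's little theorem.
const P: u128 = 2_305_843_009_213_693_951; // 2^61 - 1, prime
const G: u128 = 5;

fn mod_pow(mut b: u128, mut e: u128, m: u128) -> u128 {
    let mut r = 1u128;
    b %= m;
    while e > 0 {
        if e & 1 == 1 { r = r * b % m; }
        b = b * b % m;
        e >>= 1;
    }
    r
}

// Trusted-setup SRS: [g, g^tau, g^(tau^2), ...]; tau must be discarded.
fn setup(tau: u128, n: usize) -> Vec<u128> {
    (0..n).map(|i| mod_pow(G, mod_pow(tau, i as u128, P - 1), P)).collect()
}

// Commit to a bit vector with an MSM over the SRS: bits select the terms,
// so the result is g^{P(tau)} for P(X) = sum_i b_i * X^i.
fn commit(srs: &[u128], bits: &[u8]) -> u128 {
    bits.iter().zip(srs).fold(1u128, |acc, (&b, &s)| {
        if b == 1 { acc * s % P } else { acc }
    })
}

fn main() {
    let tau = 42u128; // toy: known here only so we can cross-check
    let srs = setup(tau, 4);
    let bits = [1u8, 0, 1, 1];
    // P(tau) = 1 + tau^2 + tau^3 = 1 + 1764 + 74088 = 75853
    assert_eq!(commit(&srs, &bits), mod_pow(G, 75853, P));
    println!("ok");
}
```

In the PLONK setting, step 3 means this MSM is performed natively by the prover over the advice column, so no in-circuit MSM is ever paid for; step 4 simply reads that group element out of the proof transcript.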
## Appendix B - Useful Links and Resources
- [Cursive presentation at ZKSummit](https://youtu.be/-0JdI5QG7yg?feature=shared&t=1051)
- [Extractable Witness Encryption for KZG Commitments and Efficient Laconic OT paper](https://eprint.iacr.org/2024/264.pdf)
- [Trinity V-0](https://github.com/cursive-team/trinity-v0)
- [A Glimpse into ProgCrypto’s Future Blogpost](https://hackmd.io/@meyanis/S1fIC-DM1x)
# FAQ
**Q: Halo2 support - Would this be building on PSE’s version of Halo2? Do you know we put that in maintenance and aren't planning on supporting it much moving forward?**
A: Yes. For this first sprint, the plan is to use the PSE Halo2 backend exposed through Noir, as mentioned in the proposal. I’m aware that the library is in maintenance mode and will not be developed further; however, based on my analysis and initial tests, the current feature set is sufficient to prototype Trinity.
You can find an example here: [halo2-kzgewe](https://github.com/Meyanis95/halo2-kzgewe)
**Q: KZG vs. PQC – During precon, we discussed a desire to steer toward quantum resistance. That may take time, but for now KZG might be less concerning. Thoughts?**
A: Exactly. The proposal notes quantum-resistant commitment schemes as a next iteration of research, aiming to extend the witness encryption approach to PQ-friendly systems. As you said, for delivering practical primitives in a 3–6 month timeframe, lowering the PQ security assumption is acceptable for the prototype.
**Q: How do you envision real-world use cases in the next 6–12 months? It feels like traditional MPC or POD2 already address many problems.**
A: The risk of the library being only a “toy” is mitigated by Cursive’s commitment to use it. Cursive has a proven track record of delivering primitives to a large user base.
As for broader use cases, P2P computation on verified secret inputs (secured by snarkified signatures verification) applies to a wide range of real-world scenarios where digital signatures are prevalent. For instance, a renter could use a digitally signed tax document as an MPC input, allowing the landlord to verify income thresholds without learning the exact amounts.
From what I understand, POD2 doesn’t currently offer MPC out of the box. If we want to jointly compute over our POD data, one of us must pull the other’s POD or send it to a server—either way, potentially exposing the data in the clear. And for traditional MPC you can add the signature verification algorithm in the Garbled Circuit, but it’s very costly and can only be used once. With Trinity, the idea is to have a witness encryption mechanism that ties a data signature to an MPC garbling process so participants do not learn each other’s inputs.
**Q: For things like the millionaire's problem a zkp of the signed data from the attester (the notary) should be enough / as good as you are going to get unless you have signed data from a bank directly. In a more real world use case, lets say you wanted to take a TLSNotary attestation of your bank account, your investment account, and some other accounts, and a signed message from your ethereum wallet/multisig; you could combine all of that with POD2 and have that as an input for MPC. This wouldn't save the extra round, but its way more composable and practical imo.**
A: True. However, POD2 currently doesn’t hide public inputs, so any joint computation still requires revealing them. Also, I’m focusing on the asymmetric case of data provenance—e.g., verifying cryptographic signatures from banks, emails, or government entities. I’m not entirely certain how TLSNotary proofs fit into this model yet, but I’m eager to look into that.
In the current Trinity construction, the “glue” between ZK and MPC is the witness encryption KZG scheme. This lets the bit-choice options in the garbled circuit be bound to the same commitments that a ZK prover would use for signature verification constraints. Essentially, it unifies the proof of valid input data with the secrecy of MPC.
If we believe strongly in the importance of data provenance from signed data—especially given the huge volume of attested documents that could be “snarkified”—then it makes sense to enable peer-to-peer computation where each party only learns the result, not the underlying data.
In my experience (e.g., working on Anon Aadhaar and Digilocker), one might initially build a heavily engineered “gate” to verify such data, but that raises risks of misuse or discrimination. By contrast, a P2P approach with secure MPC ensures that sensitive data is never revealed unnecessarily, while still using data provenance to establish trust in each party’s inputs.
Greco (from Enrico) was another approach relying on FHE, which has higher bandwidth costs. Our proposal is to provide a more bandwidth-efficient MPC library that preserves the verifiability of inputs (via ZKPs) and the confidentiality of the computation.