# Using log-derivatives to check copy constraints
**Goal**: use the log-derivative method to evaluate PLONK copy constraints. If both PLONK copy constraints and PLONK lookups are evaluated via log-derivative methods, we can combine their inverse polynomial commitments into a single inverse polynomial, which would improve prover times.
### Naive adaptation
Take the PLONK permutation argument for a PLONKish system with $k$ wires:
$$
\prod_{i \in B} \prod_{j =0}^{k-1} \frac{(w_{j, i} + S_{id j, i}\beta + \gamma)}{(w_{j, i} + S_{\sigma j, i}\beta + \gamma)} = 1
$$
(equation 1)
($B$ = boolean hypercube, kludge notation. $\{ \beta, \gamma \} \in \mathbb{F}^2$ are random challenges. $S_{id 0}, \ldots, S_{id\, k-1}$ are identity permutation vectors, and $S_{\sigma 0}, \ldots, S_{\sigma\, k-1}$ are copy constraint permutation vectors.)
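As a reference point, here is a minimal numeric check of equation 1 over a toy prime field. The layout, field choice, and values below are illustrative only, not any production system's parameters:

```python
import random

# Toy check of the grand-product permutation identity (Eq. 1) over a prime field.
p = 2**61 - 1                       # a Mersenne prime; any large prime works
inv = lambda x: pow(x, p - 2, p)    # field inverse via Fermat's little theorem

n, k = 4, 2                         # 4 rows (the set B), 2 wires
flat = lambda j, i: j * n + i       # S_id value: flattened wire index

# Copy constraints as a permutation: cells that must carry equal values form
# cycles. Wires (0,0),(1,1) are copies; so are (0,2),(0,3),(1,2).
sigma = {f: f for f in range(n * k)}
cycles = [(flat(0, 0), flat(1, 1)), (flat(0, 2), flat(0, 3), flat(1, 2))]
for cyc in cycles:
    for a, b in zip(cyc, cyc[1:] + cyc[:1]):
        sigma[a] = b

# A copy-satisfying witness: constant on each cycle.
w = {f: 100 + f for f in range(n * k)}
for val, cyc in zip([7, 9], cycles):
    for f in cyc:
        w[f] = val

random.seed(1)
beta, gamma = random.randrange(p), random.randrange(p)

prod = 1
for f in range(n * k):
    prod = prod * (w[f] + f * beta + gamma) % p * inv((w[f] + sigma[f] * beta + gamma) % p) % p
print(prod == 1)   # True: numerator and denominator factor sets coincide per cycle
```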
When translated into a grand-product polynomial, the algebraic degree of the permutation identity is $k + 1$
The identity produced when applying the logarithmic derivative technique is the following:
$$
\sum_{i \in B} \sum_{j=0}^{k-1} \left( \frac{1}{w_{j, i} + S_{id j, i}\beta + \gamma} - \frac{1}{w_{j, i} + S_{\sigma j, i}\beta + \gamma} \right) = 0
$$
(equation 2)
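A minimal numeric check of equation 2 over a toy prime field (illustrative layout; flattened wire indices play the role of $S_{id}$):

```python
import random

# Numeric check of the log-derivative form (Eq. 2): for a copy-satisfying
# witness the signed sum of inverses vanishes.
p = 2**61 - 1
inv = lambda x: pow(x, p - 2, p)

m = 8                               # n * k flattened wire cells
cycle = (0, 2, 7)                   # one copy class
sigma = {f: f for f in range(m)}
for a, b in zip(cycle, cycle[1:] + cycle[:1]):
    sigma[a] = b
w = {f: 3 * f + 1 for f in range(m)}
for f in cycle:
    w[f] = 42                       # equal values on the copy class

random.seed(2)
beta, gamma = random.randrange(p), random.randrange(p)

total = sum(inv((w[f] + f * beta + gamma) % p)
            - inv((w[f] + sigma[f] * beta + gamma) % p) for f in range(m))
print(total % p == 0)   # True
```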
To evaluate this identity we compute a vector of inverses $I$, where $I_i$ is defined as the following (for all $i \in B$):
$$
I_i = \frac{1}{\prod_{j=0}^{k-1}(w_{j, i} + S_{id j, i}\beta + \gamma)(w_{j, i} + S_{\sigma j, i}\beta + \gamma)}
$$
(equation 3)
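The inverse vector $I$ can be computed with a single field inversion via the standard Montgomery batch-inversion trick; a minimal sketch over a toy prime:

```python
# Montgomery batch inversion: invert a whole vector with one field inversion
# plus O(len) multiplications.
p = 2**61 - 1

def batch_inverse(xs):
    # prefix[i] = xs[0] * ... * xs[i-1] (mod p)
    prefix = [1]
    for x in xs:
        prefix.append(prefix[-1] * x % p)
    acc = pow(prefix[-1], p - 2, p)        # the single inversion: 1 / (full product)
    out = [0] * len(xs)
    for i in reversed(range(len(xs))):
        out[i] = acc * prefix[i] % p       # = 1 / xs[i]
        acc = acc * xs[i] % p              # drop xs[i] from the running inverse
    return out

xs = [3, 7, 11, 13]
print(batch_inverse(xs) == [pow(x, p - 2, p) for x in xs])  # True
```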
The permutation identity becomes the following:
$$
\sum_{i \in B} \sum_{j=0}^{k-1} I_i \cdot \bigg( \prod_{l = 0, l \ne j}^{k-1} (w_{l, i} + S_{id l, i}\beta + \gamma) \prod_{l=0}^{k-1}(w_{l, i} + S_{\sigma l, i}\beta + \gamma) -\\ \prod_{l = 0}^{k-1} (w_{l, i} + S_{id l, i}\beta + \gamma) \prod_{l=0, l \ne j}^{k-1}(w_{l, i} + S_{\sigma l, i}\beta + \gamma) \bigg) = 0
$$
(equation 4)
The algebraic degree of the above is $2k + 1$
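A sanity check that equation 4 does sum to zero for a copy-satisfying toy witness, with $I_i$ computed per equation 3 (all parameters illustrative):

```python
import random

# Check the inverse-vector form (Eq. 4) of the log-derivative identity (Eq. 2).
p = 2**61 - 1
inv = lambda x: pow(x, p - 2, p)

n, k = 4, 3
s_id = lambda j, i: j * n + i
cycle = (s_id(0, 0), s_id(1, 2), s_id(2, 3))       # one copy class
sigma = {f: f for f in range(n * k)}
for a, b in zip(cycle, cycle[1:] + cycle[:1]):
    sigma[a] = b
w = {f: 5 * f + 2 for f in range(n * k)}
for f in cycle:
    w[f] = 11                                      # copy-satisfying

random.seed(3)
beta, gamma = random.randrange(p), random.randrange(p)

def num(j, i): return (w[s_id(j, i)] + s_id(j, i) * beta + gamma) % p
def den(j, i): return (w[s_id(j, i)] + sigma[s_id(j, i)] * beta + gamma) % p

total = 0
for i in range(n):
    N = [num(j, i) for j in range(k)]
    D = [den(j, i) for j in range(k)]
    prodN = prodD = 1
    for x in N: prodN = prodN * x % p
    for x in D: prodD = prodD * x % p
    I_i = inv(prodN * prodD % p)                   # Eq. 3
    for j in range(k):
        exN = prodN * inv(N[j]) % p                # product over l != j
        exD = prodD * inv(D[j]) % p
        total += I_i * (exN * prodD - prodN * exD) % p
print(total % p == 0)   # True: each term equals 1/N_j - 1/D_j
```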
Conclusion: directly converting the PLONK permutation argument into a logarithmic-derivative sumcheck blows up the algebraic degree of the permutation identity
### Can we improve on the above?
Q: does the following satisfy a permutation argument?
$$
\sum_{i \in B} \sum_{j = 0}^{k-1} \frac{w_{j, i}}{S_{id j, i} + \gamma} - \frac{w_{j, i}}{S_{\sigma j, i} + \gamma} = 0
$$
(equation 5)
A: Yes. See [below](#Why-Eq-5-is-enough).
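A quick numeric check of equation 5 over a toy prime field (flattened single-column layout, illustrative values): the sum vanishes for a copy-satisfying witness, and for the particular violation below it is nonzero for every challenge:

```python
import random

# Numeric test of Eq. 5: zero for a copy-satisfying witness, nonzero otherwise.
p = 2**61 - 1
inv = lambda x: pow(x, p - 2, p)

m = 8                                   # total wire cells (n*k flattened)
cycle = (1, 4, 6)                       # these cells must be equal
sigma = {f: f for f in range(m)}
for a, b in zip(cycle, cycle[1:] + cycle[:1]):
    sigma[a] = b

def eq5(w, gamma):
    return sum(w[f] * inv(f + gamma) - w[f] * inv(sigma[f] + gamma)
               for f in range(m)) % p

random.seed(5)
gamma = random.randrange(p)

good = {f: 2 * f + 3 for f in range(m)}
for f in cycle:
    good[f] = 17                        # satisfy the copy constraints
bad = dict(good)
bad[4] = 18                             # break one copy constraint

print(eq5(good, gamma) == 0, eq5(bad, gamma) != 0)   # True True
```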
I think we might be able to borrow the security proof from Boneh-Boyen signatures (a very old paper, so it may not be helpful).
Assuming that the set of values in $S_{id}$ equals the set of values in $S_{\sigma}$, *and* that $S_{id}, S_{\sigma}$ contain no duplicates, we can model the positive denominator terms in the above relation as linearly independent...I think. The same applies to the negative denominator terms.
The above can be rearranged to the following:
$$
\sum_{i \in B} \sum_{j = 0}^{k-1} \frac{w_{j, i} (S_{\sigma j, i} - S_{id j, i})}{(S_{id j, i} + \gamma)(S_{\sigma j, i} + \gamma)} = 0
$$
(equation 6)
If we precompute $U_{j, i} = S_{id j, i}S_{\sigma j, i}$, this reduces to:
$$
\sum_{i \in B} \sum_{j = 0}^{k-1} \frac{w_{j, i} (S_{\sigma j, i} - S_{id j, i})}{U_{j, i} + \gamma(S_{id j, i} + S_{\sigma j, i}) + \gamma^2} = 0
$$
(equation 7)
This results in an inverse vector with the following structure:
$$
I_i = \frac{1}{\prod_{j=0}^{k-1}(U_{j, i} + \gamma(S_{id j, i} + S_{\sigma j, i} )+ \gamma^2)}
$$
(equation 8)
Finally the equation to validate copy constraints becomes the following:
$$
\sum_{i \in B} \sum_{j = 0}^{k-1} {w_{j, i} (S_{\sigma j, i} - S_{id j, i}) \cdot I_i} \prod_{l = 0, l \ne j}^{k-1} ({U_{l, i} + \gamma(S_{id l, i} + S_{\sigma l, i}) + \gamma^2}) = 0
$$
(equation 9)
The algebraic degree of the identity is now $k + 2$ instead of $2k + 1$
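A numeric check of equation 9, with $I_i$ as in equation 8 and $U_{j,i}$ precomputed as above (toy sizes, illustrative values):

```python
import random

# Check the rearranged identity (Eq. 9) with the precomputed products
# U_{j,i} = S_id * S_sigma and the inverse vector of Eq. 8.
p = 2**61 - 1
inv = lambda x: pow(x, p - 2, p)

n, k = 4, 2
s_id = lambda j, i: j * n + i
cycle = (s_id(0, 1), s_id(1, 2), s_id(1, 3))       # one copy class
sigma = {f: f for f in range(n * k)}
for a, b in zip(cycle, cycle[1:] + cycle[:1]):
    sigma[a] = b
w = {f: 9 * f + 4 for f in range(n * k)}
for f in cycle:
    w[f] = 23                                      # copy-satisfying

random.seed(7)
gamma = random.randrange(p)

def factor(j, i):
    # Quadratic denominator factor U + gamma*(S_id + S_sigma) + gamma^2,
    # which equals (S_id + gamma)(S_sigma + gamma).
    f = s_id(j, i)
    U = f * sigma[f] % p
    return (U + gamma * (f + sigma[f]) + gamma * gamma) % p

total = 0
for i in range(n):
    facs = [factor(j, i) for j in range(k)]
    prod = 1
    for x in facs:
        prod = prod * x % p
    I_i = inv(prod)                                # Eq. 8
    for j in range(k):
        f = s_id(j, i)
        ex = prod * inv(facs[j]) % p               # product over l != j
        total += w[f] * (sigma[f] - f) % p * I_i % p * ex % p
print(total % p == 0)   # True: each term reduces to w/(S_id+g) - w/(S_sigma+g)
```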
If all of the above holds, perhaps this could be a viable alternative to the regular permutation argument (algebraic degree is similar but requires $k$ additional precomputed selector polynomials)
Q: Can we improve on the above to further lower the algebraic degree w/o a blowup in number of precomputed selector polynomials?
Q: Is there something more elegant we can use to prove copy constraints? Having a quadratic denominator term in 2 precomputed selector polynomials feels ...ugly? Do we have redundant information encoded into the polynomial identity?
### Why Eq. 5 is enough.
Although Eq. 5 was motivated by the log-derivative method, the following proof does not use log-derivatives (there is no need to argue via preimages under the log-derivative map!). Let us also explicitly note that $B$ will be viewed as a subset of $\mathbb{F}.$
We consider the easiest case $k=1$.
In this case, Equation 5 is the identity
$$
\sum_{i\in B}\left(\frac{w_i}{i+\gamma} - \frac{w_i}{\sigma(i)+\gamma}\right)=0.
$$
(We slimmed down the double indexing.)
Since $\gamma$ is random, having this equality means
$$
\sum_{i\in B}\left(\frac{w_i}{i+X} - \frac{w_i}{\sigma(i)+X}\right)\equiv0
$$
(equation 10)
holds in the rational function field $\mathbb{F}(X).$ Substituting $i \mapsto \sigma(i)$ in the second sum and combining terms,
$$
\sum_{i\in B}\frac{w_i - w_{\sigma^{-1}(i)}}{i+X}\equiv 0
$$
Therefore $w_i = w_{\sigma^{-1}(i)}$ for each $i,$ which is equivalent to $w_i = w_{\sigma(i)}$ for each $i.$ (This can be seen, for example, by multiplying both sides by $\prod_{i\in B}(i+X)$ and evaluating the resulting polynomial at $X = -i$ for each $i \in B.$)
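The last step can be executed mechanically over $\mathbb{Q}$ (toy instance, illustrative values): clearing denominators and evaluating at $X = -i_0$ kills every term except $i_0$'s, isolating the difference between $w_{i_0}$ and its copy:

```python
from fractions import Fraction

# The final proof step over Q: multiply the rearranged sum by prod_{i in B}(i + X)
# and evaluate the resulting polynomial at X = -i0.
B = [1, 2, 3, 4]
sigma = {1: 2, 2: 1, 3: 3, 4: 4}             # sigma requires w_1 = w_2
inv_sigma = {v: u for u, v in sigma.items()}

def cleared_poly_at(w, X):
    # (sum_i (w_i - w_{sigma^{-1}(i)}) / (i + X)) * prod_i (i + X), evaluated at X
    total = Fraction(0)
    for i in B:
        term = Fraction(w[i] - w[inv_sigma[i]])
        for j in B:
            if j != i:
                term *= (j + X)
        total += term
    return total

good = {1: 9, 2: 9, 3: 5, 4: 7}              # copy-satisfying
bad = {1: 9, 2: 8, 3: 5, 4: 7}               # violates w_1 = w_2

print(all(cleared_poly_at(good, -i0) == 0 for i0 in B))   # True
print(cleared_poly_at(bad, -1) != 0)                      # True: isolates w_1 - w_2
```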
# Alternative log-derivative copy constraint argument
<!--
The Plonk permutation argument validates that, for all $i \in [n]$, $g(i) = f(\sigma(i))$ for a permutation $\sigma$
For $f,g \in \mathbb{F}_{<d}[X]$ and a permutation $\sigma : [n] \rightarrow [n]$, we write $g = σ(f)$ if for each $i \in [n]$, $g(i) = f(\sigma(i))$.
-->
Goal: produce a "permutation" argument whose commitment cost (measured in scalar multiplications) is proportional to the number of unique wire values and not the number of gates.
Naive approach: if $\sum_{i \in [n]} \frac{w_i}{i + \gamma} - \frac{w_i}{\sigma(i) + \gamma} = 0$ works, what about the following relation:
$$
\sum_{i \in [n]} \frac{i}{w_i + \gamma} - \frac{\sigma_i}{w_i + \gamma} = 0
$$
If the above held (it does not, but we will try to fix it below), the inverse polynomial would contain at most as many unique entries as there are distinct wire values. For a configuration of Plonk/Honk with one wire commitment, that is the number of distinct wires.
### Problem with the above
The $\frac{1}{w_i + \gamma}$ terms are not linearly independent if $w_i = w_j, i \ne j$.
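A concrete instance of this failure (toy field, illustrative values): $\sigma$ swaps $(0,1)$ and $(2,3)$, the witness below violates both copy constraints, yet the colliding values make the relation vanish for every challenge:

```python
import random

# Counterexample for the naive relation: sigma swaps (0,1) and (2,3), so a valid
# witness needs w_0 = w_1 and w_2 = w_3. The witness below violates both, yet
# the colliding values make the 1/(w_i + gamma) terms linearly dependent and
# the integer "counts" (i - sigma_i) cancel per distinct value.
p = 2**61 - 1
inv = lambda x: pow(x, p - 2, p)

sigma = [1, 0, 3, 2]
w = [5, 8, 8, 5]          # w[0] != w[1]: copy constraints are broken

random.seed(11)
for _ in range(5):        # vanishes for every challenge, not just one
    gamma = random.randrange(p)
    total = sum((i - sigma[i]) * inv(w[i] + gamma) for i in range(4)) % p
    print(total == 0)     # True each time, despite the broken constraints
```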
### Attempt at a fix
We use the permutation argument to validate copy constraints. Let $\mathcal{T} = \{ \mathcal{T}_1, \ldots, \mathcal{T}_s \}$ be a partition of $[n]$ into disjoint blocks. Our constraints are copy-satisfied if $w_i = w_j$ whenever $i, j$ belong to the same block of $\mathcal{T}$.
We assume the existence of a set of "coset generators" (not sure this is the right word) $\{ g_1, \ldots, g_s \}$ such that for all $(i, j, k, \ell) \in [n]^4$ and $(a, b) \in [s]^2$ with $i \ne j$, $k \ne \ell$ and $a \ne b$:
$$
i \cdot g_a - j \cdot g_a \ne k \cdot g_b - \ell \cdot g_b
$$
We define a mapping function $\mu$ that maps $i \in [n]$ into $\{1, \ldots, s \}$; $\mu_i$ is the index of the block that $i$ is assigned to. (Note: this implies $\mu_i = \mu_{\sigma(i)}$.)
The relation to validate the copy constraint relation becomes:
$$
\sum_{i \in [n]} \frac{g_{\mu_i} \cdot i}{w_i + \gamma} - \frac{g_{\mu_i} \cdot \sigma_i}{w_i + \gamma} = 0
$$
Note: when compiled into a polynomial IOP, the numerator term $g_{\mu_i} \cdot i - g_{\mu_i} \cdot \sigma_i$ can be represented as a single precomputed polynomial.
Q: Is the above sufficient to check copy constraints?
Attempt at very rough reasoning as to why this could work:
1. each $\frac{1}{w_i + \gamma}$ term is linearly independent from $\frac{1}{w_j + \gamma}$ for all $(i, j) \in [n]^2$ with $w_i \ne w_j$
2. Each numerator term $g_{\mu_i} \cdot (i - \sigma_i)$ is similar to the "counts" term in log-derivative lookups
3. The "counts" terms for each disjoint block $j \in [s]$ do not overlap
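A numeric sanity check of the generator-weighted relation (toy instance; the generators here are just random field elements, on the assumption that random choices satisfy the non-collision property with overwhelming probability): a witness whose colliding values fool the unweighted relation no longer passes, while a copy-satisfying witness still does:

```python
import random

# Check the coset-generator fix: weight each "count" (i - sigma_i) by the
# generator g of the block that wire i belongs to.
p = 2**61 - 1
inv = lambda x: pow(x, p - 2, p)

sigma = [1, 0, 3, 2]          # two blocks: {0,1} and {2,3}
mu = [0, 0, 1, 1]             # block index of each wire
random.seed(13)
g = [random.randrange(p) for _ in range(2)]   # illustrative random generators

def relation(w, gamma):
    return sum(g[mu[i]] * (i - sigma[i]) % p * inv(w[i] + gamma)
               for i in range(4)) % p

gamma = random.randrange(p)
good = [5, 5, 8, 8]           # satisfies the copy constraints
bad = [5, 8, 8, 5]            # colliding values that fool the unweighted relation

print(relation(good, gamma) == 0, relation(bad, gamma) != 0)   # True True
```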