
Batch Addition for Multi-Scalar Multiplication (MSM)

Problem

Given two sets of EC points \(\{P_i\}\) and \(\{Q_i\}\), \(i \in \{1,...,n\}\), calculate \(\{R_i = P_i + Q_i\}\).

Naive Approach

If \(P = (x_P, y_P)\), \(Q = (x_Q, y_Q)\), and \(R = P+Q = (x_R, y_R)\), with neither point the identity and \(P \neq -Q\), addition on an elliptic curve in short Weierstrass form \(y^2 = x^3+ax+b\) is defined as follows [1]: \[ k = \begin{cases} \frac{y_Q - y_P}{x_Q - x_P} &\text{if } P\not=\pm Q \\ \frac{3x_P^2+a}{2y_P} &\text{if }P=Q \end{cases}\\ x_R=k^2-x_P-x_Q\\ y_R=k(x_P-x_R)-y_P \]
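The two slope cases above can be sketched directly in code. This is a minimal illustration over a small hypothetical curve, \(y^2 = x^3 + 2x + 3\) over \(\mathbb{F}_{97}\), chosen only so the arithmetic is easy to check by hand; real MSM code works over large pairing-friendly curves such as BLS12-381.

```python
# Affine point addition on y^2 = x^3 + a*x + b over a prime field.
# Curve parameters are hypothetical, for illustration only.
P_MOD = 97   # field modulus
A = 2        # curve coefficient a
B = 3        # curve coefficient b

def inv(x):
    # Field inversion via Fermat's little theorem: x^(p-2) mod p.
    return pow(x, P_MOD - 2, P_MOD)

def ec_add(P, Q):
    """Add affine points P and Q (neither the identity, P != -Q)."""
    (xp, yp), (xq, yq) = P, Q
    if P != Q:
        k = (yq - yp) * inv(xq - xp) % P_MOD         # chord slope
    else:
        k = (3 * xp * xp + A) * inv(2 * yp) % P_MOD  # tangent slope
    xr = (k * k - xp - xq) % P_MOD
    yr = (k * (xp - xr) - yp) % P_MOD
    return (xr, yr)
```

Note that both branches pay for one field inversion inside `inv`, which is the cost the rest of this note sets out to amortize.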

We observe that each addition requires a field inversion, which is far more expensive than multiplication and addition [2].

Therefore, we want to reduce the number of inversions needed to calculate \(\{R_i\}\).

Batch Addition

Consider \(n\) pairs of points, \[P_i = (x_{P_i}, y_{P_i}), \quad Q_i = (x_{Q_i}, y_{Q_i}), \quad i \in \{1,...,n\},\] and assume \(x_{P_i} \neq x_{Q_i}\), so that each addition falls into the chord case above. We want to calculate \[R_i = P_i + Q_i = (x_{R_i}, y_{R_i})\] Denote \[ \lambda_i = x_{Q_i} - x_{P_i} \] The product of all \(\lambda\)s to the left of \(\lambda_i\) yields \[ L_i = \prod_{j=1}^{i-1} \lambda_j\] with \(L_1 = 1\). Then, the product of all \(\lambda\)s to the right of \(\lambda_i\) is \[ S_i = \prod_{j=i+1}^{n} \lambda_j \] with \(S_n = 1\) (we write \(S_i\) rather than \(R_i\) to avoid clashing with the result points). With the results above, we can calculate each inversion as \[\frac{1}{x_{Q_i} - x_{P_i}} = \frac{1}{\lambda_i} = \frac{L_i S_i}{\prod_j \lambda_j} \] since \(L_i S_i\) is the product of every \(\lambda_j\) except \(\lambda_i\).
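This prefix/suffix trick is commonly known as Montgomery's batch-inversion trick. A minimal Python sketch, assuming the same hypothetical prime field \(\mathbb{F}_{97}\): it inverts \(n\) nonzero elements with one field inversion plus \(O(n)\) multiplications, instead of maintaining explicit suffix products, a running accumulator plays the role of \(S_i / \prod_j \lambda_j\).

```python
# Batch inversion (Montgomery's trick) over a hypothetical prime field.
P_MOD = 97

def batch_inverse(vals):
    """Return [1/v mod P_MOD for v in vals] using a single inversion."""
    n = len(vals)
    # prefix[i] = vals[0] * ... * vals[i-1], i.e. the prefix product L_i
    # (prefix[0] = 1 corresponds to L_1 = 1).
    prefix = [1] * (n + 1)
    for i, v in enumerate(vals):
        prefix[i + 1] = prefix[i] * v % P_MOD
    # The single inversion: (prod of all vals)^{-1}.
    inv_running = pow(prefix[n], P_MOD - 2, P_MOD)
    # Sweep right to left. At step i, inv_running equals
    # (vals[0] * ... * vals[i])^{-1}, so multiplying by prefix[i]
    # cancels everything except vals[i].
    out = [0] * n
    for i in range(n - 1, -1, -1):
        out[i] = inv_running * prefix[i] % P_MOD
        inv_running = inv_running * vals[i] % P_MOD
    return out
```

The backward sweep computes the same quantity as \(L_i S_i / \prod_j \lambda_j\), just without storing the suffix products \(S_i\) explicitly.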

Since \(\prod_j \lambda_j\) does not depend on \(i\), it can be inverted once and the result shared across all of the additions \(R_i = P_i + Q_i\).

Therefore, we reduce the number of inversions required to calculate \(n\) additions from \(n\) to \(1\), at the cost of \(O(n)\) extra field multiplications.

References

[1] https://crypto.stanford.edu/~dabo/cs255/lectures/ECC.pdf
[2] https://www.notion.so/Known-Optimizations-for-MSM-13042f29196544938d98e83a19934305#9d8b79321f584477ac945a738042c396
[3] https://hackmd.io/@tazAymRSQCGXTUKkbh1BAg/Sk27liTW9#Optimization-1-Batch-Affine-for-Point-Addition

tags: msm zkp public