---
tags: firo, audits, clients
---

# Lelantus Spark Code Audit Checklist

Repo: https://github.com/firoorg/firo/tree/spark_second_iteration

## Main Flaws

During the audit, we are mainly concerned with looking for bugs that can lead to:

- Double spends
- Minting new coins for free
- Deanonymizing transactions
- Leaking transaction amounts
- Leaking of spending keys

We also want to verify that:

- Unauthorized spends are not possible. An authorized spend means the user holds the spend key
- It is not feasible to link transactions to each other or to (diversified) addresses
- It is not feasible to link diversified addresses

## Scope

- src/libspark
- src/spark

## To check

- Regular cryptography bugs
  - [ ] Ensure that all cryptographic operations in Lelantus Spark are constant-time. Certain portions may acceptably be non-constant-time in the case of certain optimizations; this will need to be discussed case by case with the Firo team
    - Note: emphasize non-local/remote timing attacks
  - [x] Avoid branches controlled by secret data
    - [x] src/libspark
    - [x] src/spark
  - [x] Avoid table look-ups indexed by secret data
    - [x] src/libspark
    - [x] src/spark
  - [x] Avoid using secret data for loop bounds
    - [x] src/libspark
    - [x] src/spark
  - [x] Ensure that a strong source of randomness is used for any operation needing random values
  - [x] Ensure shifted values are typecast in order to minimize undefined behavior
    - [x] src/libspark
    - [x] src/spark
  - [x] For any curve/group operation, ensure that the result is a valid curve/group element
  - [x] Ensure that any value passed to Bulletproofs for range checking is indeed in the range [0, $2^k$] for some $k$
  - [x] Check for common edge cases in critical cryptographic functions
    - [x] Check if a negative value passes when it shouldn't
    - [ ] Check if 0 / the point at infinity passes when it shouldn't
- Advanced cryptography bugs
  - [x] Check that the Fiat-Shamir transform is properly implemented based on what is needed
for the transcript
    - Note: checked for inclusion of all public values, and of all public proof values generated up to that point, when generating the challenge
  - [x] Check that batch verification doesn't attest to some state that a user didn't sign
  - [x] Check that the composition of the various cryptographic primitives is safe
  - [x] Ensure that the combination of KDFs and other primitives is safe
- C++ security bugs
  - [x] Memory is cleared after secret data has been used
  - [ ] Ensure that C++ compilation doesn't remove security-critical operations
  - [x] Ensure unsigned bytes are used to represent binary data
  - [x] Check for values that can potentially overflow or underflow
  - [x] Ensure that data/variables are initialized with some default value
- Miscellaneous bugs
  - [x] Check log files to ensure that no confidential data is exposed
  - [ ] Dead code
  - [x] Remove unnecessary dependencies
  - [ ] Update dependencies for which critical/high fixes have been made
- Tools to run
  - [ ] (OPTIONAL) Run AFL++ fuzzer
  - [ ] Run Honggfuzz fuzzer
  - [ ] (OPTIONAL) libFuzzer
  - [ ] (OPTIONAL) Sanitizers
    - [ ] Valgrind
    - [ ] Clang sanitizers
    - [ ] Google sanitizers

## Potentially Relevant Audit Reports

- [Bulletproofs+ Audit Report](https://suyash67.github.io/homepage/assets/pdfs/bulletproofs_plus_audit_report_v1.1.pdf)
- [Quarkslab's Bulletproof Audit Report](https://blog.quarkslab.com/resources/2018-10-22-audit-monero-bulletproof/18-06-439-REP-monero-bulletproof-sec-assessment.pdf)
- [Kudelski Security Bulletproof Audit Report](https://cybermashup.files.wordpress.com/2018/07/monero-audit2.pdf)

## Potential Findings

1- In **src/libspark/bpplus.cpp** -> https://github.com/firoorg/firo/blob/spark_second_iteration/src/libspark/bpplus.cpp

**Missing checks that challenges are nonzero in various places.**

**->** In line 255, the transcript is updated with L and R, then the challenge e is computed. "e" is not checked for equality with 0 in Z_p.
**->** In line 286, the transcript is updated with proof.A1 and proof.B, then e1 is computed. After "e1" is computed, it is not checked for equality with 0.

* Write test cases for the zero vector (a vector in which all elements are zero) and the identity vector (a vector in which all elements are one)

```
// In lines 155, 387
Scalar y = transcript.challenge("y");
if (y == ZERO) {
    MINFO("y is 0, try again");
    goto try_again;
}

// In lines 156, 395
Scalar z = transcript.challenge("z");
if (z == ZERO) {
    MINFO("z is 0, try again");
    goto try_again;
}

// In lines 255, 403
Scalar e = transcript.challenge("e");
if (e == ZERO) {
    MINFO("e is 0, try again");
    goto try_again;
}

// In lines 286, 409
Scalar e1 = transcript.challenge("e");
if (e1 == ZERO) {
    MINFO("e1 is 0, try again");
    goto try_again;
}
```

~~**An input to resize() may lead to a segmentation fault.** **->** In line 161, before calling resize, the input (M\*N + 2) needs to be checked against 0. **Potential vulnerability:** when n = 0, the function will return an empty vector; if a user then tries to access the empty vector, this will lead to a segmentation fault. **Recommendation:** throw an error if (M\*N + 2) = 0.~~

~~**An input to resize() may lead to a segmentation fault.** **->** In line 170, we have~~

```
d.resize(M*N);
d[0] = z_square;
```

~~**Recommendation:** to avoid the possible segmentation fault, it is better to write:~~

```
CHECK_AND_ASSERT_THROW_MES(M*N != 0, "Need M*N to be non-zero");
d.resize(M*N);
d[0] = z_square;
```

~~**Add some checks** **->** In BPPlus::prove(), check:~~

```
if (unpadded_v.size() == 0) {
    throw std::invalid_argument("unpadded v is empty");
}
```

2- ~~In src/libspark/bech32.cpp we have the following function:~~

```
uint32_t polymod(const data& values)
```

~~This function uses a for loop:~~

```
for (const auto v_i : values)
```

~~The function polymod() is used by verify_checksum() and create_checksum(), which are used in decode() and encode() respectively.
My consideration is that the encode() and decode() functions may later be used for some nonce or secret data. Unless these two functions are used with a secret or a nonce, they depend only on user-controlled input. Can this be related to the Heartbleed bug?~~

~~Instead of the input-dependent range-based loop~~

```
for (const auto v_i : values)
```

~~isn't it better to use the following?~~

```
std::size_t values_count = values.size();
for (std::size_t i = 0; i < values_count; i++) {
    // use values[i] here
}
```

3- ~~In src/libspark/chaum.cpp, use of `-` instead of `negate()`, which results in passing a negative value in `std::vector<Scalar> scalars;`:~~

~~in line 115: `F_scalar -= proof.t1[i];`~~

~~in line 121: `scalars.emplace_back(proof.t2.negate() - w*proof.t2);`~~

4- **In src/libspark/keys.cpp**

In line 187, same consideration as in the 2nd possible finding: the following code may have a secret-dependent loop bound. The "buffer" looks like it depends on the incoming_viewing_key.

```
for (const auto b : buffer) {
    ss << (b >> 4);
    ss << (b & 0xF);
}
```

It might be better to read the size first, and then loop:

```
size_t buffer_size = buffer.size();
```

5- **In src/libspark/coin.cpp**

In line 156, the if-statement depends on secret data (the incoming_view_key), which may enable timing attacks.

```
if (!validate(incoming_view_key, data)) {
    throw std::runtime_error("Malformed coin");
}
```

What about lines 90, 95, 102?

6- **In src/libspark/keys.cpp**

In line 65, the if-statement depends on secret data, namely s1, s2, r (which together form the spending_key).

```
if (this->s1 != other.s1 || this->s2 != other.s2 || this->r != other.r)
    return false;
return true;
}
```

7- **Not a finding**

For the statement "Avoid look-ups indexed by secret data", it looks like there is just one lookup table in src/libspark: in bech32.cpp, in the decode() function.
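One possible mitigation for the secret-dependent comparison in finding 6 is to accumulate byte differences with OR instead of short-circuiting field by field. A minimal sketch, assuming the key fields can be serialized to fixed-length byte buffers; the helper name `ct_equal` is illustrative, not existing Firo code:

```cpp
#include <cstddef>
#include <cstdint>

// Constant-time byte-buffer comparison: the loop always runs to completion,
// and there is no branch or early exit that depends on the secret contents.
static bool ct_equal(const uint8_t* a, const uint8_t* b, std::size_t len) {
    uint8_t diff = 0;
    for (std::size_t i = 0; i < len; i++) {
        diff |= static_cast<uint8_t>(a[i] ^ b[i]);  // accumulate, never branch
    }
    return diff == 0;
}
```

The three field comparisons in the operator could then be combined by OR-ing the per-field difference accumulators, so the time taken does not reveal which field differed.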
8- ~~In file src/bpplus.cpp, incorrect checking of the Pedersen commitment (??)~~

~~`if (!(G*v[j] + H*r[j] == C[j]))`~~

~~instead it should be (??)~~

~~`if (!(Gi[j]*v[j] + Hi[j]*r[j] == C[j]))`~~

9- In src/bpplus.cpp, line 140: instead of `A_points.reserve(2*N*M + 1);`, the reservation should be made for A_scalars, i.e. `A_scalars.reserve(2*N*M + 1);`

10- **In src/libspark/bech32.cpp**, the bit operations in line 116 need to be typecast (???). Without a cast, the bitwise left-shift operator can invoke undefined behavior and may cause a runtime error. Should we treat it as the following?

```
uint8_t c0 = (uint8_t)(c >> 25);
```

11- **Comment:** how the associated range proof for a coin is checked is not clear. In file coin.cpp, it is assumed that a coin has a valid associated range proof (line 84).

12- ~~Missing check for "sum of public input values = sum of output values" in file mint_transaction.cpp~~

13- **Comment/concern:** the use of `std::random_device generator;` as a PRNG in file grootle.cpp, line 420, might not be suitable for cryptographic purposes.

14- **In src/libspark/bpplus.cpp**, the results of the elliptic curve operations need to be checked for validity.

In line 151,

```
proof.A = A_multiexp.get_multiple();
```

In lines 248, 249,

```
L_ = L_multiexp.get_multiple();
R_ = R_multiexp.get_multiple();
```

In line 517,

```
return multiexp.get_multiple().isInfinity();
```

In lines 260, 261,

```
Gi1[i] = Gi1[i]*e_inverse + Gi1[i+N1]*(e*y_N1_inverse);
Hi1[i] = Hi1[i]*e + Hi1[i+N1]*e_inverse;
```

In lines 281, 282,

```
proof.A1 = Gi1[0]*r_ + Hi1[0]*s_ + G*(r_*y*b1[0] + s_*y*a1[0]) + H*d_;
proof.B = G*(r_*y*s_) + H*eta_;
```

In these lines, it is better to check whether the results of the elliptic curve operations are valid.
To do so, you need to include the following functions from src/secp256k1/include/GroupElement.h:

```
isMember();
isInfinity();
```

For the following line numbers, it is better to include this function from src/secp256k1/include/Scalar.h:

```
isMember();
```

In lines 288, 289, 290,

```
proof.r1 = r_ + a1[0]*e1;
proof.s1 = s_ + b1[0]*e1;
proof.d1 = eta_ + d_*e1 + alpha1*e1.square();
```

If the above checks are warranted, we will have more to check in other files, such as chaum.cpp.

15- ~~**In src/spark/primitives.cpp**, in line 5,~~

```
uint256 CSparkMintMeta::GetNonceHash() const {
    if (!nonceHash) nonceHash.reset(primitives::GetNonceHash(k));
    return *nonceHash;
}
```

~~The if-statement is controlled by hash(nonce). Should we avoid it? (Is it the same problem without hashing?)~~

16- ~~**In src/spark/sparkwallet.cpp**, in line 24,~~

```
if (!walletdb.readFullViewKey(fullViewKey))
```

~~The if-statement is controlled by secret data (the fullViewKey).~~

~~**Secondly, in line 343,**~~

```
for (const auto& lTag : lTags)
```

~~Is **lTags** something secret? (**Need to look at it again.**) If it is, secret data should not be used for loop bounds.~~

~~**Thirdly, in line 1017,**~~

```
if (GetRandInt(10) == 0) {
    tx.nLockTime = std::max(0, static_cast<int>(tx.nLockTime) - GetRandInt(100));
}
```

~~Does the "GetRandInt" function need to be secure?~~

17- ~~In src/spark/state.cpp, in line 895,~~

```
if (sparkState.IsUsedLTag(tag))
```

~~Is tag private? If so, the if-statement is controlled by secret data.~~

18- ~~In spent_transaction: a. check if w, t > 0; b. the same balance witness works for two different balance statements where f+vout = f'+vout' (line 170); c. negative balance witness (line 168)??~~

19- In `Scalar.cpp`, `randomize` should be able to return 0. In particular, line 220 is incorrect.
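To illustrate the concern behind finding 10 about untypecast shifts: a narrow unsigned operand is promoted to (signed) int before a shift, so shifting a high byte far to the left can exceed INT_MAX, which is undefined behavior in older C++ standards and at best implementation-defined before C++20. A minimal sketch with a hypothetical helper (not code from bech32.cpp), casting to a wide unsigned type before shifting:

```cpp
#include <cstdint>

// Without the cast, `b << 24` first promotes the uint8_t to (signed) int;
// for b >= 0x80 the mathematical result exceeds INT_MAX, which older C++
// standards treat as undefined and pre-C++20 as implementation-defined.
// Casting to uint32_t first keeps the shift well-defined for every byte.
uint32_t pack_high_byte(uint8_t b) {
    return static_cast<uint32_t>(b) << 24;
}
```

The same pattern applies to the suggested fix in finding 10: apply the cast to the whole shift result, not to the operand being shifted.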
20- ~~In chaum.cpp, on lines 69, 94, `challenge` should also include the public parameters F, G, H, U when generating the challenge c.~~

21- **Comment:** in line 141 of chaum.cpp, instead of passing A2 as a tuple in the verification, the sum of `A2[i]` can be passed in points and `w` can be passed in scalars, thus saving n-1 slots.

22- Check missing for the double Pedersen commitment in line 64 of grootle.cpp: len(Gi) == len(a) and len(Hi) == len(b).

23- Need to discuss the vout (read the last paragraph of page 13 of the Spark paper: "Note that it is possible to modify...").

24- In MintTransaction::MintTransaction, if generate is false, this->coins will be populated with uninitialized Coin objects. These have numerous uninitialized fields, which it would be undefined behavior to read from, for example when MintTransaction::getMintedCoinsSerialized is called from CSparkWallet::CreateSparkMintRecipients, where UB is always invoked if generate is false and outputs is not empty. All fields must be given default values in Coin's type definition.

25- Should we also include S and V when generating the challenge in grootle.cpp, line 170?

26- **In libspark/keys.cpp**, in **line 254**,

```
bech32::convertbits(bit_converted, scrambled, 8, 5, true);
```

Here, scrambled is

```
std::vector<unsigned char> scrambled = F4Grumble(network, raw.size()).encode(raw);
```

but bech32::convertbits() accepts std::vector<uint8_t>:

```
bool convertbits(std::vector<uint8_t>& out, const std::vector<uint8_t>& in, int frombits, int tobits, bool pad)
```

Might this cause a problem?
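The remedy recommended in finding 24 can be implemented with default member initializers, so that even a default-constructed object carries no indeterminate fields. A minimal sketch, under the assumption that a coin holds a type byte, a value, and a byte vector; the struct and field names here are hypothetical, not Coin's actual layout:

```cpp
#include <cstdint>
#include <vector>

// Every field gets an in-class default, so all constructor paths (including
// one that skips population) leave the object in a fully defined state;
// reading any field afterwards is then well-defined instead of UB.
struct CoinSketch {
    char type = 0;                                // defaults to zero
    uint64_t v = 0;                               // value, defaults to zero
    std::vector<unsigned char> serial_context{};  // empty, not indeterminate
};
```

A default-constructed CoinSketch can then be serialized or compared safely even before it is populated.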