This research document describes the theoretical models to realize support for arbitrary fractional decimal places within the Zeitgeist protocol.

# Problem

When a token has a number of fractional decimal places that is not equal to `10`, the `log10` of the so-called [`BASE`](https://github.com/zeitgeistpm/zeitgeist/blob/c347f33c37838797be7323a52ed64b6ef14d4241/primitives/src/constants.rs#L40), the math functions used within [zrml-swaps](https://github.com/zeitgeistpm/zeitgeist/blob/c347f33c37838797be7323a52ed64b6ef14d4241/zrml/swaps/src/fixed.rs) are unable to provide proper results, because the base used within the fixed point math operations is fixed to `BASE` for every parameter of those functions.

# Solutions

Two solutions with different advantages and disadvantages are presented here. The first solution is to adjust the fixed point math functions to handle numbers with different numbers of fractional decimal places. The second solution is to align the number of fractional decimal places of every token to `base = log10(BASE)`.

## Solution 1: Update fixed point math

This solution aims to update the fixed point math to handle any number of fractional decimal places, including arithmetic operations on multiple numbers with different numbers of fractional decimal places. Done properly, this solution has the advantage of being very accurate.

One potential approach is to align the fractional decimal places of all numbers used in an operation to the number that has the highest number of fractional decimal places. This effectively reduces the range available to the integer part. Since the integer part is so vast (approximately $128 - 10 \cdot \frac{\log(10)}{\log(2)} \approx 94$ bits) that most of it is not used anyway, this should pose no problem.

However, the implementation would change the core math used in swap pools, which would consequently require extensive testing. The overall complexity of the math functionality would rise, and thus the chance of introducing a new security flaw should not be neglected. In addition, any consumer of the protocol, such as a frontend application, has to consider the different cases for all representations and present them properly.

## Solution 2: Align fractional decimal places

The second solution aims to align all fractional decimal places to `base`. Native tokens, i.e. tokens that are generated locally and are not related to any tokens outside of the consensus system, always have `base` fractional decimal places. Foreign tokens, on the other hand, can have any number of fractional decimal places, usually around 6 (USDT) or 18 (DAI). Since the adjustment only targets foreign tokens, there are two situations in which alignment happens:

1. Tokens enter the parachain via XCM
2. Tokens leave the parachain via XCM

In both situations alignment has to happen.

### Scenarios

The different situations in which alignment happens are inspected here. The column headers specify whether the number of fractional decimal places of the foreign token is greater or less than `base`. A sketch of the corresponding scaling logic follows the table.

|              | <center>places > base</center> | <center>places < base</center> |
| ------------ | ------------------------------ | ------------------------------ |
| **Incoming** | Cut off last decimal places.<br/>Those are considered lost. | Extend by missing decimal places.<br/>Balance remains the same. |
| **Outgoing** | Extend by missing decimal places.<br/>Balance remains the same. | Cut off last decimal places.<br/>Those are considered lost. |
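The table above boils down to rescaling a balance by a power of ten whenever it crosses the XCM boundary. The following Rust sketch illustrates this under a few assumptions: the `Balance` alias, the constant `BASE_DECIMALS` and the helper names `align_incoming`/`align_outgoing` are hypothetical and do not refer to the actual Zeitgeist code base.

```rust
// Minimal sketch of the alignment rules from the table above. The names
// `Balance`, `BASE_DECIMALS`, `align_incoming` and `align_outgoing` are
// hypothetical and do not refer to the actual Zeitgeist code base.
type Balance = u128;

/// Number of fractional decimal places used on-chain, i.e. `log10(BASE)`.
const BASE_DECIMALS: u32 = 10;

/// Align an incoming foreign balance with `places` fractional decimal places
/// to the on-chain representation with `BASE_DECIMALS` places.
fn align_incoming(amount: Balance, places: u32) -> Option<Balance> {
    if places > BASE_DECIMALS {
        // Cut off the last `places - BASE_DECIMALS` decimal places; dust is lost.
        Some(amount / 10u128.checked_pow(places - BASE_DECIMALS)?)
    } else {
        // Extend by the missing decimal places; the balance remains the same.
        amount.checked_mul(10u128.checked_pow(BASE_DECIMALS - places)?)
    }
}

/// Align an on-chain balance back to the foreign representation with `places`
/// fractional decimal places (the outgoing direction of the table).
fn align_outgoing(amount: Balance, places: u32) -> Option<Balance> {
    if places > BASE_DECIMALS {
        // Extend by the missing decimal places; the balance remains the same.
        amount.checked_mul(10u128.checked_pow(places - BASE_DECIMALS)?)
    } else {
        // Cut off the last `BASE_DECIMALS - places` decimal places; dust is lost.
        Some(amount / 10u128.checked_pow(BASE_DECIMALS - places)?)
    }
}
```

For a DAI-like token (`places = 18`), `align_incoming` divides by $10^8$ and truncates (the "Cut off last decimal places" cell), while for a USDT-like token (`places = 6`) it multiplies by $10^4$ (the "Extend by missing decimal places" cell).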
### Evaluation

It becomes evident that this solution is not lossless: depending on the number of fractional decimal places, some tokens are lost during either receipt or transfer of foreign tokens. However, one of those situations can be disregarded, as it is equal to rounding using the `floor` strategy. In case the token is extended during receipt (`places < base`), the additional decimal places only add precision. When transferring the tokens out of the chain, this precision is removed by cutting off the initially added fractional decimal places, which effectively resembles a flooring operation.

Thus the only situation in which tokens are actually lost is when foreign tokens with more `places` than `base` enter the chain (`places > base`). In this case the lowest `places - base` decimal places are cut off, so the lost tokens can be considered dust. For example, a token with 18 fractional decimal places truncated to `base = 10` loses at most one ten-billionth of a token per incoming transfer; even if the received token had a value of $10,000,000 per token, the lost tokens would amount to less than $0.002 in value. Consequently, the lost tokens can be considered irrelevant. As long as the total issuance is not increased, which is not the case with the approach described in this section, the solution can be applied.
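To make the floor-rounding argument concrete, the test-style sketch below reuses the hypothetical `align_incoming`/`align_outgoing` helpers from the previous sketch and checks both round trips: the `places < base` case is lossless, while the `places > base` case loses at most `10^(places - base) - 1` raw units of dust per transfer.

```rust
// Test-style sketch of the two round trips discussed above, reusing the
// hypothetical `align_incoming`/`align_outgoing` helpers from the previous sketch.
fn main() {
    // `places < base` (USDT-like, 6 decimals): extending on the way in and
    // cutting off the added places on the way out is lossless.
    let usdt_in: u128 = 123_456_789; // 123.456789 USDT in 6-decimal raw units
    let on_chain = align_incoming(usdt_in, 6).unwrap();
    assert_eq!(align_outgoing(on_chain, 6).unwrap(), usdt_in);

    // `places > base` (DAI-like, 18 decimals): the lowest 8 decimal places are
    // cut off on the way in, so less than 10^8 raw units (i.e. less than
    // 10^-10 DAI) of dust are lost per transfer.
    let dai_in: u128 = 1_000_000_000_012_345_678; // ~1 DAI in 18-decimal raw units
    let on_chain = align_incoming(dai_in, 18).unwrap();
    let dai_out = align_outgoing(on_chain, 18).unwrap();
    assert!(dai_in - dai_out < 100_000_000);
}
```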