Platform: Code4rena
Start Date: 24/10/2023
Pot Size: $149,725 USDC
Total HM: 7
Participants: 52
Period: 21 days
Judge: ronnyx2017
Total Solo HM: 2
Id: 300
League: ETH
Rank: 3/52
Findings: 2
Award: $15,736.55
🌟 Selected for report: 1
🚀 Solo Findings: 1
🌟 Selected for report: 0xBeirao
15619.0396 USDC - $15,619.04
eBTC borrows the corrected stake mechanism from Liquity for reward distributions. (paper)
Doing a purely collateral-proportional redistribution would over-reward fresh cdps and under-reward older ones. The redistribution must deal with the fact that some fraction of a cdp's entire collateral is the accumulated reward from previous liquidations, and this fraction varies across cdps.
The corrected stake was introduced by Liquity to correct for this.
The intuition behind the choice of corrected stake is that the corrected stake effectively models the fresh trove’s collateral as a total collateral, which includes ‘virtual’ accumulated rewards. The corrected stake earns rewards for the trove as if the trove had been in the system from the beginning - thus maintaining proportional reward growth.
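The corrected-stake idea can be sketched numerically. The following is a minimal Python model (illustrative only, not eBTC or Liquity code; the snapshot values are arbitrary examples) showing that a fresh trove's stake is scaled so it earns rewards as if it had held 'virtual' accumulated rewards from the start:

```python
from fractions import Fraction as F

# System snapshots taken at the last liquidation (example values):
total_stakes_snapshot = F(100)  # sum of all corrected stakes
total_coll_snapshot = F(120)    # total collateral, incl. accumulated rewards

def corrected_stake(coll):
    # A fresh trove's stake is scaled down, because part of the system's
    # collateral is 'virtual' accumulated reward it never earned.
    return coll * total_stakes_snapshot / total_coll_snapshot

# A fresh trove depositing 12 collateral gets a stake of 10: the same
# stake as an original trove whose collateral grew from 10 to 12
# through rewards — so future rewards stay proportional.
assert corrected_stake(F(12)) == F(10)
```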
eBTC makes some modifications to the original design that make the protocol vulnerable to an order break in the sorted list.
Let’s explain why.
Initially, `totalStakes == totalCollateral` because no fees have been taken yet. Then, as the stETH value increases, the protocol takes 50% of the yield (if the stETH APR is 5%, eBTC takes 2.5% as fees): CdpManagerStorage.sol#L509-L521
```solidity
function _calcSyncedGlobalAccounting(
    uint256 _newIndex,
    uint256 _oldIndex
) internal view returns (uint256, uint256, uint256) {
    if (_newIndex > _oldIndex && totalStakes > 0) {
        /// @audit-ok We don't take the fee if we had a negative rebase
        (
            uint256 _feeTaken,
            uint256 _deltaFeePerUnit,
            uint256 _perUnitError
        ) = calcFeeUponStakingReward(_newIndex, _oldIndex);

        // calculate new split per stake unit
        uint256 _newPerUnit = systemStEthFeePerUnitIndex + _deltaFeePerUnit;

        return (_feeTaken, _newPerUnit, _perUnitError);
    } else {
        return (0, systemStEthFeePerUnitIndex, systemStEthFeePerUnitIndexError);
    }
}
```
The `_feeTaken` goes directly to the protocol and reduces the `totalCollateral` value (because the internal accounting uses shares, i.e. wstETH): CdpManagerStorage.sol#L585
activePool.allocateSystemCollSharesToFeeRecipient(_feeTaken);
Because the stake is calculated like this: CdpManagerStorage.sol#L454
stake = (_coll * totalStakesSnapshot) / totalCollateralSnapshot;
Now the stake of a newly opened CDP will be higher than the stakes of old CDPs, since `totalStakesSnapshot` has not moved while `totalCollateralSnapshot` has decreased.
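A toy model of this effect (Python with exact arithmetic, illustrative values only; not eBTC code) makes the stake inflation concrete:

```python
from fractions import Fraction as F

total_stakes_snapshot = F(100)
total_coll_snapshot = F(100)  # initially totalStakes == totalCollateral

def stake(coll):
    # stake = _coll * totalStakesSnapshot / totalCollateralSnapshot
    return coll * total_stakes_snapshot / total_coll_snapshot

old_stake = stake(F(10))  # CDP opened before any fee split

# The protocol claims a fee: collateral shares leave the system, so
# totalCollateralSnapshot drops while totalStakesSnapshot is untouched.
total_coll_snapshot -= F(2)  # hypothetical _feeTaken of 2 shares

new_stake = stake(F(10))  # identical CDP opened after the split
assert new_stake > old_stake
# The new CDP carries a larger stake, so it absorbs a larger share of
# future fee splits and debt redistributions than the old CDP does.
```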
⇒ Since debt redistributions and fee payments depend on the stake to distribute funds, we can intuitively understand that new CDPs will pay more fees than old ones. Therefore, the NICR (Nominal Individual Collateral Ratio = shares/debt) of new CDPs will decrease at a higher rate than that of old ones.
This can cause old CDPs to cross newer ones and break the list order.
Once the order is broken, the insertion position is no longer unique, which can lead to more and more disorder in the list and to unexpected behaviours such as DoS and a bad redemption order.
This PoC only takes advantage of fee splitting to break the list order, but note that debt redistribution events exacerbate the problem.
```solidity
function test_H1() public {
    vm.pauseGasMetering();

    address firstUser;
    address lastUser;
    bytes32 firstCdpId;
    bytes32 lastCdpId;
    uint loop = 100;

    /// let's open 100 cdps and save the 1st and last index
    for (uint256 i = 0; i < loop; i++) {
        (uint256 oldIndex, uint256 newIndex) = _applyIndexChange(2000000000000000); // 0.2% stETH increase

        // get a random user
        address user = _utils.getNextUserAddress();
        vm.startPrank(user);

        // Randomize collateral amount used
        vm.deal(user, 10 ether * 1000);
        collateral.approve(address(borrowerOperations), 10 ether * 1000);
        collateral.deposit{value: 10 ether * 1000}();

        uint shareAmount = 10 ether;
        uint256 collAmount = collateral.getPooledEthByShares(shareAmount);

        uint256 borrowedAmount;
        if (i == loop - 1)
            // here we compute borrowedAmount to make lastCdpId NICR very close to firstCdpId NICR
            borrowedAmount =
                (1e20 * (shareAmount - collateral.getSharesByPooledEth(2e17))) /
                cdpManager.getSyncedNominalICR(firstCdpId); // borrowedAmount = 0.65764 ether;
        else borrowedAmount = 0.5 ether;

        bytes32 id = borrowerOperations.openCdp(borrowedAmount, "hint", "hint", collAmount);

        if (i == 0) {
            firstUser = user;
            firstCdpId = id;
        }
        if (i == loop - 1) {
            lastUser = user;
            lastCdpId = id;
        }

        vm.stopPrank();
    }

    logNICR(firstCdpId, lastCdpId);

    // NICR 1st trove should be < NICR last trove
    assertLe(cdpManager.getSyncedNominalICR(firstCdpId), cdpManager.getSyncedNominalICR(lastCdpId));

    /// Let's increase the stETH by 40% and open a last cdp
    (uint256 oldIndex, uint256 newIndex) = _applyIndexChange(400000000000000000); // 40% stETH increase

    // get a random user
    address user = _utils.getNextUserAddress();
    vm.startPrank(user);

    uint256 collAmount = 10 ether;
    // deal ETH and deposit for collateral
    vm.deal(user, collAmount * 1000);
    collateral.approve(address(borrowerOperations), collAmount);
    collateral.deposit{value: collAmount * 1000}();

    uint borrowedAmount = 0.5 ether;
    borrowerOperations.openCdp(borrowedAmount, "hint", "hint", collAmount);
    vm.stopPrank();

    logNICR(firstCdpId, lastCdpId);

    // NICR 1st trove should be < NICR last cdp but it's not the case
    assertLe(cdpManager.getSyncedNominalICR(firstCdpId), cdpManager.getSyncedNominalICR(lastCdpId));
}

function logNICR(bytes32 firstCdpId, bytes32 lastCdpId) public {
    console.log("---------------------------------- 1st cdp");
    console.log("getCdpStake : ", cdpManager.getCdpStake(firstCdpId));
    console.log("getCdpCollShares : ", cdpManager.getCdpCollShares(firstCdpId));
    console.log("getSyncedNominalICR : ", cdpManager.getSyncedNominalICR(firstCdpId));
    console.log("---------------------------------- last cdp");
    console.log("getCdpStake : ", cdpManager.getCdpStake(lastCdpId));
    console.log("getCdpCollShares : ", cdpManager.getCdpCollShares(lastCdpId));
    console.log("getSyncedNominalICR : ", cdpManager.getSyncedNominalICR(lastCdpId));
    console.log("---");
    console.logInt(
        int(int(cdpManager.getSyncedNominalICR(firstCdpId)) - int(cdpManager.getSyncedNominalICR(lastCdpId)))
    );
    console.log("----------------------------------");
}
```
This test fails on the last assertion, meaning that the list is no longer sorted.
This is hard to fix because the sorted list was not designed to work this way.
I haven't been able to find a quick fix that preserves the list order with a fair redistribution.
Even though it is less fair to borrowers, a proportional distribution (without the corrected stake) could solve this problem, but it would be an incomplete solution.
The eBTC team needs to rethink the fee/debt redistribution to maintain the list order.
Math
#0 - c4-pre-sort
2023-11-15T09:11:28Z
bytes032 marked the issue as insufficient quality report
#1 - c4-pre-sort
2023-11-17T07:38:05Z
bytes032 marked the issue as remove high or low quality report
#2 - c4-pre-sort
2023-11-17T07:38:10Z
bytes032 marked the issue as sufficient quality report
#3 - c4-pre-sort
2023-11-17T07:38:16Z
bytes032 marked the issue as duplicate of #63
#4 - GalloDaSballo
2023-11-23T11:49:09Z
We are still looking into this finding which:
#5 - GalloDaSballo
2023-11-23T17:31:54Z
The impact is incorrect
The way Stake changes incorrectly causes fee split is valid
We're investigating impact and consequences
#6 - GalloDaSballo
2023-11-23T17:32:19Z
Not a dup of 63 this seems to be a solo finding
#7 - c4-judge
2023-11-25T11:51:29Z
jhsagd76 marked the issue as not a duplicate
#8 - jhsagd76
2023-11-27T09:33:12Z
I mark this as satisfactory but will wait for the final investigation result from the sponsor.
#9 - c4-judge
2023-11-27T09:33:30Z
jhsagd76 marked the issue as satisfactory
#10 - jhsagd76
2023-11-27T11:26:23Z
@GalloDaSballo
#11 - GalloDaSballo
2023-11-27T12:34:00Z
The conclusion is wrong
But the finding showed that the feeSplit math is incorrect, which is a severe finding
#12 - c4-judge
2023-11-28T03:27:50Z
jhsagd76 marked the issue as selected for report
#13 - rayeaster
2023-11-29T02:10:39Z
It is more like a precision issue in Solidity (something like $\frac{(x -z) + (z - y)}{c} \neq \frac{x - z}{c} + \frac{z - y}{c}$ under integer division) rather than a fundamental design problem.
The following statement is worth more thinking:
we can intuitively understand that new CDPs will pay more fees than the old ones. Therefore, new cpds NICR (Nominal Individual Collateral Ratio = shares/debt) will decrease at a higher rate than old ones
Yes, a higher stake leads to more fees, but that is NOT the same as "decrease at a higher rate"; actually, the relative ratio of NICR between two CDPs always stays the same, "theoretically", under the stake mechanism:
- **T0**: `totalStakeSnapshot` is $S_{n}$ (which is not changed by a split fee claim) and `totalCollateralSnapshot` is $C_{n0}$
- **T1**: a $CDP_{1}$ is created with initial collateral share $c$ and debt $d$, thus its stake is $s_{1}=\frac{c * S_{n}}{C_{n0}}$
- **T2**: a split fee claim happens and `totalCollateralSnapshot` changes to $C_{n1}$; note that $C_{n1} < C_{n0}$ and $C_{n0} - C_{n1} = S_{n} * \Delta{P_{1}}$, where $\Delta{P_{1}}$ is the delta change of fee per stake unit
- **T3**: another $CDP_{2}$ is created with the same collateral share $c$ and debt $d$, but now its stake is $s_{2}=\frac{c * S_n}{C_{n1}} > s_{1}$. The NICR of $CDP_{1}$ is $\frac{c - s_{1} * \Delta{P_{1}}}{d}$ and the NICR of $CDP_{2}$ is $\frac{c}{d} > NICR_{1}$; their NICR ratio is $r_{1} = \frac{c - s_{1} * \Delta{P_{1}}}{c}$
- **T4**: (maybe after a long time) a split fee claim happens again, and $\Delta{P_{2}}$ is the delta change of fee per stake unit this time. The NICR of $CDP_{1}$ is now $\frac{c - s_{1} * \Delta{P_{1}} - s_{1} * \Delta{P_{2}}}{d}$ and the NICR of $CDP_{2}$ is $\frac{c - s_{2} * \Delta{P_{2}}}{d}$, so their NICR ratio is now $r_{2} = \frac{c - s_{1} * \Delta{P_{1}} - s_{1} * \Delta{P_{2}}}{c - s_{2} * \Delta{P_{2}}}$

If $\frac{c - s_{1} * \Delta{P_{1}} - s_{1} * \Delta{P_{2}}}{c - s_{2} * \Delta{P_{2}}} = \frac{c - s_{1} * \Delta{P_{1}}}{c} \Rightarrow $
$(c - s_{1} * \Delta{P_{1}}) * (c - s_{2} * \Delta{P_{2}}) = c * (c - s_{1} * \Delta{P_{1}} - s_{1} * \Delta{P_{2}}) \Rightarrow $
$c^2 - c * s_{2} * \Delta{P_{2}} - c * s_{1} * \Delta{P_{1}} + s_{1} * s_{2} * \Delta{P_{1}} * \Delta{P_{2}} = c^2 - c * s_{1} * \Delta{P_{1}} - c * s_{1} * \Delta{P_{2}} \Rightarrow $
$s_{1} * s_{2} * \Delta{P_{1}} = c * (s_{2} - s_{1}) \Rightarrow \frac{s_{1}}{c} * \Delta{P_{1}} = 1 - \frac{s_{1}}{s_{2}} \Rightarrow$
$\frac{S_{n}}{C_{n0}} * \Delta{P_{1}} = 1 - \frac{C_{n1}}{C_{n0}} \Rightarrow S_{n} * \Delta{P_{1}} = C_{n0} - C_{n1} $
Proved
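The identity can be double-checked with exact rational arithmetic. Below is an illustrative Python sketch with arbitrary example values (independent of eBTC's fixed-point math), following the T0–T4 timeline above:

```python
from fractions import Fraction as F

Sn, Cn0 = F(100), F(120)  # snapshots at T0 (example values)
c, d = F(10), F(5)        # collateral share and debt of both CDPs

dP1 = F(1, 50)            # fee per stake unit claimed at T2
Cn1 = Cn0 - Sn * dP1      # definition of the split at T2
dP2 = F(3, 100)           # second fee per stake unit, at T4

s1 = c * Sn / Cn0         # stake of CDP1 (opened at T1)
s2 = c * Sn / Cn1         # stake of CDP2 (opened at T3, larger)

r1 = (c - s1 * dP1) / c                          # NICR ratio after T3
r2 = (c - s1 * dP1 - s1 * dP2) / (c - s2 * dP2)  # NICR ratio after T4

assert s2 > s1
assert r1 == r2  # the ratio is invariant, exactly as the proof shows
```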
If we could manually syncAccounting
for the first CDP along the PoC test path (i.e., always sync its collateral for every index increase in the loop), or simply set the loop number to 1000 instead of 100, the list order would not break at the end of the PoC test.
Moreover, the break requires some delicate conditions, such as a significant increase of the Lido stETH index: e.g., a 40% increase means about 10 years at the current staking reward APR.
I suggest marking it as "acknowledged"
#14 - jhsagd76
2023-11-30T12:09:14Z
Thanks for the careful and clear math proof from @rayeaster; it's very helpful to show the underlying mechanism of the issue. It's amazing. I want to add a comment about s2 in T3. There is no need to worry about the impact of c and s1 on Sn and Cn1 in T1, as they grow proportionally. The mathematical calculations in T3 are completely correct, and you can prove this further. I have actually done it myself, but I'm not very good at writing mathematical formulas in Markdown, so I'm only providing this conclusion to prevent any doubts.
And I add one line at the end of logNICR function in the original poc:
console.log(cdpManager.getSyncedNominalICR(firstCdpId) * 1e18 / cdpManager.getSyncedNominalICR(lastCdpId));
The console log clearly shows why the sponsor says it's a precision issue:
```
Logs:
  before
  ---------------------------------- 1st cdp
  getCdpStake :  9800399201596806387
  getCdpCollShares :  9800399201596806387
  getSyncedNominalICR :  1791162047648499053200
  ---------------------------------- last cdp
  getCdpStake :  10760678217311938700
  getCdpCollShares :  9833333333333333333
  getSyncedNominalICR :  1791162047648499053418
  ---
  -218
  999999999999999999
  ----------------------------------
  after
  ---------------------------------- current cdp
  getCdpStake :  7660143815713583482
  getCdpCollShares :  6125000000000000000
  getSyncedNominalICR :  1225000000000000000000
  ----------------------------------
  ---------------------------------- 1st cdp
  getCdpStake :  9800399201596806387
  getCdpCollShares :  9800399201596806387
  getSyncedNominalICR :  1567266791692436672600
  ---------------------------------- last cdp
  getCdpStake :  10760678217311938700
  getCdpCollShares :  9833333333333333333
  getSyncedNominalICR :  1567266791692436672219
  ---
  381
  1000000000000000000
  ----------------------------------
```
In fact, this variation is not linear. It requires the growth of the index to occur at an extremely precise value in order to generate this error.
And I also want to provide a PoC test for the amazing math proof from @rayeaster, thanks to him again.
```solidity
function test_H2prove() public {
    vm.pauseGasMetering();

    address cdp1_user;
    address cdp2_user;
    bytes32 cdp1_id;
    bytes32 cdp2_id;

    // stake something to init
    {
        address _nobody = prepareUser();
        vm.prank(_nobody);
        borrowerOperations.openCdp(1.337 ether, "hint", "hint", 31.415 ether);
        _applyIndexChange(1000000000000000); // +0.1% index
    }

    // T0
    {
        cdpManager.syncGlobalAccountingAndGracePeriod();
        uint Sn = cdpManager.totalStakesSnapshot();
        uint Cn0 = cdpManager.totalCollateralSnapshot();
        console.log(Sn, Cn0);
    }

    // const
    uint c = 10 ether;
    uint d = 0.5 ether;

    // T1
    cdp1_user = prepareUser();
    vm.prank(cdp1_user);
    cdp1_id = borrowerOperations.openCdp(d, "hint", "hint", c);
    // uint s1 = cdpManager.getCdpCollShares(cdp1_id);
    // console.log(s1);

    // T2
    {
        int delta_p1 = 2000000000000000; // +0.2% index
        _applyIndexChange(delta_p1);
        cdpManager.syncGlobalAccountingAndGracePeriod();
        // vm.prank(cdp1_user);
        // cdpManager.syncAccounting(cdp1_id);
    }

    // T3
    cdp2_user = prepareUser();
    vm.prank(cdp2_user);
    // c = 17 ether;
    // d = 1.1 ether;
    cdp2_id = borrowerOperations.openCdp(d, "hint", "hint", c);
    // uint s2 = cdpManager.getCdpCollShares(cdp2_id);
    // console.log(s2);

    // after T3, we should calc NICR rate
    uint wad = 1e18;
    {
        uint cdp1_NICR_T3 = cdpManager.getSyncedNominalICR(cdp1_id);
        uint cdp2_NICR_T3 = cdpManager.getSyncedNominalICR(cdp2_id);
        uint rate_T3 = cdp1_NICR_T3 * wad / cdp2_NICR_T3;
        console.log(rate_T3);
    }

    // T4
    {
        int delta_p2 = 400000000000000000; // +40% index
        _applyIndexChange(delta_p2);
        cdpManager.syncGlobalAccountingAndGracePeriod();

        // dump rate after T4
        uint cdp1_NICR_T4 = cdpManager.getSyncedNominalICR(cdp1_id);
        uint cdp2_NICR_T4 = cdpManager.getSyncedNominalICR(cdp2_id);
        uint rate_T4 = cdp1_NICR_T4 * wad / cdp2_NICR_T4;
        console.log(rate_T4);
        // console.log(cdpManager.getCdpCollShares(cdp1_id), cdpManager.getCdpCollShares(cdp2_id));
    }
}
```
So, finally, I will mark this as QA, but awaiting a double check from the sponsors @GalloDaSballo @rayeaster
#15 - rayeaster
2023-11-30T13:08:45Z
@jhsagd76 Thanks for the great proof test!
It helps to spark an interesting idea. Do you think it would help to use something like `cdpManager.getSyncedNominalICR(firstCdpId) * 1e18 / cdpManager.getSyncedNominalICR(lastCdpId)` to compare the NICRs as an additional check?
I mean we could say $NICR_{1}$ is larger than $NICR_{2}$ only if the following two conditions are satisfied at the same time:
#16 - jhsagd76
2023-11-30T15:08:48Z
Aha, I think this check may be idealistic. The PoC we are currently discussing only involves a single rounding situation, but the actual scenario may be much more complex. Each syncAccounting after a rebase introduces rounding, but I believe this will actually alleviate the impact of the issue.
#17 - rayeaster
2023-12-01T02:18:03Z
If we add bad debt redistribution into the play, the ratio of NICR between CDPs would change but will NOT go to the extent of reversing the ordering, theoretically. Suppose a bad debt redistribution happens after T3, and let $\Delta{d_{1}}$ be the delta change of `systemDebtRedistributionIndex` (the debt redistributed per stake unit).

The NICR of $CDP_{1}$ is $\frac{c - s_{1} * \Delta{P_{1}}}{d + s_{1} * \Delta{d_{1}}}$, and the NICR of $CDP_{2}$ is $\frac{c}{d + s_{2} * \Delta{d_{1}}}$. Let us prove that their NICR ratio still keeps the ordering:

$ratio_{NICR} = \frac{c - s_{1} * \Delta{P_{1}}}{c} * \frac{d + s_{2} * \Delta{d_{1}}}{d + s_{1} * \Delta{d_{1}}} = \frac{s_{1}}{s_{2}} * \frac{d + s_{2} * \Delta{d_{1}}}{d + s_{1} * \Delta{d_{1}}} = \frac{s_{1} * s_{2} * \Delta{d_{1}} + s_{1} * d}{s_{1} * s_{2} * \Delta{d_{1}} + s_{2} * d} \lt 1$
Proved
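This bound can also be sanity-checked with exact rational arithmetic (an illustrative Python sketch with arbitrary example values; the snapshots and deltas are assumptions, not eBTC parameters):

```python
from fractions import Fraction as F

Sn, Cn0 = F(100), F(120)  # snapshots (example values)
c, d = F(10), F(5)        # collateral share and debt of both CDPs
dP1 = F(1, 50)            # fee per stake unit of the split at T2
Cn1 = Cn0 - Sn * dP1
dd1 = F(1, 25)            # hypothetical bad debt per stake unit

s1 = c * Sn / Cn0         # stake of CDP1 (opened before the split)
s2 = c * Sn / Cn1         # stake of CDP2 (opened after; s2 > s1)

nicr1 = (c - s1 * dP1) / (d + s1 * dd1)
nicr2 = c / (d + s2 * dd1)
ratio = nicr1 / nicr2

# Closed form from the derivation above:
closed = (s1 * s2 * dd1 + s1 * d) / (s1 * s2 * dd1 + s2 * d)
assert ratio == closed
assert ratio < 1  # redistribution shifts the ratio but never flips the order
```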
So the whole story would be:
#18 - jhsagd76
2023-12-04T23:40:27Z
The impact based on bad debt redistribution is valid, but it can't break the cdp order. Considering the fixes already introduced in Liquity, I suggest considering the impact of this issue in long-running systems. Therefore, I mark this issue as med.
#19 - c4-judge
2023-12-04T23:40:34Z
jhsagd76 changed the severity to 2 (Med Risk)
#20 - CodingNameKiki
2023-12-09T10:39:50Z
🌟 Selected for report: SpicyMeatball
Also found by: 0xBeirao, 7ashraf, LokiThe5th, OMEN, TrungOre, alexzoid, alpha, bdmcbri, ether_sky, fatherOfBlocks, ge6a, hihen, hunter_w3b, jasonxiale, ladboy233, lsaudit, niroh, nobody2018, nonseodion, peanuts, prapandey031, shaka, twcctop, twicek, wangxx2026
117.508 USDC - $117.51
The initial Liquity design has a redemption fee and a borrowing (issuance) fee that track the same base rate.
Because eBTC's collateral is stETH and the fee taken is 50% of the yield generated by staking, the eBTC team decided to remove the initial Liquity borrowing fee. However, I believe this fee is a key component of Liquity's price stability.
Let’s first explain the Liquity design:
The borrowing fee acts as an initial cost for borrowers to obtain LUSD. Similar to the redemption fee, it automatically increases after each redemption, indicating a potential undervaluation of LUSD against USD. This fee gradually decreases in the absence of redemptions. Increased fees immediately reduce the attractiveness of new loans, acting as a disincentive to create additional LUSD when demand is insufficient.
Unlike an interest rate increase, a higher borrowing fee has an immediate impact on new loans. Existing loans are not directly affected, but if a loan is closed due to redemption, the owner may hesitate to reopen the position while the fee is high. This precaution ensures that the LUSD burned by redemptions isn't immediately replaced by new loans, contributing to the measured growth or contraction of the monetary base.
Intuitively, the issuance fee should be in the same range or even be identical to the current redemption fee. The reason is this: If the redemption mechanism succeeds to push a lower price back to parity, new loans will become more attractive […]. If the QTM holds, the resulting base rate will roughly correspond to the difference in percentage points between parity and the previous price level. By charging the same base rate as an issuance fee, we can basically cancel out the attractiveness gain for loans due to the restored peg.
(Source : https://www.liquity.org/blog/on-price-stability-of-liquity)
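The behaviour described in the quote can be sketched as a toy Python model. This is illustrative only: the decay factor and the bump rule are hypothetical stand-ins, not Liquity's exact parameters.

```python
# Toy model of a Liquity-style base rate: redemptions push it up, time
# decays it, and the SAME rate prices both redemptions and new borrows.
DECAY_PER_HOUR = 0.94  # hypothetical hourly decay factor
FEE_FLOOR = 0.005      # 0.5% minimum redemption fee

base_rate = 0.0

def on_redemption(redeemed_fraction):
    # Redeeming a fraction of the supply bumps the base rate.
    global base_rate
    base_rate += redeemed_fraction / 2

def decay(hours):
    global base_rate
    base_rate *= DECAY_PER_HOUR ** hours

def redemption_fee():
    return max(FEE_FLOOR, base_rate)

def borrowing_fee():
    # The point of this finding: Liquity charges this, eBTC does not.
    return base_rate

on_redemption(0.02)             # 2% of supply redeemed -> rate rises to 1%
assert borrowing_fee() == 0.01  # new loans immediately cost more
decay(24)
assert borrowing_fee() < 0.01   # incentive normalizes as the peg recovers
```

The key property is that the issuance fee cancels out the attractiveness gain for new loans right after redemptions restore the peg, so burned supply is not immediately re-minted.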
The point behind this issue is that one of the key elements that makes LUSD one of the most stable stablecoins out there is not implemented in eBTC. Especially when a stablecoin is new and doesn't have a lot of use cases, the selling pressure can be high and cause a depeg (just like GHO right now). Implementing a borrowing fee that disincentivizes new loans during a depeg can help maintain stability.
Context: eBTC is new and there is not much to do with it in DeFi yet, so the selling pressure is high:
and so on…
As you can see, a borrowing fee can limit this effect by contracting the supply.
eBTC should implement a fixed borrowing fee and follow the initial Liquity design.
Another solution may be to charge a borrowing fee only if the redemption fee is greater than MIN_REDEMPTION_FEE_FLOOR
(0.5%). This way, no borrowing fee is taken under normal conditions, and a fee is charged when new loan creation needs to be limited.
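That second suggestion can be sketched as follows (a minimal Python model; the function name, the basis-point representation, and the exact linkage to the redemption fee are assumptions for illustration, not eBTC code):

```python
MIN_REDEMPTION_FEE_FLOOR_BPS = 50  # 0.5% floor, expressed in basis points

def borrowing_fee_bps(redemption_fee_bps):
    # Charge a borrowing fee only while redemptions signal a depeg,
    # i.e. while the redemption fee sits above its floor.
    if redemption_fee_bps > MIN_REDEMPTION_FEE_FLOOR_BPS:
        return redemption_fee_bps - MIN_REDEMPTION_FEE_FLOOR_BPS
    return 0

assert borrowing_fee_bps(50) == 0     # normal conditions: no borrow fee
assert borrowing_fee_bps(200) == 150  # depeg pressure: new loans pay 1.5%
```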
https://www.liquity.org/blog/on-price-stability-of-liquity
https://docs.liquity.org/faq/lusd-redemptions#how-does-lusd-closely-follow-the-price-of-usd
Other
#0 - c4-pre-sort
2023-11-17T14:53:10Z
bytes032 marked the issue as insufficient quality report
#1 - c4-sponsor
2023-11-20T09:06:50Z
GalloDaSballo (sponsor) disputed
#2 - GalloDaSballo
2023-11-20T09:06:57Z
Need actual math
#3 - GalloDaSballo
2023-11-22T12:37:59Z
Also redemption fees are 1% due to oracle drift (check cheatsheet)
#4 - jhsagd76
2023-11-27T07:55:24Z
Interesting, but I cannot classify this as med without sufficient evidence.
#5 - c4-judge
2023-11-27T07:55:31Z
jhsagd76 marked the issue as unsatisfactory: Insufficient proof
#6 - jhsagd76
2023-12-07T02:56:30Z
Elevating this issue to a valid QA and recommending the sponsor to consider this suggestion. Although I don't consider it a vulnerability or a specific functional error, I believe it would be beneficial for the protocol's stability.
#7 - c4-judge
2023-12-07T02:56:52Z
jhsagd76 removed the grade
#8 - c4-judge
2023-12-07T02:56:57Z
jhsagd76 changed the severity to QA (Quality Assurance)
#9 - c4-judge
2023-12-07T02:57:05Z
jhsagd76 marked the issue as grade-a
#10 - jhsagd76
2023-12-07T03:25:37Z
Elevating this issue to a valid QA and recommending the sponsor to consider this suggestion. Although I don't consider it a vulnerability or a specific functional error, I believe it would be beneficial for the protocol's stability.
To further explain this suggestion: common over-collateralized pegged assets typically charge fees to open a position / borrow / mint, such as LUSD and YUSD, usually around 0.5%. eBTC does not have such fees because the protocol charges fees on the stETH rebase. However, I have noticed that while this mechanism enhances capital efficiency, it also introduces some issues that are difficult to mitigate. For example
If we consider this fee as a defense rather than a profit, we can set it at a low level to maintain capital efficiency. For example, based on the current stETH rate of 3.6% - 4% divided by 365, a fee of about 0.01% sounds good, IMO.
Kindly pinging the sponsors to make sure they won't miss this @GalloDaSballo @CodingNameKiki