Platform: Code4rena
Start Date: 14/09/2022
Pot Size: $50,000 USDC
Total HM: 25
Participants: 110
Period: 5 days
Judge: hickuphh3
Total Solo HM: 9
Id: 162
League: ETH
Rank: 3/110
Findings: 12
Award: $4,430.88
🌟 Selected for report: 2
🚀 Solo Findings: 1
🌟 Selected for report: carrotsmuggler
Also found by: 0x52, 0xDecorativePineapple, 0xPanas, Bahurum, Jeiwan, Lambda, PwnPatrol, R2, Respx, auditor0517, durianSausage, hyh, ladboy233, pauliax, scaraven, teawaterwire, zzzitron
36.6124 USDC - $36.61
When `priceFeed1.decimals() == 8` (USD pairs), `PegOracle` correctly scales the price to 8 decimals. However, when both feeds have 18 decimals (ETH pairs), the scaling is not performed correctly, causing prices to be 0 and `Controller.getLatestPrice` to always revert.
Let's say a vault for AAVE/BTC should be deployed. There is no such Chainlink feed, but there is a feed for AAVE/ETH (https://etherscan.io/address/0x6df09e975c830ecae5bd4ed9d90f3a95a4f88012) and one for BTC/ETH (https://etherscan.io/address/0xdeb288f737066589598e9214e782fa5a8ed689e8). As one can see on the Etherscan pages, both of these return prices with 18 decimals (as every */ETH pair does).
If such a configuration is used, `nowPrice` will be scaled by 0 decimals (as `decimals10 = 10 ** 0 = 1`), meaning it will have 4 decimals. `latestRoundData()` then returns `nowPrice / 1000000`, which scales the price to -2 decimals, meaning it will be 0 in most cases and cause `Controller.getLatestPrice` to revert.
Because `nowPrice` always has 4 decimals in line 73 (note that the prices, which have the same number of decimals, are divided and the result is multiplied by 10^4 in the lines above), it should be multiplied by 10**4 to reach 8 decimals, no matter how many decimals the price feeds have.
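A minimal sketch of the suggested fix (illustrative; the variable name follows the finding's description, not necessarily the exact source): since the ratio already has 4 decimals at this point, scale it by a constant `10**4` instead of a factor derived from `priceFeed1.decimals()`.

```solidity
// Sketch: `nowPrice` holds the feed ratio with 4 decimals at this point
// (two equally-scaled prices were divided and multiplied by 10**4 above).
function scaleTo8Decimals(int256 nowPrice) internal pure returns (int256) {
    // Constant factor: 4 decimals -> 8 decimals, independent of the feeds' decimals,
    // so 18-decimal (*/ETH) feeds no longer truncate the price to 0.
    return nowPrice * 10**4;
}
```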
#0 - HickupHH3
2022-10-18T10:10:41Z
dup #195
125.2594 USDC - $125.26
https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/Controller.sol#L169 https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/Controller.sol#L225
In `Controller.triggerEndEpoch` and `Controller.triggerDepeg`, the transfers between the two vaults are initiated even if one of the vaults had no deposits. This situation can occur during normal operation. For instance, there might have been no demand for the insurance of a particular peg, but there were still capital providers. In such a situation, the money is lost without a way to recover it, because no one can call `withdraw` on a vault that had no deposits (and no further deposits for this epoch are possible).
Let's say a fund provides $1 million to insure against a depeg of USDC/USD. However, there is no demand for this insurance, because people think that USDC is safe. One week before the end of the epoch, a depeg event still occurs and a bot calls `Controller.triggerDepeg`. Now, this million is transferred from the risk vault to the insurance vault. However, it will be locked up there forever, because nobody can call `withdraw` with this epoch end time.
Do not initiate a transfer to vaults where no one deposited for the given epoch.
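A hedged sketch of the check (function and parameter names are assumptions based on the linked code, not an exact diff): before moving funds in `triggerEndEpoch`/`triggerDepeg`, require that both vaults actually received deposits for the epoch.

```solidity
// Sketch (assumed names): revert instead of sweeping funds into a vault
// from which nobody will ever be able to withdraw.
function _requireBothSidesFunded(Vault insrVault, Vault riskVault, uint256 epochEnd) internal view {
    require(insrVault.totalAssets(epochEnd) != 0, "no insurance deposits for this epoch");
    require(riskVault.totalAssets(epochEnd) != 0, "no risk deposits for this epoch");
}
```

Alternatively, such an epoch could be settled so that the side that did deposit can simply withdraw its own funds.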
#0 - 3xHarry
2022-09-22T10:26:08Z
dup #409
🌟 Selected for report: 0xDecorativePineapple
1066.9441 USDC - $1,066.94
Because `PegOracle` first divides and then multiplies again, there is a significant loss of precision. This can cause an asset to be marked as depegged although it is not, meaning a payout is triggered when it should not be.
Let's say a vault is configured with a `strikePrice` of 99999000, i.e. 0.99999. The current ratio between the two assets is 0.999991, i.e. above the peg. However, `nowPrice` in line 68 will be set to 9999, as this is the maximum precision for this line. This is then multiplied by 10^10 and divided by 1000000 again, resulting in 99990000. The returned value is therefore considered to be below the peg (although the real ratio is not), which causes a depeg event.
Multiply before dividing to preserve precision.
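A minimal sketch of the ordering (illustrative; the concrete scaling factor depends on the desired output decimals):

```solidity
// Sketch: scale up before dividing so no precision is lost.
// Example: price1 = 999991, price2 = 1000000 (ratio 0.999991)
//   old order: (price1 * 10**4) / price2 = 9999, later rescaled to 99990000
//   new order: (price1 * 10**8) / price2 = 99999100, i.e. the full 8-decimal ratio
function pegRatio8Decimals(int256 price1, int256 price2) internal pure returns (int256) {
    return (price1 * 10**8) / price2;
}
```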
#0 - MiguelBits
2022-10-03T20:55:15Z
#1 - HickupHH3
2022-10-18T09:47:14Z
dup #323
388.9011 USDC - $388.90
When `PegOracle` is configured for tokenA/tokenB, it reports either the tokenA/tokenB or the tokenB/tokenA ratio, depending on which token is more valuable. This can lead to undesired depeg events and cause a monetary loss for the risk takers.
Let's say the oracle is configured with stETH/USD and ETH/USD. Furthermore, vaults are deployed to pay out when the stETH/ETH peg falls below 0.95. If for some reason stETH suddenly appreciates in value while ETH does not and the stETH/ETH ratio goes above 1.05, the oracle will report a peg of less than 0.95, which will cause a depeg event. This is not intended: the vault was intended to insure against situations where stETH/ETH falls below 0.95, not where it rises above 1.05.
Furthermore, if a direct stETH/ETH oracle were used, this would not happen. The correct functioning of the protocol should not depend on whether a direct oracle or a `PegOracle` is used.
Change the logic for `PegOracle` such that only one direction is considered.
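A sketch of the directional fix (assumed structure, based on the finding's description): always compute the ratio in one fixed direction instead of inverting it when the first price exceeds the second.

```solidity
// Sketch (assumed names): always report price1/price2 (tokenA priced in tokenB).
// Per the finding, the current logic flips the ratio depending on which price is
// larger, which lets an appreciation of tokenA trigger a "depeg".
function pegRatio4Decimals(int256 price1, int256 price2) internal pure returns (int256) {
    return (price1 * 10000) / price2; // values above 10000 simply mean tokenA trades above the peg
}
```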
#0 - HickupHH3
2022-11-01T13:43:16Z
388.9011 USDC - $388.90
https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/Vault.sol#L226 https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/SemiFungibleVault.sol#L205
EIP-4626 states the following:
The convertTo functions serve as rough estimates that do not account for operation specific details like withdrawal fees, etc. They were included for frontends and applications that need an average value of shares or assets, not an exact value possibly including slippage or other fees. For applications that need an exact value that attempts to account for fees and slippage we have included a corresponding preview function to match each mutable function. These functions must not account for deposit or withdrawal limits, to ensure they are easily composable, the max functions are provided for that purpose.
However, this is not true for the vault's `previewWithdraw`. This function returns the amount the user will get plus the withdrawal fees that are transferred to the treasury.
Rationale for high risk: In a previous contest, a similar issue was marked high risk with the following reasoning:
EIP4626 is aimed to create a consistent and robust implementation patterns for Tokenized Vaults. A slight deviation from 4626 would broke composability and potentially lead to loss of fund (POC in https://github.com/code-423n4/2022-06-notional-coop-findings/issues/88 can be an example). It is counterproductive to implement EIP4626 but does not conform to it fully.
I agree with that reasoning. When other smart contracts try to integrate with these vaults (which is the motivation for having these standards, after all), this could lead to significant problems, including a loss of funds. For instance, a third-party contract could transfer the amount that is returned by `previewWithdraw` to the user that requested the withdrawal. But because the smart contract receives fewer assets, this will lead to one of two situations:
1.) The asset balance of the smart contract is enough to cover the withdrawal fee: In such a situation, the user would steal assets that belong to other users.
2.) The asset balance is not enough: No withdrawals are possible, the funds are stuck.
Subtract the fees in `previewWithdraw`.
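A hedged sketch of the recommendation (the signature and helper names are assumptions, not the project's exact API): make `previewWithdraw` report the net amount, i.e. what `withdraw` will actually send to the user.

```solidity
// Sketch (assumed names): subtract the withdrawal fee that goes to the treasury.
function previewWithdraw(uint256 id, uint256 shares) public view returns (uint256) {
    uint256 entitledAssets = convertToAssets(id, shares);          // gross amount
    uint256 fee = calculateWithdrawalFeeValue(entitledAssets, id); // assumed fee helper
    return entitledAssets - fee;                                   // what the user actually receives
}
```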
#0 - 3xHarry
2022-09-22T10:27:31Z
@MiguelBits i agree with this
#1 - HickupHH3
2022-10-17T09:51:42Z
dup of #47
🌟 Selected for report: csanuragjain
Also found by: Lambda, R2, bin2chen, datapunk, rbserver, unforgiven
300.0094 USDC - $300.01
Judge has assessed an item in Issue #211 as High risk. The relevant finding follows:
#0 - HickupHH3
2022-11-05T15:28:33Z
When block.timestamp = epochEnd, both triggerDepeg and triggerEndEpoch are potentially callable, because the functions require that the timestamp is <= or >=, respectively. When the depeg happens exactly at this timestamp, there is a huge incentive for capital providers to call triggerEndEpoch, whereas the insured persons want to call triggerDepeg. It is recommended to change one of the functions to enforce a strict inequality to avoid such situations where frontrunning can be very profitable.
dup of #69
320.0832 USDC - $320.08
https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/Controller.sol#L289 https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/VaultFactory.sol#L222
`VaultFactory` stores the oracle addresses in `tokenToOracle` with the address of the token as the key. While this works well as long as a token is always used in combination with the same other token (in which case one oracle per token is sufficient), it leads to problems as soon as multiple pairs for the same token are introduced.
The administrator deploys a vault for stETH/ETH with a corresponding oracle. Later on, users also want to insure against stETH/cbETH depegs. The admin also deploys this vault via `VaultFactory.createNewMarket`. However, no matter what he provides as `_oracle` in this call, it will not be used (as `tokenToOracle[address(stETH)]` is already set). The admin could call `changeOracle`, but this would break the existing stETH/ETH vault (as it would now use the new stETH/cbETH oracle).
The oracle should be looked up by market ID and not by token. That way, using the same token in multiple vaults is not a problem.
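A sketch of the suggested change (mapping and function names are illustrative): key the oracle by the market index instead of the token address.

```solidity
// Sketch: one oracle per market instead of one oracle per token.
mapping(uint256 => address) public marketToOracle; // marketIndex => oracle address

function _setOracleForMarket(uint256 marketIndex, address oracle) internal {
    // replaces the tokenToOracle[_token] = _oracle assignment in createNewMarket
    require(oracle != address(0), "zero oracle");
    marketToOracle[marketIndex] = oracle;
}
```

This way, stETH/ETH and stETH/cbETH markets can coexist with different oracles, and `changeOracle` only affects the intended market.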
#0 - HickupHH3
2022-10-31T13:49:51Z
Agree with warden; protocol availability / functionality is impacted if `token` is used as the key for the mapping, as it restricts the number of market pairs that can be added.
2 — Med: Assets not at direct risk, but the function of the protocol or its availability could be impacted
#1 - HickupHH3
2022-11-01T14:04:03Z
selecting #100 as primary instead for its slightly clearer explanation and example.
🌟 Selected for report: Lambda
Also found by: Deivitto, R2, Rolezn, csanuragjain
202.2286 USDC - $202.23
https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/SemiFungibleVault.sol#L94 https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/Controller.sol#L168 https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/Controller.sol#L225
Certain tokens (e.g., STA or PAXG) charge a fee for transfers and others (e.g., USDT or USDC) may start doing so in the future. This is not correctly handled in multiple places and would lead to a loss of funds:
1.) `SemiFungibleVault.deposit`: Here, fewer tokens are transferred to the vault than the amount of shares that is minted to the user. This is an accounting mistake that will ultimately lead to a situation where the last user(s) cannot withdraw anymore, because there are no assets left.
2.) `Controller.triggerDepeg` & `Controller.triggerEndEpoch`: `sendTokens` tries to send the whole asset balance to the other contract, which will fail when fewer tokens are available at this point (because the previous accounting was done without incorporating fees). This means that the end can never be triggered and all assets are lost.
If fee-on-transfer tokens should be supported, the actual balance differences need to be checked. If they are not supported, this should be clearly documented.
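If fee-on-transfer support is desired, the usual pattern is to credit the balance difference rather than the nominal amount. A sketch, with assumed names and types:

```solidity
// Sketch: mint shares for what actually arrived, not for the nominal `assets`.
function deposit(uint256 id, uint256 assets, address receiver) public {
    uint256 balanceBefore = asset.balanceOf(address(this));
    asset.safeTransferFrom(msg.sender, address(this), assets);
    uint256 received = asset.balanceOf(address(this)) - balanceBefore; // amount net of any transfer fee
    _mint(receiver, id, received, ""); // ERC1155-style share mint for the received amount only
}
```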
#0 - HickupHH3
2022-10-31T14:15:53Z
Valid because SemiFungibleVault may not be applied to WETH only (unlike Vault), but generic tokens. Furthermore, if there is an intention to make semi fungible vaults an EIP standard, then one may have to consider catering to FoT tokens as well, unless explicitly stated otherwise.
🌟 Selected for report: Lambda
1541.1415 USDC - $1,541.14
In `notifyRewardAmount`, the reward rate per second is calculated. This calculation rounds down, which can lead to situations where significantly fewer rewards are paid out to stakers, because the effect of the rounding is multiplied by the duration.
Let's say we have a `rewardsDuration` of 4 years, i.e. 126144000 seconds. We assume the `rewardRate` is currently 0 and `notifyRewardAmount` is called with the reward amount 252287999. Because the calculation rounds down, `rewardRate` will be 1.
After the 4 years, the users have received 126144000 reward tokens. However, 126143999 (i.e., almost 50%) of the reward tokens that were intended to be distributed to the stakers were not distributed, resulting in a monetary loss for all stakers.
You could accumulate the differences that occur due to rounding and let the users claim them in the end according to their shares.
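A sketch of one possible mitigation (Synthetix-style state variable names are assumed): track the remainder of the integer division so the dust is not silently lost.

```solidity
// Sketch: with reward = 252287999 and rewardsDuration = 126144000 seconds,
// rewardRate becomes 1 and `undistributed` grows by 126143999 instead of
// that amount being lost.
uint256 public undistributed;

function _updateRewardRate(uint256 reward) internal {
    rewardRate = reward / rewardsDuration;                  // rounds down
    undistributed += reward - rewardRate * rewardsDuration; // rounding remainder
}
```

The accumulated `undistributed` amount could later be added to the next period's reward or claimed pro rata by the stakers.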
#0 - HickupHH3
2022-11-01T13:51:36Z
While it's an edge case, the numbers used in the POC are reasonable if we consider small token decimals (eg. EURS with 2 decimals).
8.0071 USDC - $8.01
`PegOracle` uses the latest price of both feeds to calculate the peg. However, it is not validated that these prices have the same age. If one of the prices is much older, it can happen that the oracle returns a peg that never existed.
Furthermore, `latestRoundData` always returns `timeStamp1`, although `timeStamp2` might be older, leading to wrong freshness assumptions about the price.
Let's say the oracle is configured with stETH/USD and ETH/USD. If there is a strong movement in both ETH & stETH (as usually happens), but only the ETH/USD oracle is updated, the oracle will return a strong deviation from the peg, although the real peg might be, for example, 0.98.
1.) Return `min(timeStamp1, timeStamp2)`.
2.) Check that the prices that are used are actually from the same period, to avoid situations where an outdated oracle causes a depeg event when none happened in reality.
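A sketch combining both suggestions (assumed names; the staleness window is illustrative):

```solidity
// Sketch: use the older of the two feed timestamps and reject it if it is stale.
function _freshCommonTimestamp(uint256 timeStamp1, uint256 timeStamp2) internal view returns (uint256) {
    uint256 older = timeStamp1 < timeStamp2 ? timeStamp1 : timeStamp2;
    require(block.timestamp - older <= 1 hours, "stale price feed"); // window is illustrative
    return older; // latestRoundData() would return this instead of timeStamp1
}
```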
#0 - HickupHH3
2022-11-01T13:38:09Z
dup of #61, even though it tackles a different angle of relative freshness. A good check would be to compare against the current timestamp as suggested by the primary.
I do like the suggestion of returning the min of timestamp1 and timestamp2 though.
🌟 Selected for report: Respx
Also found by: 0x1f8b, 0xDecorativePineapple, 0xNazgul, 0xPanas, 0xSmartContract, 0xc0ffEE, 0xmuxyz, Aymen0909, Bahurum, Bnke0x0, CodingNameKiki, Deivitto, Jeiwan, Lambda, Picodes, PwnPatrol, R2, RaymondFam, Rolezn, Ruhum, Saintcode_, SooYa, Tointer, V_B, ajtra, ak1, async, auditor0517, brgltd, c3phas, carrotsmuggler, cccz, csanuragjain, datapunk, djxploit, durianSausage, eierina, erictee, gogo, imare, joestakey, jonatascm, kv, ladboy233, leosathya, lukris02, oyc_109, pashov, pauliax, rbserver, robee, rokinot, rvierdiiev, scaraven, simon135, unforgiven, wagmi, zzzitron
36.6223 USDC - $36.62
1.) When `block.timestamp = epochEnd`, both `triggerDepeg` and `triggerEndEpoch` are potentially callable, because the functions require that the timestamp is <= or >=, respectively. When the depeg happens exactly at this timestamp, there is a huge incentive for capital providers to call `triggerEndEpoch`, whereas the insured persons want to call `triggerDepeg`. It is recommended to change one of the functions to enforce a strict inequality to avoid such situations where frontrunning can be very profitable.
2.) `Vault.depositETH` only makes sense (and works) when the underlying asset is WETH. Consider checking that and including a helpful error when the asset is not WETH, but the user still calls the function.
3.) `Vault.getNextEpoch` assumes that the `epochs` array is chronologically sorted, which is not enforced. If this is not the case, the returned value will be wrong.
4.) `RewardsFactory.createStakingRewards` can be called for a pair where `hashedIndex_StakingRewards[hashedIndex]` already exists. Consider checking that there are no staking rewards yet for the pair. Changes to this mapping could confuse third-party applications which assume that the staking reward addresses for a given pair remain constant (which makes sense in my opinion).

🌟 Selected for report: pfapostol
Also found by: 0x040, 0x1f8b, 0x4non, 0xNazgul, 0xSmartContract, 0xc0ffEE, 0xkatana, Aymen0909, Bnke0x0, Deivitto, Diana, JAGADESH, KIntern_NA, Lambda, MiloTruck, R2, RaymondFam, Respx, ReyAdmirado, Rohan16, RoiEvenHaim, Rolezn, Ruhum, Saintcode_, Samatak, Sm4rty, SnowMan, Tomio, Tomo, WilliamAmbrozic, _Adam, __141345__, ajtra, ak1, async, c3phas, ch0bu, cryptostellar5, d3e4, delfin454000, dharma09, djxploit, durianSausage, eierina, erictee, fatherOfBlocks, gianganhnguyen, gogo, ignacio, imare, jag, jonatascm, leosathya, lukris02, malinariy, oyc_109, pashov, pauliax, peanuts, peiw, prasantgupta52, robee, rokinot, rotcivegaf, rvierdiiev, seyni, simon135, slowmoses, sryysryy, tnevler, zishansami
16.1756 USDC - $16.18
1.) `i++` can be replaced with `++i` in `Vault.getNextEpoch` and the loop increment can be marked as `unchecked` (https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/Vault.sol#L442).
2.) Because `totalAssets(id) == totalSupply(id)` in `Vault`, the calculations that are performed by the `convertTo` functions in `SemiFungibleVault` are a waste of gas for this vault type. Consider overriding them and simply returning the shares / assets amount, which would save gas.
3.) Using `SafeMath` in `StakingRewards` (https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/rewards/StakingRewards.sol#L29) is unnecessary and a waste of gas when using Solidity 0.8.x.

#0 - HickupHH3
2022-11-08T15:08:30Z
Using SafeMath in StakingRewards (https://github.com/code-423n4/2022-09-y2k-finance/blob/bca5080635370424a9fe21fe1aded98345d1f723/src/rewards/StakingRewards.sol#L29) is unnecessary and a waste of gas when using Solidity 0.8.x
There is certainly a deployment gas cost reduction of about 52k gas. However, the test suite did not test the staking functionality, so I'm unable to ascertain what the expected gas savings would be. I'd cap it to be around 5k gas.