My $0.01

I participated in "The Standard" audit contest on CodeHawks and identified a High-severity vulnerability.
The issue stems from a loop over an unbounded data structure, which can cause different users to incur very different gas costs: so much so that a late user might face gas costs high enough to discourage them from taking part in the protocol AT ALL.

Of course, no protocol wants to drive users away simply through poor coding practices, so the vulnerability was validated, and now I can share the details, my proposed fix, and how it saves up to 63.3% of the gas consumed by the original code.

Here it is: the report I submitted for the contest.
Enjoy the read, and merry hacking ;)

The consolidation of existing stakes, performed before increasing a position, can disincentivize staking activity

Summary

In order to participate as a staker in the protocol, the LiquidationPool smart contract first updates all stakes that are more than 24 hours old. If the number of such stakes is high enough, this can disincentivize participation in the protocol, as potential stakers would have to pay elevated gas fees.
A staker could instead wait for someone else to call increasePosition, decreasePosition or distributeAssets, so that they pay the fees for updating the older stakes: if everyone follows this logic, however, no new stakers will join the pool, and existing ones will not update their positions, in order to avoid the costly gas fee.

Vulnerability Details

The function LiquidationPool::increasePosition consolidates all pending stakes before allowing a new staker to participate in staking. The following snippet shows how the consolidation is performed right after the initial check, by calling LiquidationPool::consolidatePendingStakes.

function increasePosition(uint256 _tstVal, uint256 _eurosVal) external {
    require(_tstVal > 0 || _eurosVal > 0);
    consolidatePendingStakes(); 
    //the rest of the function...
}

In order to consolidate the pending stakes, the Liquidation Pool uses the following function:

function consolidatePendingStakes() private {
        uint256 deadline = block.timestamp - 1 days;
        for (int256 i = 0; uint256(i) < pendingStakes.length; i++) { 
            PendingStake memory _stake = pendingStakes[uint256(i)];
            if (_stake.createdAt < deadline) {
                positions[_stake.holder].holder = _stake.holder;
                positions[_stake.holder].TST += _stake.TST;
                positions[_stake.holder].EUROs += _stake.EUROs;
                deletePendingStake(uint256(i));
                // pause iterating on loop because there has been a deletion. "next" item has same index
                i--; 
            }
        }
    }

Due to the loop over all elements of the pendingStakes array, the more pending stakes there are to resolve, the more gas the overall process consumes: this effectively means that, if many stakers staked more than 24 hours before the latest staker joins, joining the protocol becomes extremely expensive.
It would make much more sense to wait for someone else to trigger consolidatePendingStakes (either by increasing or decreasing their existing position, or by calling distributeAssets) before joining, so that the pendingStakes array is already empty.
However, if every staker follows this same reasoning, staking and unstaking activity is effectively disincentivized, putting the protocol at risk of not having enough liquidity.
The main issue is that the computational cost of consolidatePendingStakes is quadratic in the length of pendingStakes, because each deletion shifts every later element left by one slot. For example:

  • with 100 existing pending stakes
  • and a gas price of 21
  • adding another staker to the pool costs 16,057,929 gas (as per tests run using Hardhat Gas Reporter)
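The quadratic growth can be sketched with a small model that counts the array writes performed by the shift-based deletion. This is a simplified sketch: real gas cost also includes storage reads, refunds and the rest of increasePosition, but the write count is what drives the quadratic growth.

```javascript
// Model of the original consolidatePendingStakes in the worst case, where
// every pending stake is old enough to consolidate: each deletion shifts
// all later elements left by one, so clearing n stakes costs ~n^2/2 writes.
function countShiftWrites(n) {
  let writes = 0;
  const stakes = Array.from({ length: n }, (_, k) => k);
  let i = 0;
  while (i < stakes.length) {
    // deletePendingStake: shift every later element left by one slot
    for (let j = i; j < stakes.length - 1; j++) {
      stakes[j] = stakes[j + 1]; // one write per shifted element
      writes++;
    }
    stakes.pop();
    // the "next" item now has the same index, so i is not advanced
  }
  return writes;
}

console.log(countShiftWrites(10));  // 45   (= 10*9/2)
console.log(countShiftWrites(100)); // 4950 (= 100*99/2)
```

Going from 10 to 100 pending stakes multiplies the work by roughly 100, not 10, which is why the last staker to join pays so much more than the first.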

Impact

The protocol could see a complete halt in staker participation once enough stakes pass the 24-hour creation mark, as all participants wait for someone else to trigger consolidatePendingStakes and pay the costly gas fees.

Tools Used

Manual review, VSCode, Hardhat Gas Reporter

Recommendations

The best approach would be to update the older stakes in other circumstances, so that their update does not disincentivize new potential stakers. It would also allow many more stakers overall, as any loop over the full pendingStakes array risks gas exhaustion given a sufficient number of old (>24 hours) stakes.
Alternatively, since the first recommendation would mean a massive change to the source code, the consolidatePendingStakes and deletePendingStake functions can be rewritten so that their cost grows linearly with the length of pendingStakes.
Here is an example of how these functions could be rewritten to grow linearly and allow many more stakers to join at a lower gas cost:

function consolidatePendingStakes() private {
        uint256 deadline = block.timestamp - 1 days;
-        for (int256 i = 0; uint256(i) < pendingStakes.length; i++) { 
-            PendingStake memory _stake = pendingStakes[uint256(i)];
-            if (_stake.createdAt < deadline) {
-                positions[_stake.holder].holder = _stake.holder;
-                positions[_stake.holder].TST += _stake.TST;
-                positions[_stake.holder].EUROs += _stake.EUROs;
-                deletePendingStake(uint256(i));
-                // pause iterating on loop because there has been a deletion. "next" item has same index
-                i--; 
-            }
-        }
+         uint256 deletions = 0;
+         for(uint256 i = pendingStakes.length; i > 0; --i){
+             uint256 index = i - 1;
+             PendingStake memory _stake = pendingStakes[index];
+             if (_stake.createdAt < deadline) {
+                 positions[_stake.holder].holder = _stake.holder;
+                 positions[_stake.holder].TST += _stake.TST;
+                 positions[_stake.holder].EUROs += _stake.EUROs;
+                 deletePendingStake(index, deletions);
+                 ++deletions;
+             }
+         }

+         for(uint256 i = 0; i < deletions; ++i) {
+             pendingStakes.pop();
+         }
    }
-     function deletePendingStake(uint256 _i) private {
+     function deletePendingStake(uint256 _i, uint256 deletions) private {
-         for (uint256 i = _i; i < pendingStakes.length - 1; i++) {
-             pendingStakes[i] = pendingStakes[i+1]; 
-         }
-         pendingStakes.pop();
+        uint256 len = pendingStakes.length;
+        // skip the tail slots already holding stakes marked for deletion
+        uint256 index = len - 1 - deletions;
+        // swap the expired stake to the tail; it will be popped in one batch
+        PendingStake memory _temp = pendingStakes[_i];
+        pendingStakes[_i] = pendingStakes[index];
+        pendingStakes[index] = _temp;
     }
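As a sanity check of the swap-and-batch-pop scheme, here is a small JavaScript model of the proposed loop. The field names and plain objects are illustrative stand-ins for the storage structs, and the sketch assumes the order of the surviving pendingStakes is not relied upon elsewhere (the swap does not preserve insertion order).

```javascript
// Model of the proposed linear consolidation: expired stakes are swapped to
// the tail and popped in one batch, so each deletion costs O(1) instead of O(n).
function consolidate(pendingStakes, deadline) {
  const positions = {}; // holder -> consolidated amount
  let deletions = 0;
  for (let i = pendingStakes.length; i > 0; --i) {
    const index = i - 1;
    const stake = pendingStakes[index];
    if (stake.createdAt < deadline) {
      positions[stake.holder] = (positions[stake.holder] || 0) + stake.amount;
      // swap the expired stake into the last not-yet-reserved slot;
      // since we iterate backwards, the element swapped in was already examined
      const last = pendingStakes.length - 1 - deletions;
      [pendingStakes[index], pendingStakes[last]] =
        [pendingStakes[last], pendingStakes[index]];
      ++deletions;
    }
  }
  // pop all expired stakes in one batch
  for (let i = 0; i < deletions; ++i) pendingStakes.pop();
  return positions;
}

const stakes = [
  { holder: 'a', createdAt: 1, amount: 10 },
  { holder: 'b', createdAt: 5, amount: 20 },
  { holder: 'a', createdAt: 2, amount: 30 },
  { holder: 'c', createdAt: 9, amount: 40 },
];
const positions = consolidate(stakes, 4); // stakes created before t=4 expire
console.log(positions);                   // { a: 40 }
console.log(stakes.map(s => s.holder));   // [ 'c', 'b' ]
```

Both of holder a's expired stakes are merged into positions, and only the two young stakes survive, exactly as the shift-based version would leave them (albeit in a different order).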

This solution was tested with:

  • 100 existing old stakes
  • a new staker wishing to stake
  • Hardhat Gas Reporter analyzing the gas consumption of increasePosition

and yielded a 63.3% reduction in gas consumption in the worst case.

Additionally, the proposed solution addresses a couple of gas optimization issues present in the original source code, namely:

  • the deletion mechanism in deletePendingStake had O(n) cost, while it now has O(1) cost;
  • int256 i was cast to uint256 three times per loop iteration; this is no longer required;
  • int256 i was decremented whenever a deletion happened; this is no longer necessary;
  • the loop in consolidatePendingStakes uses a pre-decrement instead of a post-increment, resulting in additional gas savings.

The proposed solution was tested against the existing test suite, successfully clearing all tests.
The basic Hardhat test I used to measure the new gas consumption with the linear growth was the following:

describe('increase position numerous times to measure the difference between the current solution and the proposed optimization', async () => {
    const tstStakeValue = ethers.utils.parseEther('90000');

    it('Increase position iteratively with new users', async () => {
      for (let i = 0; i < 100; i++) {
        // create a fresh wallet and connect it to the network
        const wallet = ethers.Wallet.createRandom();
        const user = wallet.connect(ethers.provider);

        // fund the new wallet with 1 ETH to cover gas
        const tx = await user1.sendTransaction({
          to: user.address,
          value: ethers.utils.parseEther('1')
        });
        await tx.wait();

        await TST.mint(user.address, tstStakeValue);
        await TST.connect(user).approve(LiquidationPool.address, tstStakeValue);
        const increase = await LiquidationPool.connect(user).increasePosition(tstStakeValue, 0);
        await expect(increase).not.to.be.reverted;
      }

      const oneDayInSeconds = 86400;

      // advance time by one day, then mine a block so the change takes effect
      await ethers.provider.send('evm_increaseTime', [oneDayInSeconds]);
      await ethers.provider.send('evm_mine');

      // one more staker joins after the 100 pending stakes have expired:
      // this call triggers the consolidation of all of them
      const wallet = ethers.Wallet.createRandom();
      const user = wallet.connect(ethers.provider);

      const tx = await user1.sendTransaction({
        to: user.address,
        value: ethers.utils.parseEther('1')
      });
      await tx.wait();

      await TST.mint(user.address, tstStakeValue);
      await TST.connect(user).approve(LiquidationPool.address, tstStakeValue);
      const increase = await LiquidationPool.connect(user).increasePosition(tstStakeValue, 0);
      await expect(increase).not.to.be.reverted;
    });
  });
