Tarun Chitra, Gauntlet Team

September 13, 2023


Key Takeaways


Many DeFi users are familiar with the differences between Automated Market Makers (AMMs) and more traditional Central Limit Order Book (CLOB) exchanges. The pros and cons of both trading mechanisms have been studied and discussed extensively by various researchers. In general terms, the two archetypes of trading protocols can be characterized by the degree of control that users have over pricing.

  • AMMs – traders specify only the quantity, while the price is set algorithmically
  • CLOBs – traders specify both the price and the quantity

While pure on-chain CLOBs are rare, most protocols fall somewhere on this spectrum. For example, AMMs with advanced concentrated liquidity features can be seen as a type of hybrid design. The tradeoffs of CLOBs versus AMMs are also well-known. Most visibly, AMMs are suitable for passive liquidity provision (LP) strategies, which are not possible on CLOB exchanges. On the other hand, CLOBs provide more opportunities for sophisticated active LPs, who generally drive price discovery across crypto markets. This article aims to develop a similar framework for the DeFi lending space, where a number of new designs have recently emerged. We identify the pros and cons of various approaches and provide an introduction to our upcoming technical research.

The Lending Universe

Lending has some notable differences from trading. Besides price and quantity, lending involves matching users’ duration and collateral preferences, which requires more variables. Since lending occurs over time, arbitrage in lending markets is also more complex and costly than in AMMs, which significantly affects price discovery and efficiency.

Despite the differences, many of the same concepts still apply. Borrowers and lenders generally have full control over the size of their trades, and a varying degree of control over other lending parameters, such as interest rates, liquidation prices, and collateral types. We can consider this as a higher-dimensional version of the AMM example, with multiple types of prices as opposed to just one. For the remainder of this article, we will consider a hypothetical lending market with three loan variables, as shown in the diagram below.

The most prevalent DeFi lending model, used by protocols such as Aave and Compound, determines prices algorithmically. Users can choose how much to supply or borrow but have limited control over other parameters. For example, lenders cannot typically set conditions on interest rates, liquidation prices, or collateral types used by borrowers. To maintain stability and efficiency, these types of protocols require active governance to adjust a wide range of parameters. 

User-Defined Parameters

Newer lending protocols like Morpho and Ajna allow users more flexibility in defining price, while requiring more frequent interaction and monitoring. These designs may have some advantages if the benefits of added flexibility outweigh the added costs. Focusing on the interest rate and liquidation price, we can characterize existing protocols by how they approach each variable, as shown below.

Each approach has tradeoffs for functionality, efficiency, complexity, and risk. For example, users of highly algorithmic protocols may only need to monitor a few variables, such as their health factor, to maintain a long-term position with acceptable risk and efficiency. On the other hand, protocols with more manual pricing may achieve higher efficiency, but require more complex actions from users. Over time, protocols will likely emerge across all parts of the spectrum, experimenting with different levels of automation in each variable.
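The low-touch monitoring described above can be made concrete with a simple health factor calculation. This is a minimal sketch, not any specific protocol's formula; the position values and the 80% liquidation threshold are purely illustrative:

```python
def health_factor(collateral_value: float, liquidation_threshold: float,
                  debt_value: float) -> float:
    """Ratio of risk-adjusted collateral to debt; below 1.0 the
    position typically becomes eligible for liquidation."""
    if debt_value == 0:
        return float("inf")  # no debt, no liquidation risk
    return collateral_value * liquidation_threshold / debt_value

# Example: $10,000 of collateral at an illustrative 80% liquidation
# threshold, against $5,000 of debt.
hf = health_factor(10_000, 0.80, 5_000)
print(hf)  # 1.6 -> comfortably above the liquidation point of 1.0
```

A user of a highly algorithmic protocol only needs to keep this one number above 1.0, whereas designs with more user-defined parameters require tracking several such variables at once.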

Eventually, we may see designs that find innovative ways to combine approaches and test the center of the design space. For example, it may be possible to let users choose between different algorithmic and user-defined modes within the same lending market. Due to the wide range of user and asset types in DeFi, we see hybrid protocols as a potentially promising direction to optimize for mass applications. However, due to the relatively early stage of market development, these more ambitious hybrid designs have mostly yet to be explored.


Pooling

So far, we have considered the levers of interest rates and liquidation prices, but lending also depends heavily on collateral types. The degree of control that users can maintain over collateral types depends on the protocol’s approach to pooling. In a peer-to-peer framework with no pooling, lenders would be able to identify the specific collateral assets provided by their counterparty. However, most lending protocols include some level of peer-to-pool structure, where borrowers and lenders interact with a pool of many counterparties and possibly many assets. Existing protocols have a variety of approaches to multi-asset pooling, as shown in the diagram below.

The range of assets included in a pool has major effects on the market structure. If many collateral assets are included, the pool faces a greater risk of being overexposed to low-quality collateral. If many borrowable assets are included, the pool faces a greater risk of manipulation or excessive volatility originating from one of the borrowed assets. Some protocols may choose to deploy several pools, with more isolation for higher-risk assets and multi-asset pools for safer assets.

Collateral quality is not constant through time, however, and assets that were at one point high-quality collateral may become illiquid or highly volatile. In a multi-asset pool, periodic reviews are needed to ensure that listed assets do not add unacceptable risk over time. It is also impossible to assess collateral quality entirely on-chain, since it depends on volume and liquidity on centralized platforms, which are not directly observable in the on-chain data. Multi-asset pooling strategies typically require active management from governance to maintain protocol stability, which is a major focus of Gauntlet’s risk management work for lending protocols.  

In contrast, pools with only one collateral asset and one borrowable asset can function with much less governance supervision, but provide a more limited range of use cases. Generally, we will observe similar tradeoffs throughout this discussion of the lending space. Protocols that provide more functionality for users typically involve more risk, lower efficiency, or higher complexity.

Why Tradeoffs?

The goal of any marketplace is to match users with a counterparty that takes the other side of the trade at a mutually agreeable price. In our hypothetical lending market, we can consider the distribution of users by the interest rate, liquidation price, collateral type, and size at which they are willing to transact. In general, this is a very complex multidimensional distribution, but for simplicity, we can look at a version with only one price variable, as shown below.

Generally, users want the ability to express their preferences with high granularity across a wide range of assets and parameters, which we call functionality. In the diagram, this corresponds to the ability to choose a position within the distribution with high precision across many parameters. Also, users typically want their preferences to be matched as favorably as possible, which we call efficiency. This is shown in the diagram as a well-defined matching range, which only matches users within their preferred price limits. In a less efficient design, users may not always be able to match with the most favorable counterparty. 
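The matching range in this one-variable view can be sketched as a toy model, where each lender states a minimum acceptable rate and each borrower a maximum. The greedy pairing and midpoint clearing rule below are simplifying assumptions for illustration, not any protocol's actual mechanism:

```python
def match_orders(lender_min_rates, borrower_max_rates):
    """Pair the most flexible users first; a pair matches only when the
    borrower's rate limit is at or above the lender's."""
    lenders = sorted(lender_min_rates)                    # lowest minimum first
    borrowers = sorted(borrower_max_rates, reverse=True)  # highest maximum first
    cleared = []
    for lend_min, borrow_max in zip(lenders, borrowers):
        if borrow_max >= lend_min:
            # clear at the midpoint of the overlapping range
            cleared.append((lend_min + borrow_max) / 2)
    return cleared

rates = match_orders([0.02, 0.04, 0.06], [0.05, 0.045])
print(len(rates))  # 2 -> both borrowers found a lender within their limits
```

Users outside the overlap (here, the lender demanding 6%) simply go unmatched, which is the efficiency loss that broader matching designs trade against.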

Protocols with high functionality and efficiency usually also have higher complexity, which requires more active decision-making and thus higher costs related to governance or position management. Finally, improvements in functionality or efficiency may increase the risk faced by users, due to the greater vulnerability of complex matching systems to manipulation, volatility, or low-quality collateral. The balance of functionality, efficiency, complexity, and risk is central to the overall structure of the lending space.

In the earlier discussion of automation and pooling, we covered some of the ways that protocols can provide more or less functionality. In the rest of this article, we will look at how the different approaches affect efficiency and complexity, and how all these factors relate to risk.


Efficiency

The level of automation used in a protocol has a significant impact on efficiency, which we can break down into two components:

  • Matching – whether the prices realized by users reflect the best possible matches
  • Price Discovery – whether the prices available for matching reflect fair value in the broader market

There is also a third component of efficiency that is directly related to risk, but we will not focus on it here. If a protocol takes more risk at the systemic level, users can be matched more efficiently due to more of their risk being mutualized by the protocol. Since we have covered that risk-efficiency tradeoff at length elsewhere, this article will instead focus on the components of efficiency that are related to matching and price discovery. However, for all of the designs discussed, it is important to keep in mind that efficiency can be improved with a higher tolerance for mutualized insolvency risk.


Matching

Efficient matching should be sensitive to supply and demand and adjust prices accordingly. The rate of change of prices relative to supply and demand is known as price elasticity. In an elastic market, for example, a small withdrawal of lenders would lead to a smooth, proportionate rise in interest rates.

Elasticity can also be seen as a measure of the density of supply and demand around a given price. When plenty of borrowers and lenders are willing to transact at nearby price levels, markets are elastic, and prices move continuously and smoothly. When marginal supply and demand are sparse, prices react abruptly and disproportionately, which may be called an inelastic market.

In lending protocols that use highly automated matching mechanisms, the algorithmic elasticity should closely reflect actual user preferences to achieve efficient matching. For example, many protocols define interest rates as a function of utilization, which is the fraction of an asset’s total supply in the lending pool that has been borrowed. The most common algorithmic interest rate design, known as the jump rate model, is a relatively simple function of utilization, as shown below.

Converting this to a function of supply and demand, we get the gradient chart on the right showing the dependence on each variable, which is effectively the elasticity of the interest rate algorithm. In general, it is sensible for elasticity to look like this. When borrowing is low relative to supply, the interest rate changes with a relatively low, constant slope as supply and demand vary. When close to 100% of the supply is borrowed, the risk of a liquidity crunch in the pool is higher, so it is reasonable for interest rates to move more aggressively.
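The jump rate model itself is easy to state in code. The kink point and slopes below are illustrative assumptions, not the parameters of any live market:

```python
def utilization(borrowed: float, supplied: float) -> float:
    """Fraction of the pool's supplied assets that has been borrowed."""
    return borrowed / supplied if supplied > 0 else 0.0

def jump_rate(util: float, base: float = 0.0, slope1: float = 0.04,
              slope2: float = 0.75, kink: float = 0.8) -> float:
    """Borrow rate as a piecewise-linear function of utilization:
    a gentle slope below the kink, a much steeper one above it."""
    if util <= kink:
        return base + slope1 * util
    return base + slope1 * kink + slope2 * (util - kink)

print(jump_rate(utilization(50, 100)))  # 0.02 -> gentle region
print(jump_rate(utilization(95, 100)))  # ~0.14 -> steep region above the kink
```

The elasticity discussed above is just the slope of this function: slope1 below the kink and slope2 above it.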

Though the jump rate model is a good first-order approximation, it is not realistic to expect actual users to have a distribution of supply and demand preferences that match the model exactly. In some cases, the model may overestimate or underestimate users’ reactions to changes in the market balance, which results in suboptimal efficiency. One of the potential advantages of user-defined interest rates is that users may be able to avoid some of this inefficiency with frequent rebalancing. The exact tradeoff depends on user objective functions and the cost of position management, which we will describe quantitatively in an upcoming technical paper.

More complex algorithmic designs such as PID-controlled interest rates may also have a role in lending markets of the future. These dynamic algorithms adjust over time to reflect observed supply and demand, aiming to more accurately estimate user elasticity and achieve more efficient matching. In contrast to user-defined models, which have not seen significant adoption yet for interest rates, dynamic algorithms do not require users to continuously adjust pricing to achieve high efficiency. However, PID controllers can be vulnerable to manipulation (see examples here) and may still require close monitoring from users to avoid excess risk. Generally, a robust arbitrage market would make manipulation more costly and unlikely to succeed, which may improve the risk-efficiency tradeoffs of more complex algorithms. This highlights the importance of interest rate discovery to the overall development of lending markets, which we turn to next.
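To make the PID idea concrete, here is a minimal sketch of a controller that nudges the borrow rate toward a utilization set point. The target and gains are illustrative assumptions; a production design would also need the manipulation safeguards noted above:

```python
class PIDRateController:
    """Adjusts a borrow rate toward a target utilization over time."""

    def __init__(self, target_util: float = 0.8, kp: float = 0.1,
                 ki: float = 0.02, kd: float = 0.05,
                 initial_rate: float = 0.03):
        self.target = target_util
        self.kp, self.ki, self.kd = kp, ki, kd
        self.rate = initial_rate
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, util: float) -> float:
        """One control step: utilization above target raises the rate
        (attracting supply, deterring borrowing); below target lowers it."""
        error = util - self.target
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        self.rate += (self.kp * error + self.ki * self.integral
                      + self.kd * derivative)
        self.rate = max(self.rate, 0.0)  # keep the rate non-negative
        return self.rate

ctrl = PIDRateController()
for util in [0.95, 0.93, 0.90, 0.85]:  # persistently above the 80% target
    rate = ctrl.update(util)
print(rate > 0.03)  # True -> sustained excess demand ratcheted the rate up
```

Note that an attacker could target the integral term: a brief distortion of utilization leaves a lingering contribution to the accumulated error, which is one way such controllers can be manipulated.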

Interest Rate Discovery

Currently, the largest active lending protocols use algorithmic interest rates, which set the benchmark for the rest of the market. This is a significant contrast to trading venues, where algorithmic AMM pricing usually lags behind the user-defined pricing of CLOBs. However, the dominance of the largest protocols is likely most significant in the major assets, and not as strong in smaller markets where idiosyncratic asset behavior needs to be monitored more closely. 

Price discovery is an important component of efficiency due to the additional costs faced by users of peripheral markets. These costs are the arbitrage profits of traders who align prices with the benchmark, and are experienced as added volatility and lower risk-adjusted return by the rest of the users. To become the drivers of interest rate discovery, newer protocols would need to attract arbitrage traders that rely on them as a benchmark. Generally, arbitrage activity tends to center around a benchmark pool with deep liquidity and predictable rates, as shown in the diagram below. 
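The cost of aligning a peripheral market with the benchmark can be illustrated with a back-of-the-envelope carry calculation. The figures are hypothetical, and the sketch ignores collateral requirements, rate movements, and liquidation risk:

```python
def annualized_arb_profit(size: float, borrow_rate: float,
                          lend_rate: float, annual_costs: float) -> float:
    """Profit from borrowing at one venue and lending at another for a
    year, keeping the spread net of transaction and monitoring costs."""
    return size * (lend_rate - borrow_rate) - annual_costs

# Borrow $100k at 4% from a deep benchmark pool, lend at 6% on a
# smaller peripheral pool, with $500/year of costs.
profit = annualized_arb_profit(100_000, borrow_rate=0.04,
                               lend_rate=0.06, annual_costs=500)
print(round(profit))  # 1500
```

This profit is effectively paid by the peripheral pool's other users; as competition compresses the spread toward the cost of carry, the peripheral rate converges to the benchmark.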

When it comes to predictability, a basic algorithmic rate may have an advantage, as it is easier to calculate linear functions than to predict a complex market balance of many user-defined variables. This may be one reason for the jump rate model’s continued leadership of interest rate discovery in major markets. In smaller markets, the advantage may be less relevant as interest rates are more volatile and unpredictable and the jump rate model is generally less accurate.

Given the early state of development in user-defined interest rates and lending arbitrage, it would likely take significant advancements in market infrastructure for these factors to have their full effect. However, if future developments reduce the costs and risks involved, liquidity should spread efficiently across lending protocols of all types via a robust arbitrage market. Protocols that are able to position themselves as centers of arbitrage activity will likely benefit from greater efficiency and come to occupy leading roles in their ecosystems.

Liquidation Price Discovery

More user control over pricing may also allow protocols to develop self-contained liquidation mechanisms. In oracle-free protocols, internal pricing entirely replaces external oracle feeds in determining when loans are subject to liquidation. While these designs avoid the risks of external dependencies, it is important to note that they are not immune to manipulation and still subject to the tradeoffs discussed earlier. 

Users who lend or borrow through an oracle-free protocol need to closely monitor the internal pricing dynamics and frequently adjust their positions in response to events. Depending on the specific design, internal pricing mechanisms may be vulnerable to price or interest rate manipulation, which reduces efficiency for non-adversarial users. These mechanisms can be very complex and require careful risk assessment in the pre-launch phase to identify potential vulnerabilities and develop mitigation strategies.

Since asset price discovery currently occurs elsewhere, it is also likely that liquidation prices on oracle-free protocols will not reflect the fair market value on major exchanges. Until a robust arbitrage market can be established, users may achieve lower efficiency compared to an oracle-connected protocol, due to the less predictable pricing of internal mechanisms. However, because liquidation occurs instantaneously as opposed to gradually, a robust oracle-free liquidation market may be easier to establish than the interest rate arbitrage market described in the previous section.

In general, oracle-free mechanics can provide efficiencies for high-touch users who are able to monitor and react skillfully to market dynamics. Since these protocols combine the functions of lending and asset price discovery, which occur on different time horizons, they require a high level of user sophistication to navigate the complexity and achieve the potential efficiencies. 


Complexity

Some protocol designs aim to achieve efficiency and functionality through complex matching systems with a high degree of user control. Like all lending protocols, these designs involve some overhead costs to maintain an acceptable level of risk and efficiency. Complexity generally leads to higher overhead costs in some form, either directly to users or indirectly via more costly governance, as more accuracy is needed to maintain the desired efficiency at a tolerable level of risk. To complete a holistic review of the tradeoffs, we must now describe how these overhead costs fit into the equation.

Position Management

The most direct overhead cost to users in a lending protocol is position management. Every time a user-defined parameter is changed, there is an opportunity cost to the process and possible transaction costs related to executing the change. Ideally, the efficiency gains are enough to compensate, but if position updates are very frequent or costly, they may erode much of the theoretical benefit. We can quantify this by considering the frequency of interactions with the protocol needed to maintain the desired efficiency and the cost per interaction. 

In the example below, the user adjusts their position many times in response to changes in fair value. While the user obtains better pricing than the algorithmic baseline, it is not certain that this will translate into higher realized efficiency after including all the related costs.
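The frequency-and-cost framing lends itself to a quick break-even check. All figures below are hypothetical:

```python
def net_annual_benefit(position_size: float, rate_improvement: float,
                       adjustments_per_year: int,
                       cost_per_adjustment: float) -> float:
    """Efficiency gain from active management, net of interaction costs."""
    gross = position_size * rate_improvement
    overhead = adjustments_per_year * cost_per_adjustment
    return gross - overhead

# A 0.5% better realized rate on a $50k position, rebalanced weekly
# at $5 of gas and slippage per adjustment:
print(net_annual_benefit(50_000, 0.005, 52, 5.0))  # -10.0
```

Here the overhead slightly exceeds the gross gain, illustrating how frequent updates can erode the theoretical benefit; rebalancing monthly instead (12 × $5) would leave the same user $190 ahead.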

One proposed solution is to outsource position management to specialized third parties, who could design a range of strategies and operate them as a service. Users could then select a strategy that best fits their objectives and deploy it without much further monitoring. Still, this is not cost-free, as users would need to pay service providers competitively and periodically assess their performance. Effectively, users would be replacing the immediate direct costs of managing their own positions with the ongoing costs of retaining a service provider.

However, if position management services can scale efficiently, the cost per adjustment will eventually come down. In this scenario, users would obtain high-frequency position updates and high efficiency without prohibitive cost. This would make it possible for protocols with many user-defined parameters to reach users who currently prefer the convenience of more algorithmic designs. While cheap and effective strategies have yet to be proven at scale, we expect service providers will deliver many innovative products if the lending space shifts significantly in this direction.

Finally, we can consider another approach to position management, which is to use more automation and control more parameters through governance. This has been the approach taken by many existing protocols and has proven fairly successful in managing the popular lending designs so far. However, it is also not cost-free and carries its own tradeoffs.


Governance

For protocols that choose to manage parameters at the governance level, the objective is to simplify the user experience while remaining as efficient as possible. Good decision-making from governance can be an attractive feature for users, who can reduce their individual monitoring and position management costs.

The tradeoff is that governance becomes very complex and requires specialized analysis to make informed decisions, increasing the cost of maintaining efficient parameters. While these costs may not be immediately apparent to users, they are eventually realized either through a decline in efficiency or increased protocol fees to fund the needed resources.

With greater complexity, governance also becomes slower to respond than users making adjustments in real time, due to the time needed to evaluate proposals, coordinate a vote, and implement a change once approved. In some cases, this may result in governance failing to agree on a sustainable path forward or delaying action due to debate around controversial proposals (see example here). Gauntlet’s risk management work so far has mainly focused on addressing these issues by providing tooling and analysis to inform governance decisions.

As designs become more complex, protocols may seek to reduce governance overhead by whitelisting service providers to perform certain routine changes without a full governance process. Gauntlet has recently participated in a Steward role with Aave and Moonwell, where we have adjusted supply and borrow caps through this type of mechanism. Whitelisted governance roles are similar in risks and benefits to the third-party position management discussed in the previous section. Through different paths, these two solutions arrive at a similar structure, and may eventually converge through further iterations. This brings us to the conclusion of this review, where we propose a view of lending development moving forward.


Conclusion

Lending designs exist on a spectrum of how much control they allow users over various parameters, such as interest rates, liquidation prices, and collateral types. While it may seem that design philosophies are diverging, the greatest opportunities are likely toward the center of the design space and not the extremes. Due to the great variety of assets and user types in DeFi, the optimal level of user control or automation in any given lending parameter is likely not all or nothing, but somewhere in between.

We can formalize this conclusion by considering the overall utility a lending protocol provides to users. Similar to how existing protocols optimize parameter settings to maximize utility (see here), this can be seen as a meta-optimization of how parameters should work in the first place. Including all the dimensions discussed earlier, we arrive at the equation below, where f, g, h, and i are the user utility functions with respect to each variable:
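Since the equation figure is not reproduced here, the following is a plausible additive reconstruction, under the assumption that the four components enter separably:

```latex
U = f(\text{functionality}) + g(\text{efficiency})
  + h(\text{complexity}) + i(\text{risk})
```

where f and g are presumably increasing in their arguments, while h and i are decreasing, so that added complexity and risk reduce the overall utility U.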

Regardless of how a protocol chooses to manage parameters on an ongoing basis, the higher-level problem requires careful consideration to find a sustainable balance. Since improving one of these variables almost always involves a compromise, designs that over-optimize in one direction limit their appeal to users who heavily skew their utility function in the same way, which is likely a small group. On the other hand, protocols that find a middle ground that better balances these factors for a wide range of users can probably reach a much larger market. 

Eventually, we expect that lending designs will move toward a moderate approach. Most protocols will likely converge around some mix of pooling, algorithms, user control, governance, and specialized management that best solves the tradeoffs for unopinionated users. In the meantime, the paths taken may differ significantly, and the next few years may see a wide range of designs in active competition. As a service provider, Gauntlet will continue providing high-quality insights to as many customers as possible, wherever they are in the lending design space. As the market changes, we look forward to adapting and extending our products to inform the new decisions that protocols and users will face along the way.

