Survey of Automated Market Making Algorithms
Source: https://medium.com/terra-money/survey-of-automated-market-making-algorithms-951f91ce727a
Automated market makers are routinely adopted in a number of contexts, from financial markets to betting markets, and have recently been proposed as an efficient way to set prices in the cryptocurrency space. Their role is fundamental, and finding the right protocol is key to the successful launch of any new market.
There are many challenges in finding the right algorithm: it should be compatible with the specific characteristics of the market, such as its liquidity, and its features should minimize the scope for arbitrage opportunities and erratic price behavior. In our journey towards a robust Terra/Luna on-chain market, we have reviewed several automated market making algorithms. In this post, we share some of our learnings and discuss the main features of the Logarithmic Market Scoring Rule and the Uniswap model, the two most popular market making algorithms. Below, we walk through the mechanics of each approach and discuss their potential merits and drawbacks. We also briefly discuss the applicability of these approaches to Terra’s on-chain swap mechanism.
Logarithmic Market Scoring Rule (“LMSR”)
The LMSR is an automated market maker used in prediction markets. To illustrate how it works, consider wagering on an event where exactly one of two outcomes must occur. For example, two outcomes in the sports betting market could be (1) the New England Patriots win the Super Bowl and (2) the New England Patriots lose the Super Bowl. Traders are allowed to buy shares which each pay them $1 if the relevant outcome occurs. This means that if a trader bought 3 shares of “Patriots lose,” and the Patriots do indeed lose the Super Bowl, he/she will win $3.
One of the market maker’s main responsibilities is to decide on the prices to charge potential traders. Logically, the price should be correlated with the probabilities of each outcome — e.g., if “Patriots win” is a very likely outcome, then there is a high chance that the market maker will have to make payments to the holders of “Patriots win” shares after the event. To offset this, the market maker will need to charge higher amounts for those shares. With that principle established, how does the market maker decide on a sensible probability assessment? The buying and selling it observes are key inputs, as lots of traders buying shares of “Patriots win” would be a sign that the market perceives the price as too low relative to the probability of the outcome. To formally incorporate the information it gets from buyers and sellers, the market maker can use a cost function C(PW, PL), where PW and PL are the numbers of “Patriots win” and “Patriots lose” shares outstanding. C is defined by the following formula, with b being a number greater than 0 chosen by the market maker (more on that later):

C(PW, PL) = b × ln(e^(PW/b) + e^(PL/b))
This somewhat complicated-looking function maps the shares outstanding to the dollar amounts that the market maker must receive in order to accept those bets. To be more precise, the price for each incremental bet can be calculated by looking at how C changes. One critical feature of this equation (and the reason it is used instead of a simpler formula) is that the price for incremental bets changes based on the number of shares outstanding in each category, thereby incorporating the signals sent by buying and selling in the market. Besides the observed buying and selling, b is the other input. It is called the liquidity parameter, and it controls how much marginal prices change in response to trades. Prices change quickly when b is small and slowly when b is large.
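In code, the cost function and the incremental price it implies can be sketched as follows (a minimal illustration; the function names are ours, not from any particular library):

```python
import math

def lmsr_cost(q_w: float, q_l: float, b: float) -> float:
    """LMSR cost function: C(q_w, q_l) = b * ln(e^(q_w/b) + e^(q_l/b))."""
    return b * math.log(math.exp(q_w / b) + math.exp(q_l / b))

def lmsr_trade_price(q_w: float, q_l: float, b: float,
                     d_w: float = 0.0, d_l: float = 0.0) -> float:
    """Amount the market maker charges to move the outstanding
    share counts by (d_w, d_l): the change in C."""
    return lmsr_cost(q_w + d_w, q_l + d_l, b) - lmsr_cost(q_w, q_l, b)
```

With b = 10 and an empty market, `lmsr_trade_price(0, 0, 10, d_w=1)` comes out to roughly $0.51, and the same call from a state with more PW shares outstanding returns a higher price, matching the behavior described in the text.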
Now that we’ve established some of the basics, let’s use concrete numbers to make the mechanics a bit more tangible. Assume that the market maker sets b = 10, and no shares of PW or PL have been sold so far. A trader asks for the price of PW. As mentioned above, the price for a single, incremental trade is captured by the change in C. As such, the market maker will calculate the difference in C from moving from the initial state where nothing has been sold, C(0, 0), to the state where one share of PW has been sold, C(1, 0). In numbers:

C(1, 0) − C(0, 0) = 10 × ln(e^(1/10) + e^(0/10)) − 10 × ln(e^0 + e^0) ≈ 7.44 − 6.93 = $0.51
The trader, in turn, must consider the probability of a Patriots victory in deciding whether to buy this share from the market maker. For example, if the trader thinks the Patriots have a 60% chance of victory, then his expected profit is composed of a 60% chance he wins $0.49 (the $1.00 he receives minus the $0.51 he pays) and a 40% chance he loses $0.51. The net expectation is 0.6 × (1 − 0.51) + 0.4 × (−0.51), a profit of $0.09. As this example illustrates, traders in this type of market are incentivized to buy shares as long as the price implies a probability lower than their own beliefs. This is an important feature: traders have an incentive to be truthful and trade according to their beliefs.
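The expected-value arithmetic above can be checked directly:

```python
p_win = 0.60   # trader's subjective probability that the Patriots win
price = 0.51   # price quoted by the market maker in the example
payout = 1.00  # each share pays $1 if the outcome occurs

# Win (payout - price) with probability p_win,
# lose the price paid with probability (1 - p_win).
expected_profit = p_win * (payout - price) + (1 - p_win) * (0.0 - price)
print(round(expected_profit, 2))  # 0.09
```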
Now let’s assume that some time passes, and at this point the market maker has sold 5 shares of PW and 3 shares of PL. A new trader comes looking to buy 1 share of PW. The automated market maker determines the price by calculating C(6, 3) − C(5, 3):

C(6, 3) − C(5, 3) = 10 × ln(e^(6/10) + e^(3/10)) − 10 × ln(e^(5/10) + e^(3/10)) ≈ 11.54 − 10.98 = $0.56
Note that in this second transaction, the price for PW has increased. This is a reflection of the fact that more traders have bought PW than PL. With LMSR, prices move to reflect the beliefs of the market.
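Both quoted prices can be reproduced in a few self-contained lines (b = 10 as in the example):

```python
import math

def cost(q_w: float, q_l: float, b: float = 10.0) -> float:
    """LMSR cost function with the liquidity parameter b fixed at 10."""
    return b * math.log(math.exp(q_w / b) + math.exp(q_l / b))

first_price = cost(1, 0) - cost(0, 0)   # first PW share in an empty market
second_price = cost(6, 3) - cost(5, 3)  # one more PW share after 5 PW / 3 PL sold
# first_price ≈ 0.51, second_price ≈ 0.56: the price rose with net PW buying
```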
A key drawback of this algorithm is the discretion around the choice of the parameter b. What is a good value for b? With a small value of b, prices change much more rapidly (i.e. purchasing a small number of shares increases the price a lot). Conversely, a large value of b makes prices feel more “sticky,” requiring a large number of shares to change the price significantly. Choosing a good value for b is largely dependent on the nature of the market. The algorithm can be modified to adapt to liquidity (i.e. how many investors are willing to participate in the market) by using a variable b(q) that increases with market volume, as has been proposed in the literature.
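As a rough illustration of such a volume-dependent liquidity parameter, one simple form sets b proportional to total volume. The linear shape and the constant `ALPHA` below are hypothetical choices for the sketch, not a full liquidity-sensitive LMSR implementation:

```python
ALPHA = 0.05  # hypothetical sensitivity constant, chosen by the operator

def adaptive_b(q_w: float, q_l: float) -> float:
    """Liquidity parameter that grows linearly with total shares outstanding.
    A floor of 1.0 keeps b positive before any shares have been sold."""
    return max(ALPHA * (q_w + q_l), 1.0)
```

This `adaptive_b(q_w, q_l)` would replace the fixed b inside the cost function, so that early trades move prices quickly while a deep, high-volume market stays sticky.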
Uniswap/Blockdrop
Another algorithm for setting prices has recently been proposed by Uniswap and has quickly attracted the attention of the crypto community. Typically, an exchange maintains an order book of limit orders: buy and sell orders at various quantities and prices. Traders who come to the exchange look through these orders to find quantities and prices that match their needs. For example, a trader looking to buy 25 shares in an example order book would purchase 10 shares at a price of 10.1 and 15 shares at a price of 10.2.
In contrast to the traditional model, the Uniswap/Blockdrop model replaces the order book with a pool in which all provided liquidity is held together, and trades are priced according to a “constant product” mechanism. In this system, the liquidity available on the two sides of a potential trade is multiplied together, and that product is held constant and used to determine the price of all transactions. Below, we provide an example to better explain how this works.
We begin with a liquidity provider depositing 10 BTC and 500 ETH into the exchange. The product, 10 * 500 = 5,000, becomes the Product we use for calculating the exchange rate of future trades.
Now we assume that a buyer wants to use 1 BTC to buy ETH. The pool of BTC increases from 10 to 11. Our Product is fixed at 5,000, so we divide 5,000 / 11 to calculate the new pool of ETH of 454.5.
In return for adding 1 BTC to the pool, the buyer receives an amount of ETH equal to the change in ETH, or 500 − 454.5 = 45.5 ETH. Given that exchange rates are driven by the Product, the next trade will receive a slightly different rate. A small trading fee can also be introduced to compensate liquidity providers.
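The swap above can be sketched in a few lines (no trading fee in this minimal version; the function name is ours):

```python
def swap_btc_for_eth(pool_btc: float, pool_eth: float, btc_in: float):
    """Sell btc_in BTC into the pool, holding pool_btc * pool_eth constant."""
    k = pool_btc * pool_eth      # the invariant Product
    new_btc = pool_btc + btc_in  # BTC side grows by the amount sold in
    new_eth = k / new_btc        # ETH side shrinks to keep k constant
    eth_out = pool_eth - new_eth # the buyer receives the difference
    return eth_out, new_btc, new_eth

eth_out, btc_pool, eth_pool = swap_btc_for_eth(10.0, 500.0, 1.0)
# With the 10 BTC / 500 ETH pool from the example, eth_out ≈ 45.45 ETH
```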
In our example, the buyer used 1 BTC in his purchase. If he had purchased a different amount, the exchange rate he received would also have been different, as the exchange rate reflects how much a trade shifts the ratio between the two pools (the “x” and “y” of the constant product). See the chart below for a graphical depiction of this relationship. As the size of the purchase increases, the exchange rate becomes less favorable. In this way, the end result ultimately bears some similarity to traditional markets, where market participants see the market moving against them as they trade more aggressively.
Intuitively, in traditional markets, prices and liquidity are more strongly affected by more extreme order imbalances, i.e. one-sided interest in an asset, for two reasons. First, even a random large order imbalance exacerbates the inventory problem faced by the market maker: after a sell-off by investors, for example, the market maker needs to be compensated for holding these assets over a long period, and thus for bearing the risk of any fluctuation in their value. The market maker would then be expected to respond by widening bid-ask spreads and revising price quotations. Second, order imbalances sometimes signal private information, which should reduce liquidity at least temporarily and could also move the market price permanently; e.g. the market maker and the other investors might believe the sell-off reflects news about the future cash flow of the assets. Note that in our initial example pertaining to traditional markets, the trader would have preferred to buy all his shares at the cheaper price of 10.1, but there was not enough selling interest at that level, and he had to resort to a less favorable price.
Overall, we have found Uniswap to be a more applicable model for the Terra/Luna on-chain market than LMSR. Although very efficient and simple, LMSR seems better suited to prediction markets, and it leaves too much discretion in the determination of the liquidity parameter b, which would be difficult to calibrate in a potential crypto application, where liquidity is likely to change dramatically over time. One of the key features of the Uniswap model, the way in which the exchange rate becomes less favorable as more liquidity is demanded in the market, can instead be emulated by introducing a spread function that depends on the amount of liquidity in the market. This would avoid introducing some of the additional features that would be more cumbersome to fine-tune for the case of Luna, e.g. rewards for those providing liquidity. We are actively exploring an application of the Uniswap model to the next iteration of Columbus, which you can read about here: https://agora.terra.money/t/oracle-revamp-proposal-for-columbus-3/84