

Title: An Axiomatic Study of Scoring Rule Markets
Prediction markets are well-studied in the case where predictions are probabilities or expectations of future random variables. In 2008, Lambert et al. proposed a generalization, which we call "scoring rule markets" (SRMs), in which traders predict the value of arbitrary statistics of the random variables, provided these statistics can be elicited by a scoring rule. Surprisingly, despite active recent work on prediction markets, there has not yet been any investigation into more general SRMs. To initiate such a study, we ask the following question: in what sense are SRMs "markets"? We classify SRMs according to several axioms that capture potentially desirable qualities of a market, such as the ability to freely exchange goods (contracts) for money. Not all SRMs satisfy our axioms: once a contract is purchased in any market for predicting the median of some variable, there will not necessarily be any way to sell that contract back, even in a very weak sense. Our main result is a characterization showing that slight generalizations of cost-function-based markets are the only markets to satisfy all of our axioms for finite-outcome random variables. Nonetheless, we find that several SRMs satisfy weaker versions of our axioms, including a novel share-based market mechanism for ratios of expected values.
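The cost-function-based markets that the characterization above singles out can be sketched concretely. A minimal example is the logarithmic market scoring rule (LMSR): a convex cost function prices every trade, and because trade costs depend only on the start and end share vectors (path independence), a purchased contract can always be sold back at the going prices, which is exactly the exchangeability property the axioms formalize. The function names and the liquidity parameter `b` below are illustrative, not from the paper:

```python
import math

def lmsr_cost(q, b=100.0):
    """LMSR cost function C(q) = b * log(sum_i exp(q_i / b))."""
    m = max(x / b for x in q)  # stabilize the log-sum-exp
    return b * (m + math.log(sum(math.exp(x / b - m) for x in q)))

def lmsr_prices(q, b=100.0):
    """Instantaneous prices: the gradient of C, a probability vector."""
    m = max(x / b for x in q)
    w = [math.exp(x / b - m) for x in q]
    s = sum(w)
    return [x / s for x in w]

def trade_cost(q, delta, b=100.0):
    """Cost of moving the outstanding share vector from q to q + delta."""
    q_new = [a + d for a, d in zip(q, delta)]
    return lmsr_cost(q_new, b) - lmsr_cost(q, b)
```

Buying 10 shares of outcome 0 and then selling them back nets to zero by path independence; the median markets discussed in the abstract lack precisely this buy-back guarantee.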
Award ID(s):
1657598
NSF-PAR ID:
10057897
Author(s) / Creator(s):
Date Published:
Journal Name:
Innovations in Theoretical Computer Science
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    We consider the design of private prediction markets: financial markets designed to elicit predictions about uncertain events without revealing too much information about market participants' actions or beliefs. Our goal is to design market mechanisms in which participants' trades or wagers influence the market's behavior in a way that leads to accurate predictions, yet no single participant has too much influence over what others are able to observe. We study the possibilities and limitations of such mechanisms using tools from differential privacy. We begin by designing a private one-shot wagering mechanism in which bettors specify a belief about the likelihood of a future event and a corresponding monetary wager. Wagers are redistributed among bettors in a way that more highly rewards those with accurate predictions. We provide a class of wagering mechanisms that are guaranteed to satisfy truthfulness, budget balance in expectation, and other desirable properties while additionally guaranteeing ε-joint differential privacy in the bettors' reported beliefs, and analyze the trade-off between the achievable level of privacy and the sensitivity of a bettor's payment to her own report. We then ask whether it is possible to obtain privacy in dynamic prediction markets, focusing our attention on the popular cost-function framework in which securities with payments linked to future events are bought and sold by an automated market maker. We show that under general conditions, it is impossible for such a market maker to simultaneously achieve bounded worst-case loss and ε-differential privacy without allowing the privacy guarantee to degrade extremely quickly as the number of trades grows (at least logarithmically in the number of trades), making such markets impractical in settings in which privacy is valued. We conclude by suggesting several avenues for potentially circumventing this lower bound.
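    The one-shot mechanisms described above build on weighted-score wagering, where each bettor's payout compares her score with the wager-weighted pool average. The sketch below shows this non-private core with a normalized quadratic (Brier) score; the privacy layer of the paper, which perturbs payments, is omitted, and the function names are illustrative:

```python
def quadratic_score(p, outcome):
    """Normalized Brier score in [0, 1] for a binary event."""
    y = 1.0 if outcome else 0.0
    return 1.0 - (p - y) ** 2

def wswm_payouts(beliefs, wagers, outcome):
    """Weighted-score wagering: each bettor's wager is scaled by how her
    score compares with the wager-weighted average score of the pool.
    Payouts sum to the total wagered, so the mechanism is budget balanced."""
    scores = [quadratic_score(p, outcome) for p in beliefs]
    total = sum(wagers)
    avg = sum(m * s for m, s in zip(wagers, scores)) / total
    return [m * (1.0 + s - avg) for m, s in zip(wagers, scores)]
```

With beliefs 0.9 and 0.2 on an event that occurs, equal wagers of 10 are redistributed so the accurate bettor receives more than she staked while the total payout stays at 20.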
  2. The internet advertising market is a multibillion-dollar industry in which advertisers buy thousands of ad placements every day by repeatedly participating in auctions. An important and ubiquitous feature of these auctions is the presence of campaign budgets, which specify the maximum amount the advertisers are willing to pay over a specified time period. In this paper, we present a new model to study the equilibrium bidding strategies in standard auctions, a large class of auctions that includes first and second price auctions, for advertisers who satisfy budget constraints on average. Our model dispenses with the common yet unrealistic assumption that advertisers' values are independent and instead assumes a contextual model in which advertisers determine their values using a common feature vector. We show the existence of a natural value pacing-based Bayes-Nash equilibrium under very mild assumptions. Furthermore, we prove a revenue equivalence showing that all standard auctions yield the same revenue even in the presence of budget constraints. Leveraging this equivalence, we prove price of anarchy bounds for liquid welfare and structural properties of pacing-based equilibria that hold for all standard auctions. In recent years, the internet advertising market has adopted first price auctions as the preferred paradigm for selling advertising slots. Our work thus takes an important step toward understanding the implications of the shift to first price auctions in internet advertising markets by studying how the choice of the selling mechanism impacts revenues, welfare, and advertisers' bidding strategies. This paper was accepted by Itai Ashlagi, revenue management and market analytics. Supplemental Material: The online appendix is available at https://doi.org/10.1287/mnsc.2023.4719.
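    Value pacing, the central object in the equilibria above, shades an advertiser's true value by a multiplier chosen so that average spend meets the budget. The toy below runs a dual-gradient pacing heuristic through repeated second-price auctions against fixed rival bids; the step size `eta` and the dynamics are illustrative assumptions, not the paper's equilibrium construction:

```python
def simulate_pacing(values, rival_bids, budget, eta=0.05):
    """Dual-gradient pacing in repeated second-price auctions:
    bid value/(1+mu) and nudge mu up when spend exceeds the
    per-round budget rate, down (but not below 0) otherwise."""
    mu, spend = 0.0, 0.0
    target = budget / len(values)          # per-round budget rate
    for v, b in zip(values, rival_bids):
        bid = v / (1.0 + mu)
        pay = b if bid > b else 0.0        # second-price payment on a win
        spend += pay
        mu = max(0.0, mu + eta * (pay - target))
    return spend, mu
```

With 100 rounds, value 1.0, rival bids of 0.5, and a budget of 30, the multiplier rises until the advertiser starts losing auctions, so total spend stays strictly below the unpaced spend of 50.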
  3. Electricity markets are cleared by a two-stage, sequential process consisting of a forward (day-ahead) market and a spot (real-time) market. While their design goal is to achieve efficiency, the lack of sufficient competition introduces many opportunities for price manipulation. To discourage this phenomenon, some Independent System Operators (ISOs) mandate generators to submit (approximately) truthful bids in the day-ahead market. However, without fully accounting for all participants' incentives (generators and loads), the application of such a mandate may lead to unintended consequences. In this paper, we model and study the interactions of generators and inelastic loads in a two-stage settlement where generators are required to bid truthfully in the day-ahead market. We show that such a mandate, when accounting for generator and load incentives, leads to a generalized Stackelberg-Nash game where load decisions (leaders) are made in the day-ahead market and generator decisions (followers) are relegated to the real-time market. Furthermore, the use of conventional supply function bidding for generators in real time does not guarantee the existence of a Nash equilibrium. This motivates the use of intercept bidding as an alternative bidding mechanism for generators in the real-time market. An equilibrium analysis in this setting leads to a closed-form solution that unveils several insights. In particular, it shows that, unlike standard two-stage markets, loads are the winners of the competition in the sense that their aggregate payments are less than those of the competitive equilibrium. Moreover, heterogeneity in generators' costs has the unintended effect of mitigating loads' market power. Numerical studies validate and further illustrate these insights.
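    The two-stage settlement underlying this analysis is easy to state numerically. The toy function below (names and numbers are illustrative, not the paper's model) computes an inelastic load's total payment when a forward quantity clears at the day-ahead price and the remaining deviation settles at the real-time price, which is what gives strategic loads a lever over their procurement split:

```python
def two_stage_payment(demand, q_da, p_da, p_rt):
    """Total payment of an inelastic load in a two-stage settlement:
    q_da MWh bought forward at the day-ahead price p_da, with the
    deviation (demand - q_da) settled at the real-time price p_rt."""
    return q_da * p_da + (demand - q_da) * p_rt
```

If the load expects the real-time price to exceed the day-ahead price, it profits by buying more forward (and vice versa); only when the two prices coincide is the payment independent of the forward quantity.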
  4. This work investigates the potential of using aggregate controllable loads and energy storage systems from multiple heterogeneous feeders to jointly optimize a utility's energy procurement cost from the real-time market and its revenue from ancillary service markets. Toward this, we formulate an optimization problem that co-optimizes real-time and energy reserve markets based on real-time and ancillary service market prices, along with available solar power, storage, and demand data from each of the feeders within a single distribution network. The optimization, which includes all network system constraints, provides real/reactive power and energy storage set-points for each feeder as well as a schedule for the aggregate system's participation in the two types of markets. We evaluate the performance of our algorithm using several trace-driven simulations based on a real-world circuit of a New Jersey utility. The results demonstrate that active participation through controllable loads and storage significantly reduces the utility's net costs, i.e., real-time energy procurement costs minus ancillary market revenues.
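    The co-optimization above trades real-time energy revenue off against ancillary (reserve) revenue interval by interval. Stripped of the network constraints, storage dynamics, and feeder coupling of the actual formulation, the core price comparison can be sketched as follows; all names and prices are illustrative assumptions:

```python
def dispatch_storage(p_rt, p_res, p_max):
    """Per-interval choice for a storage unit of power capacity p_max:
    offer the capacity to whichever market pays more (energy sale at
    the real-time price vs. a reserve capacity payment)."""
    schedule = []
    for e, r in zip(p_rt, p_res):
        if r >= e:
            schedule.append(("reserve", p_max, r * p_max))
        else:
            schedule.append(("energy", p_max, e * p_max))
    return schedule

def total_revenue(schedule):
    """Sum the revenue column of a dispatch schedule."""
    return sum(rev for _, _, rev in schedule)
```

In a two-interval example with real-time prices (20, 60) and reserve prices (30, 10), the unit sells reserve first and energy second, illustrating why joint participation dominates committing to either market alone.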
  5. Power grids are evolving at an unprecedented pace due to the rapid growth of distributed energy resources (DER) in communities. These resources are very different from traditional power sources as they are located closer to loads and thus can significantly reduce transmission losses and carbon emissions. However, their intermittent and variable nature often results in spikes in the overall demand on distribution system operators (DSO). To manage these challenges, there has been a surge of interest in building decentralized control schemes, where a pool of DERs combined with energy storage devices can exchange energy locally to smooth fluctuations in net demand. Building a decentralized market for transactive microgrids is challenging because even though a decentralized system provides resilience, it also must satisfy requirements like privacy, efficiency, safety, and security, which are often in conflict with each other. As such, existing implementations of decentralized markets often focus on resilience and safety but compromise on privacy. In this paper, we describe our platform, called TRANSAX, which enables participants to trade in an energy futures market, which improves efficiency by finding feasible matches for energy trades, enabling DSOs to plan their energy needs better. TRANSAX provides privacy to participants by anonymizing their trading activity using a distributed mixing service, while also enforcing constraints that limit trading activity based on safety requirements, such as keeping planned energy flow below line capacity. We show that TRANSAX can satisfy the seemingly conflicting requirements of efficiency, safety, and privacy. We also provide an analysis of how much trading efficiency is lost. Trading efficiency is improved through a problem formulation that accounts for temporal flexibility, and system efficiency is improved using a hybrid-solver architecture. Finally, we describe a testbed to run experiments and demonstrate its performance using simulation results.
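    The feasibility-constrained matching that TRANSAX performs can be approximated by a greedy double auction: pair the cheapest asks with the highest bids while keeping the matched energy within a line-capacity limit. The sketch below is a toy stand-in for the platform's hybrid-solver matching, with illustrative names and no privacy or temporal-flexibility handling:

```python
def match_trades(sells, buys, line_capacity):
    """Greedy double-auction match of (price, qty) sell and buy offers.
    Matches stop once asks exceed bids or the cumulative matched
    energy reaches the line capacity (the safety constraint)."""
    sells = sorted(sells)                  # asks, cheapest first
    buys = sorted(buys, reverse=True)      # bids, highest first
    matched, flow = [], 0.0
    si = bi = 0
    while si < len(sells) and bi < len(buys):
        sp, sq = sells[si]
        bp, bq = buys[bi]
        if sp > bp or flow >= line_capacity:
            break                          # no surplus or line is full
        q = min(sq, bq, line_capacity - flow)
        matched.append((sp, bp, q))
        flow += q
        sells[si] = (sp, sq - q)
        buys[bi] = (bp, bq - q)
        if sells[si][1] == 0:
            si += 1
        if buys[bi][1] == 0:
            bi += 1
    return matched, flow
```

Every match pairs an ask at or below the bid, and the total matched flow never exceeds the capacity cap, mirroring the safety requirement of keeping planned energy flow below line capacity.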