Open-end mutual funds offer investors same-day liquidity while holding assets that in some cases take several days to sell. This liquidity transformation creates a potentially destabilizing first-mover advantage: When asset prices fall, investors who exit a fund earlier may pass the liquidation costs generated by their share redemptions to investors who remain in the fund. This incentive becomes particularly acute in periods of market stress, and it can amplify fire-sale spillover losses to other market participants. Swing pricing is a liquidity management tool that targets this first-mover advantage. It allows a fund to adjust or “swing” its net asset value (NAV) in response to large flows into or out of the fund. This article discusses the industry and regulatory context for swing pricing, and it reviews theory and empirical evidence on the design and effectiveness of swing pricing. The article concludes with directions for further research.
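As a rough illustration of the mechanism described above (a sketch, not the article's model), a "full" swing-pricing rule adjusts the NAV whenever the day's net flow crosses a threshold. The threshold and swing-factor values below are purely illustrative assumptions:

```python
def swung_nav(nav: float, net_flow: float, total_net_assets: float,
              swing_threshold: float = 0.01, swing_factor: float = 0.005) -> float:
    """Full swing pricing sketch: swing the NAV when the day's net flow,
    as a fraction of fund assets, crosses a threshold.

    nav              -- unadjusted net asset value per share
    net_flow         -- net subscriptions (+) or redemptions (-), in currency units
    total_net_assets -- the fund's total net assets
    All parameter defaults are illustrative, not calibrated.
    """
    flow_ratio = net_flow / total_net_assets
    if flow_ratio <= -swing_threshold:   # large net redemptions:
        return nav * (1 - swing_factor)  # swing down so exiting investors bear costs
    if flow_ratio >= swing_threshold:    # large net subscriptions:
        return nav * (1 + swing_factor)  # swing up, symmetrically
    return nav
```

With a $100 NAV and net redemptions equal to 5% of fund assets, redeeming investors would transact at $99.50, so the liquidation cost is borne by those exiting rather than by investors who remain in the fund.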
Asset Specificity of Nonfinancial Firms
Abstract: We develop a new data set to study asset specificity among nonfinancial firms. Our data cover the liquidation values of each category of assets on firms’ balance sheets and provide information across major industries. First, we find that nonfinancial firms have high asset specificity. For example, the liquidation value of fixed assets is 35% of the net book value in the average industry. Second, we analyze the determinants of asset specificity and document that assets’ physical attributes (e.g., mobility, durability, and customization) play a crucial role. Third, we investigate several implications. Consistent with theories of investment irreversibility, high asset specificity is associated with less disinvestment and stronger effects of uncertainty on investment activities. We also find that the increasing prevalence of intangible assets has not significantly reduced firms’ liquidation values.
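As a back-of-the-envelope illustration of the headline statistic, an industry's liquidation-to-book ratio is a recovery-rate-weighted average over asset categories. The category names and recovery rates below are hypothetical, not figures from the paper's data:

```python
# Hypothetical fixed-asset categories with net book values (in $M)
# and assumed per-category recovery rates in liquidation.
book_values = {"real_estate": 40.0, "machinery": 30.0,
               "vehicles": 10.0, "it_equipment": 20.0}
recovery_rates = {"real_estate": 0.55, "machinery": 0.30,
                  "vehicles": 0.50, "it_equipment": 0.10}

# Liquidation value of the whole asset base, category by category.
liquidation_value = sum(book_values[k] * recovery_rates[k] for k in book_values)
total_book = sum(book_values.values())

# The industry-level ratio analogous to the paper's 35%-of-book statistic.
specificity_ratio = liquidation_value / total_book
```

Under these assumed numbers the ratio is 0.38: assets that are mobile and standardized (vehicles, real estate) pull the ratio up, while customized or rapidly obsolescing assets (IT equipment) pull it down, matching the paper's point that physical attributes drive specificity.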
- Award ID(s):
- 2144769
- PAR ID:
- 10437710
- Date Published:
- Journal Name:
- The Quarterly Journal of Economics
- Volume:
- 138
- Issue:
- 1
- ISSN:
- 0033-5533
- Page Range / eLocation ID:
- 205 to 264
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
GitGuardian, which monitors secrets exposure in public GitHub repositories, reported that developers leaked over 12 million secrets (database and other credentials) in 2023, a 113% increase from 2021. Despite the availability of secret detection tools, developers often ignore the tools' warnings because of high false-positive rates (25%−99%). However, each secret protects assets of different value, accessible through asset identifiers (e.g., a DNS name or a public or private IP address). The asset information for a secret can aid developers in filtering false positives and prioritizing secret removal from the source code. Existing secret detection tools do not provide this asset information, leaving developers to judge each reported secret by its value alone or to find the corresponding assets manually. The goal of our study is to aid software practitioners in prioritizing secret removal by providing, through our novel static analysis tool, the asset information protected by each secret. We present AssetHarvester, a static analysis tool that detects secret-asset pairs in a repository. Since an asset can be defined far from where its secret appears, we investigated secret-asset co-location patterns and found four patterns. To identify secret-asset pairs across the four patterns, we utilized three approaches: pattern matching, data flow analysis, and fast-approximation heuristics. We curated a benchmark of 1,791 secret-asset pairs of four database types, extracted from 188 public GitHub repositories, to evaluate the performance of AssetHarvester. AssetHarvester demonstrates 97% precision, 90% recall, and a 94% F1-score in detecting secret-asset pairs. Our findings indicate that the data flow analysis employed in AssetHarvester detects secret-asset pairs with 0% false positives and aids in improving the recall of secret detection tools. Additionally, AssetHarvester shows a 43% increase in precision for database secret detection compared to existing detection tools through the detection of assets, thus reducing developers' alert fatigue.
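To make the secret-asset pairing idea concrete, here is a minimal sketch of the simplest case: the secret (a password) and the asset identifier (a host) appear together in one database connection string, so pattern matching alone suffices. The regex and function below are illustrative, not AssetHarvester's actual implementation:

```python
import re

# Match common database URLs of the form scheme://user:password@host[:port].
# Named groups let us pull out the secret and the asset it protects.
CONN_RE = re.compile(
    r"(?P<scheme>postgres|mysql)://(?P<user>[^:]+):(?P<password>[^@]+)"
    r"@(?P<host>[^:/\s\"']+)(?::(?P<port>\d+))?"
)

def extract_secret_asset(source: str) -> list[dict]:
    """Return (secret, asset) pairs found in a source string."""
    return [
        {"secret": m.group("password"),
         "asset": m.group("host"),       # DNS name or IP address
         "db": m.group("scheme")}
        for m in CONN_RE.finditer(source)
    ]

snippet = 'DB_URL = "postgres://admin:s3cr3t@db.internal.example.com:5432/app"'
pairs = extract_secret_asset(snippet)
```

This covers only one of the four co-location patterns the abstract mentions; when the secret and asset are assigned in separate variables, files, or configuration sources, the data flow analysis described above is needed to connect them.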
A basic assumption of traditional reinforcement learning is that the value of a reward does not change once it is received by an agent. The present work forgoes this assumption and considers the situation where the value of a reward decays proportionally to the time elapsed since it was obtained. Emphasizing the inflection point occurring at the time of payment, we use the term asset to refer to a reward that is currently in the possession of an agent. Adopting this language, we initiate the study of depreciating assets within the framework of infinite-horizon quantitative optimization. In particular, we propose a notion of asset depreciation, inspired by classical exponential discounting, where the value of an asset is scaled by a fixed discount factor at each time step after it is obtained by the agent. We formulate an equational characterization of optimality in this context, establish that optimal values and policies can be computed efficiently, and develop a model-free reinforcement learning approach to obtain optimal policies.
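A minimal sketch of the depreciation notion: an asset obtained at step t and held until step T is worth its original amount scaled by gamma^(T − t), so the agent's wealth at T sums these depreciated values. The function below illustrates the bookkeeping only; it is not the paper's optimization algorithm:

```python
def depreciated_wealth(assets: list[tuple[int, float]],
                       gamma: float, horizon: int) -> float:
    """Total value at time `horizon` of assets obtained at earlier steps.

    assets  -- list of (time_obtained, amount) pairs
    gamma   -- per-step depreciation factor in (0, 1]
    Each asset's value is scaled by gamma once per elapsed step since
    it was received, mirroring the exponential-depreciation notion.
    """
    return sum(amount * gamma ** (horizon - t) for t, amount in assets)
```

Note the contrast with classical discounting: there, a reward's contribution is fixed at the moment it is received; here, an asset obtained early keeps losing value the longer the horizon runs, so an optimal policy may prefer later rewards even without any discounting of the future.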
Given a set of securities or assets, it is of interest to find an optimal way of investing in them, where what counts as optimal must be specified; the objective is then to optimize the return consistent with that specification. When there are several correlated assets, it is unlikely that all of them will rise together, so diversifying one's assets is necessary for a secure return, and a combination of the assets should be considered, with constraints as needed. One classical approach is the Markowitz mean-variance model, in which portfolio variance is minimized subject to constraints. In this paper, neural networks and machine learning are used to extend the ways of dealing with portfolio asset allocation. Solving the portfolio selection problem efficiently makes the use of heuristic algorithms imperative; past heuristic methods have been based mainly on evolutionary algorithms, tabu search, and simulated annealing. The purpose of this paper is to consider a particular neural network model, the Hopfield network, which has been used to solve other optimisation problems, and apply it here to the portfolio selection problem, comparing the new results to those obtained with previous heuristic algorithms. Although great success has been achieved in portfolio analysis since the birth of the Markowitz model, the demand for timely decision making has increased significantly in recent years with the advancement of high-frequency trading (HFT), which combines powerful computing servers with the fastest Internet connections to trade at extremely high speeds. This demand poses new challenges for portfolio solvers that must process time-varying parameters in real time. Neural networks, among the most powerful machine learning tools, have seen great progress in recent years in financial data analysis and signal processing ([1], [14]). Using computational methods such as machine learning and data analytics to empower conventional finance is becoming a trend widely adopted by leading investment companies ([3]).
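To make the mean-variance objective concrete, here is the closed-form minimum-variance solution for the two-asset case, a textbook special case of the Markowitz model (not the paper's Hopfield-network method, which handles the general constrained problem):

```python
def min_variance_weights(var1: float, var2: float, cov12: float) -> tuple[float, float]:
    """Two-asset minimum-variance portfolio weights (weights sum to 1).

    Minimizing  w^2*var1 + (1-w)^2*var2 + 2*w*(1-w)*cov12  over w
    and setting the derivative to zero gives the closed form below.
    """
    w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12)
    return w1, 1.0 - w1
```

For example, with variances 0.04 and 0.09 and zero covariance, the low-variance asset gets weight 9/13 ≈ 0.69. With many assets and realistic constraints (budget, no short sales, cardinality limits) no such closed form exists, which is precisely why the heuristic and neural-network solvers discussed above become necessary.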
To support people at the end of life as they create management plans for their assets, planning approaches like estate planning are increasingly considering data. HCI scholarship has argued that developing more effective planning approaches to support end-of-life data planning is important. However, empirical research is needed to evaluate specific approaches and identify design considerations. To support end-of-life data planning, this paper presents a qualitative study evaluating two approaches to co-designing end-of-life data plans with participants. We find that asset-first inventory-centric approaches, common in material estate planning, may be ineffective when making plans for data. In contrast, heavily facilitated, mission-driven, relationship-centric approaches were more effective. This study expands previous research by validating the importance of starting end-of-life data planning with relationships and values, and highlights collaborative facilitation as a critical part of successful data planning approaches.

