Title: Market mechanism to enable grid-aware dispatch of Aggregators in radial distribution networks
This paper presents a market-based optimization framework wherein Aggregators compete for nodal capacity across a distribution feeder, with the guarantee that allocated flexible capacity cannot cause overloads or congestion. The mechanism thus allows Aggregators with allocated capacity to pursue services at the wholesale market level to maximize the revenue of their flexible resources. Based on Aggregator bids of capacity (MW) and network access price ($/MW), the distribution system operator (DSO) formulates an optimization problem that allocates capacity to the different Aggregators across the network while implicitly respecting AC network constraints. This grid-aware allocation is obtained by incorporating a convex inner approximation into the optimization framework that prioritizes hosting capacity among Aggregators. We adapt concepts from transmission-level capacity market clearing, utility demand charges, and Internet-like bandwidth allocation rules to distribution system operations by incorporating nodal voltage and transformer constraints into the optimization framework. Simulation-based results on IEEE distribution networks showcase the effectiveness of the approach.
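As a rough illustration of this clearing step (not the paper's exact formulation), the sketch below clears hypothetical capacity bids with a linear program: allocations are capped by each Aggregator's bid, a shared transformer rating, and an invented linearized voltage-sensitivity row standing in for the convex inner approximation. All quantities and sensitivities are made up for illustration.

```python
# Hedged sketch of a DSO capacity-clearing problem: maximize bid-weighted
# allocated capacity subject to bid caps, a shared transformer rating, and
# a linearized voltage constraint standing in for the convex inner
# approximation. All numbers are invented for illustration.
import numpy as np
from scipy.optimize import linprog

bids_mw = np.array([4.0, 6.0, 3.0])      # capacity requested per aggregator (MW)
prices = np.array([12.0, 9.0, 15.0])     # offered network access price ($/MW)
xfmr_limit = 10.0                        # shared transformer rating (MW)
dv_dp = np.array([0.004, 0.007, 0.005])  # hypothetical voltage sensitivities (p.u./MW)
v_headroom = 0.05                        # allowed voltage deviation (p.u.)

# linprog minimizes, so negate prices to maximize DSO revenue.
res = linprog(
    c=-prices,
    A_ub=np.vstack([np.ones_like(prices), dv_dp]),  # transformer and voltage caps
    b_ub=np.array([xfmr_limit, v_headroom]),
    bounds=[(0.0, q) for q in bids_mw],             # cannot exceed each bid
)
print("allocated MW per aggregator:", np.round(res.x, 2))
```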
Award ID(s):
2047306
NSF-PAR ID:
10397906
Author(s) / Creator(s):
;
Date Published:
Journal Name:
11TH BULK POWER SYSTEMS DYNAMICS AND CONTROL SYMPOSIUM
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This paper considers optimization problems for energy demand networks that include aggregators and investigates the strategic behavior of those aggregators. The participants in the network are a utility company, which acts as the energy supply source, aggregators, and a large number of consumers. We suppose the network is optimized by price-response-based, i.e., market-based, optimization processes. We further suppose that each aggregator has a strategic parameter in its cost function and, by choosing that parameter strategically, pursues its own benefit. This general formulation is applied to a specific setting in which the aggregator owns two types of battery storage: one high-performance and expensive, the other low-performance and cheap. The aggregator chooses the total storage capacity to be installed and the ratio of high-performance to low-performance storage as its strategic parameters, aiming to increase its own benefit. Through numerical examples, we show that such strategic decision making by the aggregator can provide useful insights for the qualitative analysis of energy demand networks.
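The strategic choice described in this abstract lends itself to a simple numerical illustration. The sketch below grid-searches the two strategic parameters (total capacity, high-performance ratio) against a made-up benefit function; the prices, efficiencies, costs, and cycle count are invented, not taken from the paper.

```python
# Illustrative grid search over the aggregator's two strategic parameters.
# The benefit function is a stand-in: revenue grows with usable capacity,
# while capex depends on the mix of expensive high-performance and cheap
# low-performance storage. All constants are invented.
import numpy as np

def benefit(total_kwh, hi_ratio,
            price=0.12, eff_hi=0.95, eff_lo=0.80,
            cost_hi=300.0, cost_lo=120.0, horizon_cycles=2000):
    usable = total_kwh * (hi_ratio * eff_hi + (1 - hi_ratio) * eff_lo)
    revenue = price * usable * horizon_cycles          # lifetime revenue ($)
    capex = total_kwh * (hi_ratio * cost_hi + (1 - hi_ratio) * cost_lo)
    return revenue - capex

caps = np.linspace(10, 500, 50)      # candidate total capacities (kWh)
ratios = np.linspace(0.0, 1.0, 21)   # candidate high-performance ratios
grid = np.array([[benefit(c, r) for r in ratios] for c in caps])
i, j = np.unravel_index(grid.argmax(), grid.shape)
print(f"best: {caps[i]:.0f} kWh at hi-ratio {ratios[j]:.2f}")
```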
  2. The presented methodology yields an optimal portfolio of resilience-oriented resource allocations under weather-related risks. Pre-event mitigations improve the capacity of the transportation system to absorb shocks from future natural hazards, contributing to risk reduction, while post-event recovery planning enhances the system's ability to bounce back rapidly, promoting network resilience. Given the complexity of the problem, owing to hazard uncertainty and the impact of pre-event decisions on post-event planning, this study formulates a nonlinear two-stage stochastic programming (NTSSP) model that minimizes direct construction investment and indirect costs across both the pre-event mitigation and post-event recovery stages. In the model, the first stage prioritizes a group of bridges to be retrofitted or repaired to improve the system's robustness and redundancy. The second stage models the uncertain occurrence of a natural hazard of any potential intensity at any possible network location, with the damaged state of the network depending on the first-stage mitigation decisions. While prior research has optimized pre-event or post-event efforts separately, few studies address both stages in a single framework, and even those are limited in application because they consider small networks with few assets. The NTSSP model addresses this gap and builds a large-scale, data-driven simulation environment. To solve the NTSSP model efficiently, a hybrid heuristic method combining an evolution strategy with high-performance parallel computing is applied, accelerating the evolutionary process and reducing computing time. The model is implemented on a test-bed transportation network in Iowa under flood hazards. The results show that the NTSSP model balances economy and efficiency in risk mitigation within the budgeted investment while consistently providing a resilient system throughout the full two-stage course.
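To make the two-stage structure concrete, here is a toy sample-average sketch with invented bridges, costs, and flood scenarios (the NTSSP model itself is nonlinear and far larger): stage one picks retrofits under a budget, and stage two pays scenario-dependent recovery costs that shrink for retrofitted bridges.

```python
# Toy two-stage stochastic program in the spirit of NTSSP (all data invented).
# Stage 1: choose bridges to retrofit within a budget.
# Stage 2: pay recovery costs per flood scenario; retrofitted bridges cost less.
from itertools import combinations

retrofit_cost = {"B1": 4.0, "B2": 2.5, "B3": 3.0}   # retrofit cost ($M)
damage = [                                          # recovery cost per scenario ($M)
    {"B1": 10.0, "B2": 1.0, "B3": 2.0},
    {"B1": 4.0, "B2": 8.0, "B3": 6.0},
]
budget = 7.0
mitigation = 0.3  # retrofitting cuts a bridge's recovery cost to 30%

def expected_total_cost(retrofitted):
    stage1 = sum(retrofit_cost[b] for b in retrofitted)
    stage2 = sum(
        sum(c * (mitigation if b in retrofitted else 1.0) for b, c in scen.items())
        for scen in damage
    ) / len(damage)  # expectation over equiprobable scenarios
    return stage1 + stage2

feasible = (
    set(s) for r in range(len(retrofit_cost) + 1)
    for s in combinations(retrofit_cost, r)
    if sum(retrofit_cost[b] for b in s) <= budget
)
best = min(feasible, key=expected_total_cost)
print(sorted(best), round(expected_total_cost(best), 2))  # -> ['B1', 'B2'] 13.95
```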

     
  3. This Article develops a framework for both assessing and designing content moderation systems consistent with public values. It argues that moderation should not be understood as a single function, but as a set of subfunctions common to all content governance regimes. By identifying the particular values implicated by each of these subfunctions, it explores how the constituent tasks might best be allocated: specifically, to which actors (public or private, human or technological) they might be assigned, and what constraints or processes might be required in their performance. This analysis can facilitate the evaluation and design of content moderation systems so as to ensure the capacity and competencies necessary for legitimate, distributed systems of content governance. Through a combination of methods, legal schemes delegate at least a portion of the responsibility for governing online expression to private actors. Some statutory schemes assign regulatory tasks explicitly; in others, the delegation occurs implicitly, with little guidance as to how the treatment of content should be structured. In the law's shadow, online platforms are largely given free rein to configure the governance of expression. Legal scholarship has surfaced important concerns about the private sector's role in content governance. In response, private platforms engaged in content moderation have adopted structures that mimic public governance forms. Yet we largely lack the means to measure whether these forms are substantive, effectively infusing public values into the content moderation process, or merely symbolic artifice designed to deflect much-needed public scrutiny. This Article's proposed framework addresses that gap in two ways. First, the framework considers together all manner of legal regimes that induce platforms to engage in the function of content moderation. Second, it focuses on the shared set of specific tasks, or subfunctions, involved in the content moderation function across these regimes. Examining a broad range of content moderation regimes together highlights the distinct common tasks and decision points that constitute the content moderation function. Focusing on this shared set of subfunctions highlights the different values implicated by each and the ways they can be "handed off" to human and technical actors, with varying normative and political implications. This Article identifies four key content moderation subfunctions: (1) definition of policies, (2) identification of potentially covered content, (3) application of policies to specific cases, and (4) resolution of those cases. Using these four subfunctions supports a rigorous analysis of how to leverage the capacities and competencies of government and private parties throughout the content moderation process. Such attention also highlights how the exercise of that power can be constrained, either by requiring the use of particular decision-making processes or through limits on the use of automation, in ways that further address normative concerns. Dissecting the allocation of subfunctions in various content moderation regimes reveals the distinct ethical and political questions that arise in alternate configurations.
Specifically, it offers a way to think about four key questions: (1) which values are most at issue for each subfunction; (2) which activities might be more appropriately delegated to particular public or private actors; (3) which constraints must attach to the delegation of each subfunction; and (4) where investments in shared content moderation infrastructures can support relevant values. The functional framework thus provides a means both for evaluating the symbolic legal forms that firms have constructed in service of content moderation and for designing processes that better reflect public values.
  4. Efficient resource allocation and management can enhance the capacity of an optical backbone network. In this context, spectrum retuning via hitless defragmentation has been proposed for elastic optical networks to improve spectrum accommodation while reducing unused fragmented spaces in the spectrum. However, the quality of service committed in a service-level agreement may be affected by spectrum retuning. In particular, for transmission beyond the conventional C band, the presence of inter-channel stimulated Raman scattering can severely degrade signal quality during defragmentation. To overcome this problem, this paper proposes, for the first time to our knowledge, a signal-quality-aware proactive defragmentation scheme for the C+L band system. The proposed scheme jointly targets minimizing the fragmentation index and maintaining quality of transmission (QoT) under two defragmentation algorithms: nonlinear-impairment (NLI)-aware defragmentation (NAD) and NLI-unaware defragmentation (NUD). We leverage machine learning techniques for QoT estimation of ongoing lightpaths during spectrum retuning. The optical signal-to-noise ratio of a lightpath is predicted for each choice of spectrum retuning, which helps monitor the effect of defragmentation on the quality of ongoing lightpaths (in terms of assigned modulation format). Numerical results show that, compared to the baseline algorithm (NUD), the proposed NAD algorithm provides up to a 15% capacity increase for smaller networks such as BT-UK, while for larger networks such as the 24-node USA network, a 23% capacity benefit is achieved in terms of the number of served demands at 1% blocking.
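The abstract does not define its fragmentation index; one widely used choice is the external-fragmentation metric sketched below on a binary slot-occupancy vector. This is illustrative only, and the paper's index may differ.

```python
# External fragmentation of a spectrum: 1 - (largest free block / total free
# slots). 0 means all free slots are contiguous; values near 1 mean the free
# space is badly scattered. A common metric; the paper's index may differ.
def external_fragmentation(occupied):
    run = largest = free = 0
    for slot in occupied:
        run = 0 if slot else run + 1   # length of the current free run
        largest = max(largest, run)
        free += 0 if slot else 1
    return 0.0 if free == 0 else 1.0 - largest / free

spectrum = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0]    # 1 = occupied slot
print(round(external_fragmentation(spectrum), 3))  # 1 - 3/7 ~= 0.571
```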

     
  5. This paper focuses on optimizing resource allocation amongst a set of tenants (network slices) supporting dynamic customer loads over a set of distributed resources, e.g., base stations. The aim is to reap the benefits of statistical multiplexing from flexible sharing of 'pooled' resources while enabling tenants to differentiate and protect their performance from one another's load fluctuations. To that end, we consider a setting where resources are grouped into Virtual Resource Pools (VRPs) within which resource allocation is jointly and dynamically managed. Specifically, for each VRP we adopt a Share-Constrained Proportionally Fair (SCPF) allocation scheme in which each tenant is allocated a fixed share (budget). This budget is distributed equally amongst the tenant's active customers, which are in turn granted fractions of their associated VRP resources in proportion to their customer shares. For a VRP with a single resource, this reduces to the well-known Generalized Processor Sharing (GPS) policy; for VRPs with multiple resources, SCPF provides a flexible means to achieve load-elastic allocations across the tenants sharing the pool. Given tenants' per-resource shares and expected loads, this paper formulates the problem of determining optimal VRP partitions that maximize the overall expected share-weighted utility while ensuring protection guarantees. For a high load/capacity setting we exhibit this network utility function explicitly, quantifying the benefits and penalties of any VRP partition in terms of network slices' ability to achieve performance differentiation, load balancing, and statistical multiplexing. Although the problem is shown to be NP-hard, a simple greedy heuristic proves effective. Analysis and simulations confirm that selecting optimal VRP partitions provides a practical avenue toward improving network utility in network slicing scenarios with dynamic loads.
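The SCPF rule described in this abstract is easy to state in code. The sketch below, with invented tenants, base stations, shares, and capacities, splits each tenant's share equally over its active customers and divides each resource in proportion to the shares of the customers attached to it; with a single resource this reduces to GPS.

```python
# Minimal SCPF sketch: each tenant's share is split equally over its active
# customers; each resource is divided in proportion to the customer shares
# present there. Tenants, base stations, and numbers are invented.
tenant_share = {"A": 2.0, "B": 1.0}
customers = [("A", "bs1"), ("A", "bs1"), ("A", "bs2"), ("B", "bs1")]
capacity = {"bs1": 90.0, "bs2": 30.0}  # e.g., Mbps at each base station

n_active = {t: sum(1 for v, _ in customers if v == t) for t in tenant_share}
weights = [tenant_share[t] / n_active[t] for t, _ in customers]  # per-customer share

for res, cap in capacity.items():
    total = sum(w for (t, r), w in zip(customers, weights) if r == res)
    for i, ((t, r), w) in enumerate(zip(customers, weights)):
        if r == res:
            print(f"customer {i} (tenant {t}) on {res}: {cap * w / total:.1f}")
```

With one customer on bs2, that customer receives the full 30.0, matching the GPS special case; on bs1 the three attached customers split 90.0 in the ratio 2/3 : 2/3 : 1.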