Title: Preparedness and Response in the Century of Disasters: Overview of Information Systems Research Frontiers
Abstract: “The Century of Disasters” refers to the increased frequency, complexity, and magnitude of natural and man-made disasters witnessed in the 21st century; the impact of such disasters is exacerbated by infrastructure vulnerabilities, population growth and urbanization, and a challenging policy landscape. Technology-enabled disaster management (TDM) has an important role to play in the Century of Disasters. We highlight four important trends related to TDM: smart technologies and resilience, digital humanitarianism, integrated decision-support and agility, and artificial intelligence–enabled early warning systems. We then describe how the confluence of these trends leads to four research frontiers for information systems researchers, namely the technology-preparedness paradox, socio-technical crisis communication, predicting and prescribing under uncertainty, and fair pipelines, and discuss how the eight articles in the special section are helping us learn about these frontiers. History: Senior editor, Suprateek Sarker. Funding: This study was funded by the National Science Foundation (NSF) [Grants 2240347 and IIS-2039915]. H. R. Rao is also supported in part by the NSF [Grant 2020252]. The usual disclaimer applies.
Award ID(s):
2240347
PAR ID:
10659452
Author(s) / Creator(s):
 ;  ;  ;  
Editor(s):
Sarker, S
Publisher / Repository:
INFORMS
Date Published:
Journal Name:
Information Systems Research
Volume:
35
Issue:
2
ISSN:
1047-7047
Page Range / eLocation ID:
460 to 468
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Individually, both droughts and pandemics cause disruptions to global food supply chains. The 21st century has seen the frequent occurrence of both natural and human disasters, including droughts and pandemics. Together, their impacts can be compounded, leading to severe economic stress and malnutrition, particularly in developing countries. Understanding how droughts and pandemics interact, and identifying appropriate policies to address them together and separately, is important for maintaining a robust global food supply. Herein, we assess the impacts of each of these disasters in the context of food and agriculture and then discuss their compounded effect.
  2. Understanding population changes across long time scales and at fine spatiotemporal resolutions is important for confronting a broad suite of conservation challenges. However, this task is hampered by a lack of quality long-term census data for multiple species collected across large geographic regions. Here, we used century-long (1919–2018) data from the Audubon Christmas Bird Count (CBC) survey to assess population changes in over 300 avian species in North America and evaluate their temporal non-stationarity. To estimate population sizes across the entire century, we employed a Bayesian hierarchical model that accounts for species detection probabilities, variable sampling effort, and missing data. We evaluated population trends using generalized additive models (GAMs) and assessed temporal non-stationarity in the rate of population change by extracting the first derivatives from the fitted GAM functions. We then summarized the population dynamics across species, space, and time using a non-parametric clustering algorithm that categorized individual population trends into four distinct trend clusters. We found that species varied widely in their population trajectories, with over 90% of species showing a considerable degree of spatial and/or temporal non-stationarity, and many showing strong shifts in the direction and magnitude of population trends throughout the past century. Species were roughly equally distributed across the four clusters of population trajectories, although grassland, forest, and desert specialists more commonly showed declining trends. Interestingly, for many species, region-wide population trends often differed from those observed at individual sites, suggesting that conservation decisions need to be tailored to fine spatial scales. Together, our results highlight the importance of considering spatial and temporal non-stationarity when assessing long-term population changes. More generally, we demonstrate the promise of novel statistical techniques for improving the utility and extending the temporal scope of existing citizen science datasets. (An illustrative sketch of this smooth-then-cluster pipeline appears after this list.)
  3. Events such as the Facebook-Cambridge Analytica scandal and data aggregation efforts by technology providers have illustrated how fragile modern society is to privacy violations. Internationally recognized entities such as the National Science Foundation (NSF) have indicated that Artificial Intelligence (AI)-enabled models, artifacts, and systems can efficiently and effectively sift through large quantities of data from legal documents, social media, Dark Web sites, and other sources to curb privacy violations. Yet considerable efforts are still required for understanding prevailing data sources, systematically developing AI-enabled privacy analytics to tackle emerging challenges, and deploying systems to address critical privacy needs. To this end, we provide an overview of prevailing data sources that can support AI-enabled privacy analytics; a multi-disciplinary research framework that connects data, algorithms, and systems to tackle emerging AI-enabled privacy analytics challenges such as entity resolution, privacy assistance systems, privacy risk modeling, and more; a summary of selected funding sources to support high-impact privacy analytics research; and an overview of prevailing conference and journal venues that can be leveraged to share and archive privacy analytics research. We conclude this paper with an introduction to the papers included in this special issue.
  4. Flexible systems that can conform to any shape are desirable for wearable applications. Over the past decade, there have been tremendous advances in the domain of flexible electronics, which have enabled the printing of devices, such as sensors, on flexible substrates. Despite these advances, purely flexible electronic systems are limited by poor performance and large feature sizes. Flexible hybrid electronics (FHE) is an emerging technology that addresses these issues by integrating high-performance rigid integrated circuits with flexible devices. Yet there are no system-level design flows or algorithms for the design of FHE systems. To this end, this paper presents a multi-objective design algorithm to implement a target application optimally using a library of rigid and flexible components. Our algorithm produces a set of Pareto frontiers that optimize the physical flexibility, energy per operation, and area metrics. Simulation studies show a 32× range in area and a 4× range in flexibility across the set of Pareto-optimal design points. (A minimal Pareto-frontier sketch appears after this list.)
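The trend-estimation pipeline described in item 2 (smooth each species' time series, take the first derivative of the fitted smooth, then cluster the derivative trajectories) can be illustrated with a short sketch. This is not the authors' code: it uses synthetic abundance indices, a smoothing spline as a stand-in for their Bayesian-hierarchical-plus-GAM estimates, and k-means in place of their non-parametric clustering algorithm.

```python
# Illustrative sketch only: synthetic data, a smoothing spline standing in
# for the paper's Bayesian-hierarchical + GAM estimates, and k-means in
# place of its non-parametric clustering. Requires numpy, scipy, sklearn.
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
years = np.arange(1919, 2019)          # the CBC window used in the study
t = (years - years[0]) / (years[-1] - years[0])

# Hypothetical abundance indices for 300 "species", each with its own
# slowly varying trend plus observation noise.
series = np.exp(np.array([
    rng.normal() * np.sin(2 * np.pi * rng.uniform(0.5, 1.5) * t)
    + rng.normal(0.0, 0.1, t.size)
    for _ in range(300)
]))

# Smooth each log-index series and extract the first derivative of the
# fitted curve -- the quantity the study uses to measure how the rate of
# population change shifts over time (temporal non-stationarity).
derivs = np.array([
    UnivariateSpline(years, np.log(y), s=len(years)).derivative()(years)
    for y in series
])

# Group derivative trajectories into four trend clusters, mirroring the
# four clusters reported in the abstract (k-means is a simple proxy).
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(derivs)
print("cluster sizes:", np.bincount(labels))
```

Clustering the derivative trajectories, rather than the raw indices, groups species by how their rate of change evolves, which is what makes shifts in trend direction and magnitude visible.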
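The multi-objective design-space exploration in item 4 reduces, at its core, to extracting a Pareto frontier over candidate designs scored on flexibility, energy per operation, and area. The sketch below is a generic dominance filter over hypothetical design points, not the paper's algorithm; all component values are made up, and flexibility is negated so every objective is minimized.

```python
# Illustrative sketch only: a generic Pareto dominance filter over
# hypothetical FHE design points; component values and ranges are made up.
import numpy as np

rng = np.random.default_rng(1)

# Each row scores one candidate rigid/flexible component mix on the three
# metrics named in the abstract. Flexibility is negated so that every
# column is minimized.
designs = np.column_stack([
    -rng.uniform(0.0, 1.0, 200),   # physical flexibility (maximize)
    rng.uniform(1.0, 10.0, 200),   # energy per operation (minimize)
    rng.uniform(1.0, 32.0, 200),   # area (minimize)
])

def pareto_front(points):
    """Indices of non-dominated rows, all objectives to be minimized."""
    keep = []
    for i, p in enumerate(points):
        # p is dominated if some point is <= p everywhere and < p somewhere.
        dominated = np.any(
            np.all(points <= p, axis=1) & np.any(points < p, axis=1)
        )
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = pareto_front(designs)
print(f"{front.size} Pareto-optimal designs out of {len(designs)}")
```

The surviving rows form the kind of Pareto set the paper reports, from which a designer picks a point matching the wearable application's priorities (e.g., trading area for flexibility).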