- Street view imagery databases such as Google Street View, Mapillary, and Karta View provide broad spatial and temporal coverage for many cities globally. These data, when coupled with appropriate computer vision algorithms, can provide an effective means to analyse aspects of the urban environment at scale. In an effort to enhance current practices in urban flood risk assessment, this project investigates the potential use of street view imagery data to identify building features that indicate buildings' vulnerability to flooding (e.g., basements and semi-basements). In particular, this paper discusses (1) building features indicating the presence of basement structures, (2) available imagery data sources capturing those features, and (3) computer vision algorithms capable of automatically detecting the features of interest. The paper also reviews existing methods for reconstructing geometry representations of the extracted features from images and potential approaches to account for data quality issues. Preliminary experiments confirmed the usability of the freely available Mapillary images for detecting basement railings, as an example type of basement feature, and for geolocating the detected features.
- Introduction: Without community-based data-aggregation tools, timely and meaningful local input into brownfield management is not tenable, as relevant data are dispersed and often incomplete. In response, this project lays the groundwork through which constructive dialogue between community members and local officials can be facilitated. Materials and methods: A Brownfield Engagement Tool (BET) is envisioned as a means by which non-experts can use disparately held open data streams to collect, analyse, and visualise brownfield site data, better understand aggregate health risks, and provide direct input into remediation and redevelopment decisions. By raising awareness and providing knowledge about brownfield-related issues, the BET is intended to encourage community member participation in public debate. This concept is demonstrated for a 113-hectare Brooklyn, New York neighbourhood with a long history of industrial and mixed-use development resulting in 18 brownfields. The proposed remediation prioritization strategy offers a systematic analysis of the sites' size, contaminants, proximity to gathering spots, and demographics. Results: The BET proposed in this paper offers a novel approach for community-based management of brownfields, conducted at the census tract level and based on the factors that most affect the local community. By combining publicly available municipal, state, and federal data in the BET, a set of easy-to-understand metrics can be generated through which a community can compare and rank existing brownfields to prioritize future interventions; these metrics can also support raising funding and investments to address neighbourhood issues. This approach is the first of its kind with respect to brownfield redevelopment.
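The abstract lists the ranking factors (site size, contaminants, proximity to gathering spots) but not the scoring formula itself. The sketch below shows one plausible way such factors could be normalized and combined into a priority score; the weights, field names, and linear weighting scheme are assumptions for illustration only, not the BET's actual method.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    area_ha: float               # site size in hectares
    n_contaminants: int          # number of known contaminants
    dist_to_gathering_m: float   # distance to nearest gathering spot (metres)

def priority_score(site, sites, w_size=0.4, w_contam=0.4, w_prox=0.2):
    """Higher score = higher remediation priority (weights are assumed)."""
    max_area = max(s.area_ha for s in sites)
    max_contam = max(s.n_contaminants for s in sites)
    max_dist = max(s.dist_to_gathering_m for s in sites)
    size_term = site.area_ha / max_area
    contam_term = site.n_contaminants / max_contam
    # Closer to gathering spots -> higher priority, hence the inversion
    prox_term = 1.0 - site.dist_to_gathering_m / max_dist
    return w_size * size_term + w_contam * contam_term + w_prox * prox_term

sites = [
    Site("A", 2.0, 5, 100.0),
    Site("B", 0.5, 2, 800.0),
]
ranked = sorted(sites, key=lambda s: priority_score(s, sites), reverse=True)
```

Normalizing each factor against the neighbourhood maximum keeps the metrics comparable across census tracts of different scales, which matches the abstract's emphasis on easy-to-understand, community-level comparisons.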
- This paper proposes a flood risk visualization method that is (1) readily transferable, (2) hyperlocal, (3) computationally inexpensive, and (4) geometrically accurate. The method is intended for risk communication, providing high-resolution, three-dimensional flood visualization at the sub-meter level. It couples a laser scanning point cloud with algorithms that produce textured floodwaters, achieved through compounding multiple sine functions in a graphics shader. This hyper-local approach to visualization is enhanced by the ability to portray changes in (i) water color, (ii) texture, and (iii) motion (including dynamic heights) for various flood prediction scenarios. By decoupling physics-based predictions from the visualization, a dynamic flood risk viewer was produced with modest processing resources: a single quad-core processor with a frequency around 4.30 GHz and no graphics card. The system offers several major advantages. (1) The approach can be used in a browser or with inexpensive virtual reality hardware and thus promotes local dissemination for flood risk communication, planning, and mitigation. (2) The approach can be used for any scenario where water interfaces with the built environment, including inside of pipes. (3) When tested for a coastal inundation scenario from a hurricane, 92% of the neighborhood participants found it more effective in communicating flood risk than the traditional 2D flood-warning maps provided by governmental authorities.
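The core texturing idea, compounding multiple sine functions to produce an animated water surface, can be sketched outside a shader as plain arithmetic. The wave parameters below (amplitudes, frequencies, directions, phase speeds) are illustrative placeholders, not the values used in the paper; in a real graphics shader the same sum would run per fragment or per vertex.

```python
import math

def water_height(x, z, t):
    """Compound several sine waves of differing direction, frequency,
    amplitude, and phase speed into a textured water-surface height.
    Wave parameters are illustrative only."""
    waves = [
        # (amplitude, frequency, phase_speed, direction_x, direction_z)
        (0.20, 1.0, 1.2, 1.0, 0.0),
        (0.10, 2.3, 1.7, 0.6, 0.8),
        (0.05, 4.1, 2.5, -0.7, 0.7),
    ]
    h = 0.0
    for amp, freq, speed, dx, dz in waves:
        h += amp * math.sin(freq * (dx * x + dz * z) + speed * t)
    return h

# Sample heights over a small grid at time t = 0.5 s
grid = [[water_height(x * 0.5, z * 0.5, 0.5) for x in range(4)] for z in range(4)]
```

Because the displacement is a closed-form function of position and time, no physics solver runs at render time, which is what allows the viewer described above to run without a graphics card.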
- There is increasing evidence that climate change will lead to greater and more frequent extreme weather events, underscoring the importance of effectively communicating the risks of record storm surges in coastal communities. This article reviews why risk communication often fails to convey the nature and risk of storm surge to the public and highlights the limitations of conventional (two-dimensional) storm surge flood maps. The research explores the potential of dynamic, street-level, augmented scenes to increase the tangibility of these risks and foster a greater sense of agency among the public. The study focused on Sunset Park, a coastal community in southwest Brooklyn that is vulnerable to storm surges and flooding. Two different representations of flooding corresponding to a category three hurricane scenario were prepared: (1) a conventional two-dimensional flood map ("2D" control group) and (2) a dynamic street view simulation ("3D" test group). The street view simulations were found to be (1) more effective in conveying the magnitude of flooding and evacuation challenges, (2) easier to use for judging flood water depth (even without a flood depth legend), (3) capable of generating stronger emotional responses, and (4) perceived as more authoritative in nature.
- Aerial images are a special class of remote sensing images, as they are intentionally collected with a high degree of overlap. This high degree of overlap complicates existing index strategies such as R-tree and space filling curve (SFC) based index techniques due to complications in space splitting, granularity of the grid cells, and excessive duplication of image object identifiers (IOIs). However, SFC-based space ordering can be modified to provide scalable management of overlapping aerial images. This involves overcoming similar IOIs in adjacent grid cells, which would naturally occur in SFC-based grids with such data. IOI duplication can be minimized by merging adjacent grid cells through the proposed "Designing Adjacent Cell Merge Algorithm" (DACMA). This work focuses on establishing a proper adjacent cell merge metric and merge percentage value. Using a highly scalable, distributed HBase cluster for both a single aerial mapping project and multiple aerial mapping projects, experiments evaluated the Jaccard Similarity (JS) and Percentage of Overlap (PO) merge metrics. JS had significant advantages: (i) generating smaller merged regions and (ii) reducing query response times by over 21% and 36%, respectively, compared to PO. As a result, JS is proposed as the merge metric for DACMA. For the merge percentage, two considerations were dominant: (i) substantial storage reductions with respect to both straightforward SFC-based cell space indexing and 4SA-based indexing, and (ii) minimal impact on query response time. The proposed merge percentage value, selected to optimize both storage (i.e., space) needs and response time (i.e., time), is herein named the "Space-Time Trade-off Optimization Percentage" (STOP) value.
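The merge idea can be illustrated with a small sketch: compute the Jaccard similarity of the IOI sets in adjacent cells and, when it meets a threshold, store one merged region instead of duplicating IOIs per cell. The greedy single-pass walk in SFC order and the threshold value are assumed simplifications for illustration, not DACMA's published procedure.

```python
def jaccard(a, b):
    """Jaccard similarity of two IOI sets: |A ∩ B| / |A ∪ B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def merge_adjacent(cells, threshold=0.8):
    """Walk grid cells in SFC order; merge a cell into the previous
    region when the JS of their IOI sets meets the threshold, so each
    IOI is stored once per merged region rather than once per cell."""
    merged = []
    for cell_id, iois in cells:
        if merged and jaccard(merged[-1][1], iois) >= threshold:
            merged[-1] = (merged[-1][0] + [cell_id], merged[-1][1] | iois)
        else:
            merged.append(([cell_id], set(iois)))
    return merged

cells = [
    (0, {"img1", "img2"}),
    (1, {"img1", "img2"}),          # identical IOIs -> merged with cell 0
    (2, {"img1", "img2", "img3"}),  # JS = 2/3 below threshold -> new region
]
regions = merge_adjacent(cells)
```

The threshold plays the role of the merge percentage discussed above: raising it produces smaller, more homogeneous regions (the JS behaviour the experiments favoured), while lowering it trades query precision for storage savings.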
- Current state-of-the-art point cloud data management (PCDM) systems rely on a variety of parallel architectures and diverse data models. The main objective of these implementations is achieving higher scalability without compromising performance. This paper reviews the scalability and performance of state-of-the-art PCDM systems with respect to both parallel architectures and data models. In terms of parallel architectures, shared-memory, shared-disk, and shared-nothing architectures are considered. In terms of data models, relational models, novel data models (such as wide-column models), and new structured query language (NewSQL) models are considered. The impacts of parallel architectures and data models are discussed from theoretical perspectives and in the context of existing PCDM implementations. Based on the review, a methodical approach for selecting parallel architectures and data models for highly scalable and performance-efficient PCDM system development is proposed. Finally, notable research gaps in the PCDM literature are presented as possible directions for future research.
- State-of-the-art scalable indexing techniques in location-based image data retrieval are primarily focused on supporting window and range queries. However, the support of these indexes is not well explored when there are multiple spatially similar images to retrieve for a given geographic location. Adoption of existing spatial indexes such as the kD-tree poses major scalability impediments. In response, this work proposes a novel, scalable, key-value-database-oriented, secondary-memory-based spatial index to retrieve the top k most spatially similar images to a given geographic location. The proposed index introduces a 4-dimensional Hilbert index (4DHI). This space filling curve is implemented atop HBase (a key-value database). Experiments performed on both synthetically generated and real-world data demonstrate accuracy comparable to MD-HBase (a state-of-the-art, scalable, multidimensional point data management system) and better performance. Specifically, 4DHI yielded 34%–39% storage improvements over the disk consumption of the original MD-HBase index. The compactness of 4DHI also yielded up to 3.4- and 4.7-fold gains when retrieving 6400 and 12800 neighbours, respectively, compared to adopting the original MD-HBase index for the same neighbour searches. An optimization technique termed "Bounding Box Displacement" (BBD) is introduced to improve the accuracy of the top k approximations relative to the results of an in-memory kD-tree. Finally, a method of reducing row key length is also discussed for the proposed 4DHI to further improve storage efficiency and scalability in managing large numbers of remotely sensed images.
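The general row-key idea behind such an index is to quantize a multidimensional coordinate onto a grid and linearize it with a space-filling curve, so a key-value store like HBase can range-scan spatial neighbourhoods. The sketch below uses Morton (Z-order) interleaving purely because it is compact enough to show; the paper's 4DHI uses a 4-dimensional Hilbert curve, which preserves locality better. The quantization ranges and hex key format are likewise illustrative assumptions.

```python
def morton_4d(a, b, c, d, bits=8):
    """Interleave four `bits`-bit coordinates into one integer key.
    NOTE: a simplified stand-in for the paper's 4D Hilbert curve."""
    key = 0
    for i in range(bits):
        key |= ((a >> i) & 1) << (4 * i)
        key |= ((b >> i) & 1) << (4 * i + 1)
        key |= ((c >> i) & 1) << (4 * i + 2)
        key |= ((d >> i) & 1) << (4 * i + 3)
    return key

def row_key(lat1, lon1, lat2, lon2, bits=8):
    """Quantize the two corners of an image's bounding box to grid
    cells and encode them as a single hex row key (illustrative)."""
    def q(v, lo, hi):
        return min(int((v - lo) / (hi - lo) * (1 << bits)), (1 << bits) - 1)
    k = morton_4d(q(lat1, -90, 90), q(lon1, -180, 180),
                  q(lat2, -90, 90), q(lon2, -180, 180), bits)
    return format(k, "08x")
```

Encoding all four bounding-box coordinates into one key is what makes the index 4-dimensional: images whose footprints are spatially similar map to nearby keys, so a top-k search becomes a short scan around the query key.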
- State-of-the-art remote sensing image management systems adopt scalable databases and employ sophisticated indexing techniques to perform window and containment queries. Many rely on space-filling curve (SFC) based index techniques designed for key-value databases and are predominantly employable for images that are iso-oriented. Critically, these indexes do not consider the high degree of overlap among images that exists in many data sets and the affiliated storage requirements. Specifically, employing an SFC-based grid cell index approach in consort with the ground footprint coverage of the images requires storing a unique image object identifier (IOI) for each image in every grid cell where overlap occurs. Such an approach adversely affects both storage and query response times. In response, this paper presents an optimization technique for SFC-based grid cell space indexing. The optimization is specifically designed for window and containment queries where the region of interest overlaps with at least a 2 × 2 grid of cells. The technique is based on four cell removal steps and is thus called the "four step algorithm" (4SA). Each step employs a unique spatial configuration to check for continuous spatial extent; if present, the IOI of the target cell is omitted from further consideration. Analysis and experiments on real-world and synthetic image data demonstrated that 4SA improved storage demands by 41.3%–47.8%. Furthermore, in the querying experiments, only 42% of IOI elements needed to be processed, yielding a 58% productivity gain. The reduction of IOI elements in querying also reduced CPU execution time by 3.0%–5.2%. The 4SA also demonstrated data scalability and concurrent-user scalability in querying large regions, completing index searches 1.86%–3.35% faster than when 4SA was not applied.
- Across coastal urban centres, underground spaces such as storage areas, transportation corridors, basement car parks, public facilities, retail and office spaces, and private spaces present a priority risk during flood events with respect to timely evacuation. However, these underground spaces are commonly not considered in urban flood prediction models, in many cases because their location and geometry are poorly known. In order to improve urban flood prediction models, various identified underground spaces have been included in the urban flood simulation presented in this paper. Here, the software MIKE+ is adopted to simulate coastal flood scenarios for the urban centre of Belfast, Northern Ireland. The simulation uses unstructured triangular grids. From the numerical simulation, urban flood depths and flooding rates into the underground spaces can be obtained. By comparing simulated urban flood scenarios with and without underground spaces, the impact of underground spaces on street-level inundation and flood routing is evaluated. The inclusion of underground space has a significant impact on the flood routing process. Moreover, because underground spaces present priority risk areas during flood events with respect to timely evacuation, they cannot be ignored in real urban flood prediction. The presented study can be used to increase communities' emergency preparedness and flood resilience.
An official website of the United States government