Title: Immersive Search: Interactive Information Retrieval in Three-Dimensional Space
Researchers in interactive information retrieval (IIR) have studied and refined 2D presentations of search results for years. Recent advances are bringing augmented reality (AR) and virtual reality (VR) to real-world systems, yet the IIR community has done relatively little work to explore and understand 3D presentations of search results, the effects of immersive environments, and the impact of spatial cognition and of different spatial arrangements of result displays in 3D. In the research proposed here, I outline my plan to use immersive environments to investigate how users’ spatial cognition may influence the information retrieval process. Specifically, this work will observe how spatial arrangements of search results affect users’ ability to find information in the post-query, visual search phase of the IIR process across quantitative and qualitative measures.
Award ID(s):
1718295
NSF-PAR ID:
10188541
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the 2020 Conference on Human Information Interaction and Retrieval
Page Range / eLocation ID:
503 to 506
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. In this paper, we present results from an exploratory study to investigate users’ behaviors and preferences for three different styles of search results presentation in a virtual reality (VR) head-mounted display (HMD). Prior work in 2D displays has suggested possible benefits of presenting information in ways that exploit users’ spatial cognition abilities. We designed a VR system that displays search results in three different spatial arrangements: a list of 8 results, a 4x5 grid, and a 2x10 arc. These spatial display conditions were designed to differ in terms of the number of results displayed per page (8 vs 20) and the amount of head movement required to scan the results (list < grid < arc). Thirty-six participants completed 6 search trials in each display condition (18 total). For each trial, the participant was presented with a display of search results and asked to find a given target result or to indicate that the target was not present. We collected data about users’ behaviors with, and perceptions of, the three display conditions using interaction data, questionnaires, and interviews. We explore the effects of display condition and target presence on behavioral measures (e.g., completion time, head movement, paging events, accuracy) and on users’ perceptions (e.g., workload, ease of use, comfort, confidence, difficulty, and lostness). Our results suggest that there was no difference in accuracy among the display conditions, but that users completed tasks more quickly using the arc. However, users also expressed lower preferences for the arc, instead preferring the list and grid displays. Our findings extend prior research on visual search into the area of 3-dimensional result displays for interactive information retrieval in VR HMD environments.
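As a rough geometric illustration of the three spatial arrangements described above, the sketch below places each result on a sphere around the viewer. The radius and angular spacings are assumptions chosen for illustration, not parameters reported in the study.

```python
import math

def angles_to_point(yaw_deg, pitch_deg, radius=2.0):
    """Convert viewing angles (degrees) to a 3D point on a sphere around the viewer."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    x = radius * math.cos(pitch) * math.sin(yaw)   # right
    y = radius * math.sin(pitch)                   # up
    z = -radius * math.cos(pitch) * math.cos(yaw)  # forward
    return (x, y, z)

def layout(rows, cols, h_step_deg, v_step_deg):
    """Place a rows x cols page of results, centered in the viewer's field of view."""
    positions = []
    for r in range(rows):
        for c in range(cols):
            yaw = (c - (cols - 1) / 2) * h_step_deg      # horizontal offset from center
            pitch = ((rows - 1) / 2 - r) * v_step_deg    # vertical offset from center
            positions.append(angles_to_point(yaw, pitch))
    return positions

# Hypothetical angular spacings: the arc spans the widest horizontal angle,
# the grid less, and the single-column list none (list < grid < arc head movement).
list_8   = layout(rows=8, cols=1,  h_step_deg=0,  v_step_deg=7)
grid_4x5 = layout(rows=4, cols=5,  h_step_deg=12, v_step_deg=9)
arc_2x10 = layout(rows=2, cols=10, h_step_deg=14, v_step_deg=9)
```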
  2. Search tasks play an important role in the study and development of interactive information retrieval (IIR) systems. Prior work has examined how search tasks vary along dimensions such as the task’s main activity, end goal, structure, and complexity. Recently, researchers have been exploring task complexity from the perspective of cognitive complexity—related to the types (and variety) of mental activities required by the task. Anderson & Krathwohl’s two-dimensional taxonomy of learning has been a commonly used framework for investigating tasks from the perspective of cognitive complexity [1]. A&K’s 2D taxonomy involves a cognitive process dimension and an orthogonal knowledge dimension. Prior IIR research has successfully leveraged the cognitive process dimension of this 2D taxonomy to develop search tasks and investigate their effects on searchers’ needs, perceptions, and behaviors. However, the knowledge dimension of the taxonomy has been largely ignored. In this conceptual paper, we argue that future IIR research should consider both dimensions of A&K’s taxonomy. Specifically, we discuss related work, present details on both dimensions of A&K’s taxonomy, and explain how to use the taxonomy to develop search tasks and learning assessment materials. Additionally, we discuss how considering both dimensions of A&K’s taxonomy has important implications for future IIR research. 
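As a minimal illustration of how the two dimensions of A&K's taxonomy cross, the sketch below encodes the standard levels of each dimension and locates one task in a cell of the resulting grid; the example search task is invented for illustration.

```python
# Anderson & Krathwohl's revised taxonomy crosses a cognitive process dimension
# with a knowledge dimension; a task can be located in one cell of the grid.
COGNITIVE_PROCESSES = ["remember", "understand", "apply", "analyze", "evaluate", "create"]
KNOWLEDGE_TYPES = ["factual", "conceptual", "procedural", "metacognitive"]

def classify_task(process, knowledge):
    """Return the (process, knowledge) cell for a task, validating both dimensions."""
    if process not in COGNITIVE_PROCESSES or knowledge not in KNOWLEDGE_TYPES:
        raise ValueError("unknown taxonomy level")
    return (process, knowledge)

# Hypothetical search task: "Compare two retrieval approaches and judge which suits
# a given scenario" -- an 'evaluate' process applied to 'conceptual' knowledge.
task_cell = classify_task("evaluate", "conceptual")
```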
  3. Immersive technologies such as Virtual Reality (VR) and Augmented Reality (AR) have become major technological innovations with a significant impact on human life. While VR is an enclosed environment separated completely from the real world, AR allows users to merge the digital and physical worlds and to interact across them. The wide use of AR has led researchers to investigate its potential in several areas, including STEM-related fields. Previous research shows that AR-assisted courses tend to enhance students’ learning and spatial cognition and to increase their motivation and engagement in the learning process. In this study, the researchers have developed an AR application to assist students with spatial cognition and remote course engagement independently. The ARCADE tool not only enables students to visualize the isometric product from its orthogonal views, but also provides short tutorial clips on how a specific feature was developed and which tools were used. Students can perform basic operations on the 3D part in ARCADE, such as section views, detail views, and scaling, rotating, and exploding the assembly views. Although this project is a work in progress, preliminary pretest and posttest results show a significant improvement in students’ spatial cognition when the proposed tool is used to assist the course.
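As a hedged sketch of how pretest/posttest spatial-cognition scores might be compared, the snippet below runs a paired-samples t-test on made-up scores (SciPy assumed available); it does not reproduce the actual ARCADE data or analysis.

```python
from scipy import stats

# Made-up pretest/posttest spatial-cognition scores for the same group of students;
# the actual ARCADE data are not reproduced here.
pretest  = [12, 15,  9, 14, 11, 13, 10, 16]
posttest = [16, 18, 13, 17, 15, 16, 12, 19]

t_stat, p_value = stats.ttest_rel(posttest, pretest)  # paired-samples t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```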
  4. Annotation in 3D user interfaces such as Augmented Reality (AR) and Virtual Reality (VR) is a challenging and promising area; however, no current surveys review these contributions. To provide a survey of annotation for Extended Reality (XR) environments, we conducted a structured literature review of papers that used annotation in their AR/VR systems between 2001 and 2021. Our review process consisted of several filtering steps that resulted in 103 XR publications with a focus on annotation. We classified these papers based on display technologies, input devices, annotation types, the target object under annotation, collaboration type, modalities, and collaborative technologies. A survey of annotation in XR is an invaluable resource for researchers and newcomers. Finally, we provide a database of the collected information for each reviewed paper. This information includes the applications, display technologies and their annotators, input devices, modalities, annotation types, interaction techniques, collaboration types, and tasks for each paper. The database provides rapid access to the collected data and lets users search or filter the required information. This survey provides a starting point for anyone interested in researching annotation in XR environments.
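As a minimal sketch of how such a database of reviewed papers could be searched or filtered, the snippet below filters a list of paper records by classification fields; the field names and example values are illustrative assumptions, not the survey's actual schema.

```python
# Each reviewed paper is represented as a record carrying the survey's classification
# fields. Field names and values here are illustrative placeholders.
papers = [
    {"title": "Paper A", "display": "AR HMD", "input": "controller",
     "annotation_type": "text", "collaboration": "co-located"},
    {"title": "Paper B", "display": "VR HMD", "input": "hand tracking",
     "annotation_type": "drawing", "collaboration": "remote"},
]

def filter_papers(records, **criteria):
    """Return the records matching every given field=value pair."""
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

vr_drawing = filter_papers(papers, display="VR HMD", annotation_type="drawing")
```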
  5. In this paper, we demonstrate the Information Interactions in Virtual Reality (IIVR) system, designed and implemented to study how users interact with abstract information objects in immersive virtual environments in the context of information retrieval. Virtual reality displays are quickly growing as social and personal computing media, and understanding user interactions in these immersive environments is imperative. As a step towards effective information retrieval on such emerging platforms, our system is central to upcoming studies that will observe how users engage in information triaging tasks in Virtual Reality (VR). In these studies, we will observe the effects of (1) information layouts and (2) types of interactions in VR. We believe this early system will motivate researchers to understand and design meaningful interactions for future VR information retrieval applications.
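As a small sketch of how the two planned factors could be crossed into study conditions, the snippet below enumerates a full factorial design; the specific layout and interaction levels are placeholders, not the ones used in IIVR.

```python
from itertools import product

# Placeholder levels for the two factors the upcoming studies will vary.
layouts = ["planar", "cylindrical", "spherical"]           # hypothetical information layouts
interactions = ["gaze-and-dwell", "controller pointing"]   # hypothetical interaction types

# Full factorial crossing of the two factors into study conditions.
conditions = list(product(layouts, interactions))
for i, (layout, interaction) in enumerate(conditions, start=1):
    print(f"Condition {i}: layout={layout}, interaction={interaction}")
```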