As the construction workforce ages, understanding age-related differences in the adoption and use of wearable devices is crucial, particularly because privacy concerns pose significant barriers to implementation. While previous studies have addressed challenges older workers experience in construction, limited research has explored how wearable technologies specifically affect workers across different age groups. This study investigates older and younger construction workers' attitudes, perceptions, and interactions with employer-provided wearable devices. A comprehensive survey examined key factors such as privacy concerns, data ownership issues, the privacy of mental health-related data, and the effects of different feedback methods on safety and performance. The findings reveal that while both older and younger workers generally hold positive attitudes towards wearable devices, older workers exhibit significantly higher privacy concerns, particularly regarding the collection of mental health-related data, which can lead to psychological resistance. Additionally, both groups identified visual feedback as the most distracting, negatively impacting safety and performance, while haptic feedback emerged as the preferred method, with the least negative effect. These insights emphasize the need for tailored strategies in designing and implementing wearable devices to address the distinct preferences and concerns of diverse age groups, ultimately improving safety and usability in high-risk construction environments.
Accounting for Privacy Pluralism: Lessons and Strategies from Community-Based Privacy Groups
The emergent, dynamic nature of privacy concerns in a shifting sociotechnical landscape creates a constant need for privacy-related resources and education. One response to this need is community-based privacy groups. We studied privacy groups that host meetings in diverse urban communities and interviewed the meeting organizers to see how they grapple with potentially varied and changeable privacy concerns. Our analysis identified three features of how privacy groups are organized to serve diverse constituencies: situating (finding the right venue for meetings), structuring (finding the right format/content for the meeting), and providing support (offering varied dimensions of assistance). We use these findings to inform a discussion of "privacy pluralism" as a perennial challenge for the HCI privacy research community, and we use the practices of privacy groups as an anchor for reflection on research practices.
- Award ID(s): 1814909
- PAR ID: 10485021
- Publisher / Repository: ACM
- Date Published:
- ISSN: 1062-9432
- ISBN: 9781450394215
- Page Range / eLocation ID: 1 to 12
- Format(s): Medium: X
- Location: Hamburg, Germany
- Sponsoring Org: National Science Foundation
More Like this
- Inspired by earlier academic research, iOS app privacy labels and the recent Google Play data safety labels have been introduced as a way to systematically present users with concise summaries of an app's data practices. Yet, little research has been conducted to determine how well today's mobile app privacy labels address people's actual privacy concerns or questions. We analyze a crowd-sourced corpus of privacy questions collected from mobile app users to determine to what extent these mobile app labels actually address users' privacy concerns and questions. While there are differences between iOS labels and Google Play labels, our results indicate that an important percentage of people's privacy questions are not answered or only partially addressed in today's labels. Findings from this work not only shed light on the additional fields that would need to be included in mobile app privacy labels but can also help inform refinements to existing labels to better address users' typical privacy questions.
- People who are marginalized experience disproportionate harms when their privacy is violated. Meeting their needs is vital for developing equitable and privacy-protective technologies. In response, research at the intersection of privacy and marginalization has acquired newfound urgency in the HCI and social computing community. In this literature review, we set out to understand how researchers have investigated this area of study. What topics have been examined, and how? What are the key findings and recommendations? And, crucially, where do we go from here? Based on a review of papers on privacy and marginalization published between 2010-2020 across HCI, Communication, and Privacy-focused venues, we make three main contributions: (1) we identify key themes in existing work and introduce the Privacy Responses and Costs framework to describe the tensions around protecting privacy in marginalized contexts, (2) we identify understudied research topics (e.g., race) and other avenues for future work, and (3) we characterize trends in research practices, including the under-reporting of important methodological choices, and provide suggestions to establish shared best practices for this growing research area.
- The immersive nature of Virtual Reality (VR) and its reliance on sensory devices like head-mounted displays introduce privacy risks to users. While earlier research has explored users' privacy concerns within VR environments, less is known about users' comprehension of VR data practices and protective behaviors; the expanding VR market and technological progress also necessitate a fresh evaluation. We conducted semi-structured interviews with 20 VR users, showing their diverse perceptions regarding the types of data collected and their intended purposes. We observed privacy concerns in three dimensions: institutional, social, and device-specific. Our participants sought to protect their privacy through considerations when selecting the device, scrutinizing VR apps, and selective engagement in different VR interactions. We contrast our findings with observations from other technologies and ecosystems, shedding light on how VR has altered the privacy landscape for end-users. We further offer recommendations to alleviate users' privacy concerns, rectify misunderstandings, and encourage the adoption of privacy-conscious behaviors.
- The General Data Protection Regulation (GDPR) in the European Union contains directions on how user data may be collected and stored, and on when it must be deleted. As similar legislation is developed around the globe, there is the potential for repercussions across multiple fields of research, including educational data mining (EDM). Over the past two decades, the EDM community has taken consistent steps to protect learner privacy within our research, whilst pursuing goals that will benefit their learning. However, recent privacy legislation may require our practices to change. The right to be forgotten states that users have the right to request that all their data (including deidentified data generated by them) be removed. In this paper, we discuss the potential challenges of this legislation for EDM research, including impacts on Open Science practices, data modeling, and data sharing. We also consider changes to EDM best practices that may aid compliance with this new legislation.