In recent years, well-known cyber breaches have placed growing pressure on organizations to implement proper privacy and data protection standards. Attacks involving the theft of employee and customer personal information have damaged the reputations of well-known brands, resulting in significant financial costs. As a result, governments across the globe are actively examining and strengthening laws to better protect the personal data of their citizens. The General Data Protection Regulation (GDPR) updates European privacy law with an array of provisions that better protect consumers and require organizations to account for privacy in their business processes through “privacy-by-design” and “privacy-by-default” principles. In the US, the National Privacy Research Strategy (NPRS) makes several recommendations that reinforce the need for organizations to better protect data. In response to these rapid developments in privacy compliance, data flow mapping has emerged as a valuable tool. A data flow map depicts the flow of data through a system or process, enumerating the specific data elements handled and identifying the risks at different stages of the data lifecycle. This Article explains the critical features of a data flow map and discusses how mapping may improve the transparency of the data lifecycle, while recognizing the limitations of building out data flow maps and the difficulty of keeping them up to date. The Article then explores how data flow mapping may support data collection, transfer, storage, and destruction practices pursuant to various privacy regulations. Finally, a hypothetical case study shows how an organization used data flow mapping to stay compliant with privacy rules and to improve the transparency of its information flows.
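A data flow map of the kind described above can be sketched as a small graph structure: nodes are lifecycle stages, edges are transfers, and each edge records the data elements that move and the risks identified at that point. The sketch below is a minimal illustration of that idea in Python; the stage names, data elements, and risk labels are assumptions for the example, not terms defined by the Article.

```python
from dataclasses import dataclass, field

# Minimal sketch of a data flow map. All stage/element/risk names
# below are illustrative assumptions, not the Article's own terms.

@dataclass
class Flow:
    source: str            # lifecycle stage the data leaves
    target: str            # lifecycle stage the data enters
    data_elements: list    # e.g. ["email", "purchase_history"]
    risks: list = field(default_factory=list)

@dataclass
class DataFlowMap:
    flows: list = field(default_factory=list)

    def add_flow(self, flow: Flow) -> None:
        self.flows.append(flow)

    def risks_for_element(self, element: str) -> list:
        """List every (source, target, risk) triple touching one data element."""
        return [
            (f.source, f.target, r)
            for f in self.flows
            if element in f.data_elements
            for r in f.risks
        ]

# Hypothetical usage: trace one personal-data element across stages.
dfm = DataFlowMap()
dfm.add_flow(Flow("collection", "storage", ["email"], ["unencrypted transfer"]))
dfm.add_flow(Flow("storage", "destruction", ["email"], ["incomplete deletion"]))
print(dfm.risks_for_element("email"))
```

Querying the map per data element, as in `risks_for_element`, mirrors how such maps are meant to support compliance work: given one category of personal data, enumerate every place it moves and every risk attached to that movement.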
Understanding civic and non-profit data through a custom data lifecycle
This report details our experience creating a graphic to help track how data flows through our organization, DataWorks. DataWorks specializes in data cleaning and standardization services for civic and non-profit organizations, while simultaneously functioning as a work-training program through which the data wranglers receive both training and a competitive hourly wage. As a result, the way data moves through DataWorks looks different than it does in more traditional data clearinghouses, as those organizations often cover every step of the traditional data lifecycle. By recounting the efforts of our data wranglers and researchers, assisted by a design student, to create the data lifecycle graphic, we describe the organization-specific properties of this data flow and theorize how it might apply to other organizations that assist with initial “datafication” and ongoing data maintenance.
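The contrast the report draws, an organization that concentrates on a few lifecycle stages rather than all of them, can be pictured as a subset of a traditional lifecycle. The short sketch below illustrates that framing; the stage names are assumptions for illustration and are not the stages the report's actual graphic uses.

```python
# Rough sketch: a custom data lifecycle as a subset of traditional
# stages. Stage names are illustrative assumptions, not DataWorks's
# published graphic.

TRADITIONAL_LIFECYCLE = [
    "plan", "collect", "clean", "standardize",
    "analyze", "publish", "archive",
]

# Per the abstract, DataWorks concentrates on cleaning and
# standardization rather than every traditional stage.
DATAWORKS_FOCUS = ["clean", "standardize"]

def coverage(custom: list, traditional: list) -> dict:
    """Mark which traditional stages a custom lifecycle touches."""
    return {stage: stage in custom for stage in traditional}

print(coverage(DATAWORKS_FOCUS, TRADITIONAL_LIFECYCLE))
```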
- Award ID(s): 1951818
- PAR ID: 10357242
- Date Published:
- Journal Name: Investigating Data Work Across Domains: New Perspectives on the Work of Creating Data workshop at CHI 2022
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Pacific evidence-based clinical and translational research is greatly needed. However, there are research challenges that stem from the creation, accessibility, availability, usability, and compliance of data in the Pacific. As a result, there is a growing demand for a complementary approach to the traditional Western research process in clinical and translational research. The data lifecycle is one such approach with a history of use in various other disciplines. It was designed as a data management tool with a set of activities that guide researchers and organizations on the creation, management, usage, and distribution of data. This manuscript describes the data lifecycle and its use by the Biostatistics, Epidemiology, and Research Design core data science team in support of the Center for Pacific Innovations, Knowledge, and Opportunities program.
- Compliance reviews within a software organization are internal attempts to verify regulatory and security requirements during product development before its release. However, these reviews are not enough to adequately assess and address regulatory and security requirements throughout a software’s development lifecycle. We believe requirements engineers can benefit from an improved understanding of how software practitioners treat and perceive compliance requirements. This paper describes an interview study seeking to understand how regulatory and security standard requirements are addressed, how burdensome they may be for businesses, and how our participants perceived them in the software development lifecycle. We interviewed 15 software practitioners from 13 organizations with different roles in the software development process and working in various industry domains, including big tech, healthcare, data analysis, finance, and small businesses. Our findings suggest that, for our participants, the software release process is the ultimate focus for regulatory and security compliance reviews. Also, most participants suggested that having a defined process for addressing compliance requirements was freeing rather than burdensome. Finally, participants generally saw compliance requirements as an investment for both employees and customers. These findings may be unintuitive, and we discuss seven lessons this work may hold for requirements engineering.
- This study examines the impacts of an organization’s cybersecurity training program on employees, using qualitative data collected from 33 college students who were attending Norfolk State University while also working on either a part-time or full-time basis. Open-ended questions were asked to elicit participants’ perspectives on cybersecurity training and cybersecurity protocols in organizations. Using the qualitative data analysis software NVivo 12, the authors organized and analyzed the collected data with open coding and selective coding to identify the major impacts of cybersecurity training on employees’ routine work and behavior. Inductive and grounded theory analysis further elaborated the connections between employee cybersecurity training and organizational efficiency. Our findings suggest that on-the-job cybersecurity training provided by the employer is an effective investment for modern organizations to build organizational human capital and, consequently, to improve the efficiency of the organization. Findings from this study corroborate the tenet of human capital theory that on-the-job educational programs and training are economical and effective ways to manage the human capital challenge for modern organizations.
- A good performance monitoring system is crucial to knowing whether an organization's efforts are making its data capabilities better, the same, or worse. However, comprehensive performance measurements are costly. Organizations need to expend time, resources, and personnel to design the metrics, to gather evidence for the metrics, to assess the metrics' value, and to determine if any actions should be taken as a result of those metrics. Consequently, organizations need to be strategic in selecting their portfolio of performance indicators for evaluating how well their data initiatives are producing value for the organization. This paper proposes a balanced scorecard approach to aid organizations in designing a set of meaningful and coordinated metrics for maximizing the potential of their data assets, as in the sketch below. It also discusses implementation challenges and the need for further research in this area.