

Search for: All records

Creators/Authors contains: "Joshaghani, Rezvan"


  1. Internet usage continues to increase among children ages 12 and younger. Because their digital interactions can be persistently stored, there is a need to build an understanding and foundational knowledge of privacy. We describe initial investigations into children's understanding of privacy from a Contextual Integrity (CI) perspective, conducted through semi-structured interviews. Our results, which echo what others have shown, indicate that children have limited knowledge and understanding of CI principles. We also share an initial exploration of utilizing participatory design theater as a possible educational mechanism to help children develop a stronger understanding of important privacy principles.
  2. As our society has become more information oriented, each individual is expressed, defined, and impacted by information and information technology. While valuable, the current state of the art is mostly designed to protect enterprise or organizational privacy requirements and leaves the main actor, i.e., the user, uninvolved or with limited ability to control his or her information-sharing practices. To overcome these limitations, algorithms and tools are needed that provide a user-centric privacy management system to individuals with different privacy concerns and that take into consideration the dynamic nature of privacy policies, which constantly change based on the information-sharing context and environmental variables. This paper extends the concept of contextual integrity to provide mathematical models and algorithms that enable the creation and management of privacy norms for individual users. The extension augments privacy norms with environmental variables, e.g., time and date, while introducing an abstraction and a partial relation over information attributes. Further, a formal verification technique is proposed to ensure privacy norms are enforced for each information-sharing action. (An illustrative sketch of such an augmented norm appears after this list.)
  3. With the growth of the Internet into many different aspects of life, users are required to share private information more than ever. Hence, users need a privacy management tool that can enforce complex and customized privacy policies. In this paper, we propose a privacy management system that not only allows users to define complex privacy policies for data-sharing actions, but also monitors users' behavior and relationships to generate realistic policies. In addition, the proposed system utilizes a formal modeling and model-checking approach to prove that information disclosures are valid and that privacy policies are consistent with one another. (An illustrative consistency check appears after this list.)
  4. In this position paper, we argue for applying recent research on ensuring that sociotechnical systems are fair and non-discriminatory to the privacy protections those systems may provide. The privacy literature seldom considers whether a proposed privacy scheme protects all persons uniformly, irrespective of membership in protected classes or particular risk in the face of privacy failure. Just as algorithmic decision-making systems may have discriminatory outcomes even without explicit or deliberate discrimination, so also privacy regimes may disproportionately fail to protect vulnerable members of their target population, resulting in disparate impact with respect to the effectiveness of privacy protections. We propose a research agenda that will illuminate this issue, along with related issues at the intersection of fairness and privacy, and present case studies that show how the outcomes of this research may change existing privacy and fairness research. We believe it is important to ensure that technologies and policies intended to protect the users and subjects of information systems provide such protection in an equitable fashion.
  5. Personalized systems increasingly employ Privacy Enhancing Technologies (PETs) to protect the identity of their users. In this paper, we are interested in whether the cost-benefit tradeoff — the underlying economics of the privacy calculus — is fairly distributed, or whether some groups of people experience a lower return on investment for their privacy decisions. 
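
The sketch below is a minimal, hypothetical illustration of the kind of norm described in item 2: a contextual-integrity-style privacy norm augmented with an environmental variable (here, a time-of-day window), together with a check that a proposed information-sharing action is covered by at least one norm. The Norm class, the action_permitted function, and all field names are assumptions made for illustration; the paper's abstraction and partial relation over information attributes, and its formal verification technique, are not reproduced here.

    # Hypothetical sketch (not the paper's formalism): a privacy norm with an
    # environmental time window, and a check of a single sharing action.
    from dataclasses import dataclass
    from datetime import time

    @dataclass(frozen=True)
    class Norm:
        sender: str                          # whose information flows
        recipient: str                       # who may receive it
        attribute: str                       # which information attribute, e.g. "location"
        allowed_from: time = time(0, 0)      # environmental variable: window start
        allowed_until: time = time(23, 59)   # environmental variable: window end

    def action_permitted(norms, sender, recipient, attribute, at):
        """True if some norm covers this (sender, recipient, attribute) triple
        and the environmental condition (the time window) is satisfied."""
        return any(
            n.sender == sender
            and n.recipient == recipient
            and n.attribute == attribute
            and n.allowed_from <= at <= n.allowed_until
            for n in norms
        )

    if __name__ == "__main__":
        norms = [Norm("alice", "fitness-app", "heart-rate",
                      allowed_from=time(8, 0), allowed_until=time(20, 0))]
        print(action_permitted(norms, "alice", "fitness-app", "heart-rate", time(9, 30)))  # True
        print(action_permitted(norms, "alice", "ad-network", "heart-rate", time(9, 30)))   # False

Any action not explicitly covered by a norm is rejected, mirroring a default-deny reading of contextual integrity; a full implementation would also need the attribute abstraction and formal enforcement the paper describes.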
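
Similarly, the sketch below loosely illustrates the consistency property mentioned in item 3: two policies are inconsistent if they govern the same disclosure but disagree on whether to allow it. The Policy class and find_conflicts function are hypothetical; the paper's actual approach relies on formal modeling and model checking, which this pairwise check does not reproduce.

    # Hypothetical sketch: flag pairs of policies that cover the same
    # (subject, recipient, attribute) disclosure but disagree on allow/deny.
    from dataclasses import dataclass
    from itertools import combinations

    @dataclass(frozen=True)
    class Policy:
        subject: str      # whose data is disclosed
        recipient: str    # who receives it
        attribute: str    # which attribute is disclosed
        allow: bool       # True permits the disclosure, False forbids it

    def find_conflicts(policies):
        """Return pairs of policies covering the same disclosure with opposite decisions."""
        conflicts = []
        for p, q in combinations(policies, 2):
            same_disclosure = (p.subject, p.recipient, p.attribute) == (q.subject, q.recipient, q.attribute)
            if same_disclosure and p.allow != q.allow:
                conflicts.append((p, q))
        return conflicts

    if __name__ == "__main__":
        policies = [
            Policy("alice", "employer", "location", allow=False),
            Policy("alice", "employer", "location", allow=True),   # contradicts the policy above
            Policy("alice", "doctor", "medical-record", allow=True),
        ]
        for p, q in find_conflicts(policies):
            print("Inconsistent pair:", p, "vs", q)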