Social media companies wield power over their users through design, policy, and participation in public discourse. We set out to understand how companies leverage public relations to influence expectations of privacy and privacy-related norms. To interrogate companies' discourse production around privacy, we examine the blogs associated with three major social media platforms: Facebook, Instagram (both owned by Facebook Inc.), and Snapchat. We analyze privacy-related posts using critical discourse analysis to demonstrate how these powerful entities construct narratives about users and their privacy expectations. We find that each platform often makes use of discourse about "vulnerable" identities to invoke relations of power while, at the same time, advancing interpretations and values that favor data capitalism. Finally, we discuss how these public narratives might influence the construction of users' own interpretations of appropriate privacy norms and conceptions of self. We contend that expectations of privacy and social norms are not simply artifacts of users' own needs and desires, but co-constructions that reflect the influence of social media companies themselves.
"Privacy is not a concept, but a way of dealing with life": Localization of Transnational Technology Platforms and Liminal Privacy Practices in Cambodia
Privacy scholarship has shown how norms of appropriate information flow and information regulatory processes vary according to environment and change as that environment changes, including through the introduction of new technologies. This paper describes findings from a qualitative research study that examines practices and perceptions of privacy in Cambodia as the population rapidly moves into an online environment (specifically Facebook, the most popular Internet tool in Cambodia today). We empirically demonstrate how the concept of privacy differs across cultures and show how the Facebook platform, as it becomes popular worldwide, catalyzes change in norms of information regulation. We discuss how the localization of transnational technology platforms provides a key site in which to investigate changing cultural ideas about privacy, and to discover misalignments between different expectations for information flow. Finally, we explore ways that insufficient localization effort by transnational technology companies puts some of the most marginalized users at disproportionate information disclosure risk when using new Internet tools, and offer some pragmatic suggestions for how such companies could improve privacy tools for users who are far, geographically or culturally, from where the tools are designed.
- Award ID(s): 1748903
- NSF-PAR ID: 10410272
- Date Published:
- Journal Name: Proceedings of the ACM on Human-Computer Interaction
- Volume: 3
- Issue: CSCW
- ISSN: 2573-0142
- Page Range / eLocation ID: 1 to 19
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Furnell, Steven (Ed.) A huge amount of personal and sensitive data is shared on Facebook, which makes it a prime target for attackers. Adversaries can exploit third-party applications connected to a user's Facebook profile (i.e., Facebook apps) to gain access to this personal information. Users' lack of knowledge and the varying privacy policies of these apps make them further vulnerable to information leakage. However, little has been done to identify mismatches between users' perceptions and the privacy policies of Facebook apps. We address this challenge in our work. We conducted a lab study with 31 participants, where we collected data on how they share information on Facebook, their Facebook-related security and privacy practices, and their perceptions of the privacy aspects of 65 frequently used Facebook apps in terms of data collection, sharing, and deletion. We then compared participants' perceptions with the privacy policy of each reported app. Participants also reported their expectations about the types of information that should not be collected or shared by any Facebook app. Our analysis reveals significant mismatches between users' privacy perceptions and reality (i.e., the privacy policies of Facebook apps): we identified over-optimism not only in users' perceptions of information collection, but also in their self-efficacy in protecting their information on Facebook despite having experienced negative incidents in the past. To the best of our knowledge, this is the first study on the gap between users' privacy perceptions of Facebook apps and reality. The findings from this study offer directions for future research to address that gap by designing usable, effective, and personalized privacy notices that help users make informed decisions about using Facebook apps.
- This paper explores how individuals' privacy-related decision-making processes may be influenced by their pre-existing relationships to companies in a wider social and economic context. Through an online role-playing exercise, we explore attitudes to a range of services including home automation, Internet-of-Things and financial services. We find that individuals do not only consider the privacy-related attributes of applications, devices or services in the abstract. Rather, their decisions are heavily influenced by their pre-existing perceptions of, and relationships with, the companies behind such apps, devices and services. In particular, perceptions about a company's size, level of regulatory scrutiny, relationships with third parties, and pre-existing data exposure lead some users to choose an option which might otherwise appear worse from a privacy perspective. This finding suggests a need for tools that support users to incorporate these existing perceptions and relationships into their privacy-related decision making.
- As our society has become more information oriented, each individual is expressed, defined, and impacted by information and information technology. While valuable, current state-of-the-art approaches are mostly designed to protect enterprise/organizational privacy requirements and leave the main actor, i.e., the user, uninvolved or with only limited ability to control his or her information-sharing practices. To overcome these limitations, algorithms and tools that provide a user-centric privacy management system to individuals with different privacy concerns must take into consideration the dynamic nature of privacy policies, which constantly change based on the information-sharing context and environmental variables. This paper extends the concept of contextual integrity to provide mathematical models and algorithms that enable the creation and management of privacy norms for individual users. The extension includes the augmentation of environmental variables, e.g., time and date, as part of the privacy norms, while introducing an abstraction and a partial relation over information attributes. Further, a formal verification technique is proposed to ensure privacy norms are enforced for each information-sharing action. (A minimal illustrative sketch of such a norm check appears after this list.)
- People who are blind share their images and videos with companies that provide visual assistance technologies (VATs) to gain access to information about their surroundings. A challenge is that people who are blind cannot independently validate the content of the images and videos before they share them, and their visual data commonly contains private content. We examine privacy concerns for blind people who share personal visual data with VAT companies that provide descriptions authored by humans or artificial intelligence (AI). We first interviewed 18 people who are blind about their perceptions of privacy when using both types of VATs. Then we asked the participants to rate 21 types of image content according to their level of privacy concern if the information was shared knowingly versus unknowingly with human- or AI-powered VATs. Finally, we analyzed what information VAT companies communicate to users about their collection and processing of users' personal visual data through their privacy policies. Our findings have implications for the development of VATs that safeguard blind users' visual privacy, and our methods may be useful for other camera-based technology companies and their users.
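The contextual-integrity extension described in the related item above lends itself to a brief illustration. The following Python sketch is a hypothetical, simplified rendering of a privacy norm that carries an environmental variable (a time-of-day window) and a conservative check of whether an information flow is permitted. The `Norm` and `flow_permitted` names, the choice of fields, and the decision rule are our own illustrative assumptions, not the paper's actual formalism or notation.

```python
from dataclasses import dataclass
from datetime import time
from typing import Optional

# Hypothetical sketch of a contextual-integrity-style privacy norm that
# includes an environmental variable (a time-of-day window). Illustrative
# only; the cited paper's models and verification technique are richer.

@dataclass(frozen=True)
class Norm:
    sender: str                    # who sends the information
    recipient: str                 # who receives it
    attribute: str                 # information attribute (e.g., "heart_rate")
    context: str                   # social context (e.g., "exercise")
    allow: bool                    # whether the matching flow is permitted
    start: Optional[time] = None   # environmental variable: window start
    end: Optional[time] = None     # environmental variable: window end

    def applies(self, sender, recipient, attribute, context, now):
        """True if this norm governs the described flow at time `now`."""
        if (sender, recipient, attribute, context) != (
            self.sender, self.recipient, self.attribute, self.context
        ):
            return False
        if self.start is not None and self.end is not None:
            return self.start <= now <= self.end
        return True


def flow_permitted(norms, sender, recipient, attribute, context, now):
    """A flow is permitted only if at least one applicable norm allows it
    and no applicable norm forbids it (a deliberately conservative rule)."""
    applicable = [n for n in norms
                  if n.applies(sender, recipient, attribute, context, now)]
    if not applicable:
        return False
    return all(n.allow for n in applicable)


if __name__ == "__main__":
    norms = [
        Norm("user", "fitness_app", "heart_rate", "exercise", allow=True,
             start=time(6, 0), end=time(22, 0)),
        Norm("user", "advertiser", "heart_rate", "exercise", allow=False),
    ]
    print(flow_permitted(norms, "user", "fitness_app", "heart_rate",
                         "exercise", time(9, 30)))   # True: inside window
    print(flow_permitted(norms, "user", "advertiser", "heart_rate",
                         "exercise", time(9, 30)))   # False: forbidden flow
```

In this sketch, environmental variables enter the norm as optional fields, so the same tuple of sender, recipient, attribute, and context can be allowed or denied depending on the time of the sharing action; the paper's formal verification step would correspond to checking that every sharing action is covered by such a norm evaluation.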