Title: Stakeholders in the cloud computing value-chain: A socio-technical review of data breach literature
This paper is about stakeholders in the cloud computing value-chain. Early cloud computing literature focused on the technical aspects of the technology and viewed the provider and customer as the essential value-chain stakeholders. As more users adopt cloud services, the potential for data breaches increases. The review of the literature was carried out using a socio-technical approach. Socio-technical theory encapsulates the social, technical and environmental dimensions of a system. The outcomes of the search indicated that there are two pertinent stakeholder types: operational and non-operational. Operational stakeholders include cloud providers, customers, enablers, resellers and third-party providers. Non-operational stakeholders include regulators, legislators, courts, non-government organisations, law enforcement, industry-standard bodies and end-users. End-users are critically important in the cloud value-chain in that they rely on online services for everyday activities and risk having their data compromised. The cloud value-chain shows that cloud services encompass more than just technology services. The paper considers the complex stakeholder relationships and data breach issues, indicating the need for a better socio-technical response from the stakeholders within the value-chain.
Award ID(s):
1828010
PAR ID:
10277401
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
2020 IEEE International Symposium on Technology and Society (ISTAS)
Page Range / eLocation ID:
290 to 293
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Despite the broader acknowledgement of shared responsibilities in emergency management, one of the hidden and overlooked issues in disaster literature is the integration of multi-sector stakeholder values: the things that are of importance to the stakeholders (e.g., safety, profit, electability). Stakeholders (e.g., different levels of government, the private sector, the non-profit sector, and the communities) hold numerous values with varying degrees of importance, forming a system of value priorities. Stakeholder values and value priorities—referred to as value systems—are not static in a disaster context; they are dynamic, time-sensitive, and event-driven. A more in-depth understanding of the dynamics of stakeholder value systems is surely needed to allow policy-makers to introduce more pro-active and timely measures towards more resilient communities. To address this need, this paper focuses on identifying and understanding the stakeholder values in the context of Hurricane Michael. Semi-structured interviews (n=24 with 30 interviewees) were conducted to understand what public and private stakeholders value in different phases of Hurricane Michael. Based on the interview results, ten stakeholder values were identified: safety, resource efficiency, natural resource preservation, culture preservation, community growth, community adaptability, community cohesion, social welfare improvement, personal achievement, and business development. This study advances the knowledge in the area of disasters by empirically investigating public and private stakeholder values across different phases of the disaster. Such knowledge will help practitioners implement disaster resilience strategies in a way that accounts for diverse stakeholder needs and priorities, thus facilitating human-centered decision making towards building more resilient communities. 
  2. Cloud computing services have enjoyed explosive growth over the last decade. Users are typically businesses and government agencies who are able to scale their storage and processing requirements, and choose from pre-defined services (e.g. specific software-as-a-service applications). But with this outsourcing has also come the potential for data breaches targeted at the end-user, typically consumers (e.g. who purchase goods at an online retail store) and citizens (e.g. who transact information for their social security needs). This paper briefly introduces U.S.-based cloud computing regulation, including the U.S. Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the U.S. Stored Communications Act (SCA). We present how data breach notification (DBN) works in the U.S. by examining three mini-case examples: the 2011 Sony PlayStation Network data breach, the 2015 Anthem Healthcare data breach, and the 2017 Equifax data breach. The findings of the paper show that there is a systemic failure to learn from past data breaches, and that data breaches affect not only business and government clients of cloud computing services but also their respective end-user customer base. Finally, the level of sensitivity of data breaches is increasing, from cloud computing hacks on video game platforms to the targeting of more lucrative network and computer crime abuses aimed at invasive private health and financial data.
  3. Today’s problems require a plethora of analytics tasks to be conducted to tackle state-of-the-art computational challenges posed in society, impacting many areas including health care, automotive, banking, natural language processing, image detection, and many more data analytics-related tasks. Sharing existing analytics functions allows reuse and reduces overall effort. However, integrating deployment frameworks in the age of cloud computing is often out of reach for domain experts. Simple frameworks are needed that allow even non-experts to deploy and host services in the cloud. To avoid vendor lock-in, we require a generalized composable analytics service framework that allows users to integrate their services and those offered in clouds, not only by one, but by many cloud compute and service providers. We report on work that we conducted to provide a service integration framework for composing generalized analytics frameworks on multi-cloud providers, which we call our Generalized AI Service (GAS) Generator. We demonstrate the framework’s usability by showcasing useful analytics workflows on various cloud providers, including AWS, Azure, and Google, and edge computing IoT devices. The examples are based on scikit-learn so they can be used in educational settings, replicated, and expanded upon. Benchmarks are used to compare the different services and showcase general replicability.
  4. Ever since the commercial offerings of the Cloud started appearing in 2006, the landscape of cloud computing has been undergoing remarkable changes with the emergence of many different types of service offerings, developer productivity enhancement tools, and new application classes as well as the manifestation of cloud functionality closer to the user at the edge. The notion of utility computing, however, has remained constant throughout its evolution, which means that cloud users always seek to save costs of leasing cloud resources while maximizing their use. On the other hand, cloud providers try to maximize their profits while assuring service-level objectives of the cloud-hosted applications and keeping operational costs low. All these outcomes require systematic and sound cloud engineering principles. The aim of this paper is to highlight the importance of cloud engineering, survey the landscape of best practices in cloud engineering and its evolution, discuss many of the existing cloud engineering advances, and identify both the inherent technical challenges and research opportunities for the future of cloud computing in general and cloud engineering in particular. 
  5. A majority of today's cloud services are independently operated by individual cloud service providers. In this approach, the locations of cloud resources are strictly constrained by the distribution of cloud service providers' sites. As the popularity and scale of cloud services increase, we believe this traditional paradigm is about to change toward more federated services, a.k.a. multi-cloud, due to the improved performance, reduced cost of compute, storage and network resources, as well as increased user demands. In this paper, we present COMET, a lightweight, distributed storage system for managing metadata on large-scale federated cloud infrastructure providers, end users, and their applications (e.g. HTCondor Cluster or Hadoop Cluster). We showcase use cases from NSF's Chameleon, ExoGENI, and JetStream research cloud testbeds to show the effectiveness of COMET's design and deployment.