

Title: Toward a Theory of Harms in the Internet Ecosystem
One foundational justification for regulatory intervention is that harms are occurring of a character that creates a public interest in mitigating them. This paper is concerned with such harms as they arise in the Internet ecosystem. Judging by news headlines over the last few years, the range of such harms may seem unbounded. Hoping to add some order to the chaos, we undertake an effort to classify harms in the Internet ecosystem, in pursuit of a more or less complete taxonomy of harms. Our goal in structuring this taxonomy is to help mitigate harms in a more systematic way, as opposed to fighting an endless defensive battle against whatever happens next. The background we bring to this paper is, on the one hand, architectural (how the Internet ecosystem is actually structured) and, on the other hand, empirical (how we should measure the Internet to best understand what is happening). If everything about the Internet today were wonderful, the need to measure and understand would not be so compelling. The justification for measurement follows from its ability to shed light on problems and challenges. Sustained measurement or compelled reporting of data, and the analysis of the collected data, generally come at considerable effort and cost, and so must be justified by an argument that they will shed light on something important. This reasoning naturally motivates our taxonomy of things that are wrong, what we call harms: that is where we, the research community generally, and governments should focus attention. We do not intend this paper as a catalog of pessimism, but as a contribution to an action agenda for the research community and for governments. The paper proceeds "up the layers", from technology to society. For harms closer to the technology, we can be more specific about the harms themselves, about possible measurements and remedies, and about the actors that could undertake them. One motivation for this paper is our belief that the Internet ecosystem is at an inflection point. The Internet has revolutionized our ability to store, move, and process information, including information about people, and we are only beginning to understand its impact on society and how to manage and mitigate the harms resulting from unregulated commercial use of these capabilities. Current events suggest that now is a point of transition from laissez-faire to regulation. However, the path to good regulation is not obvious, and now is the time for the research community to think hard about what advice to give the governments of the world, and what sort of data can back up that advice. Our highest-level goal for this paper is to contribute to a conversation along those lines.
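To make the shape of such a taxonomy concrete, here is a minimal sketch of how harms might be organized "up the layers", each annotated with possible measurements and the actors who could act on them. The layer names, example harms, measurements, and actors below are placeholders we introduce for illustration only, not the paper's actual taxonomy.

# A hypothetical sketch of a layered harms taxonomy. The layers, harms,
# measurements, and actors are illustrative placeholders, not the paper's.
from dataclasses import dataclass, field
from enum import Enum

class Layer(Enum):
    # Ordered "up the layers", from technology to society.
    INFRASTRUCTURE = 1   # e.g., routing, naming, transport
    APPLICATION = 2      # e.g., platforms and services
    SOCIETY = 3          # e.g., discourse and institutions

@dataclass
class Harm:
    name: str
    layer: Layer
    measurements: list = field(default_factory=list)  # how to observe it
    actors: list = field(default_factory=list)        # who could mitigate it

harms = [
    Harm("route hijacking", Layer.INFRASTRUCTURE,
         ["BGP monitoring"], ["network operators"]),
    Harm("disinformation", Layer.SOCIETY,
         ["platform audits"], ["platforms", "regulators"]),
]

# Harms closer to the technology admit more specific measurements and remedies.
for h in sorted(harms, key=lambda h: h.layer.value):
    print(h.layer.name, "-", h.name, "| measure via:", h.measurements)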
Award ID(s):
1724853
NSF-PAR ID:
10186676
Author(s) / Creator(s):
;
Date Published:
Journal Name:
Social Science Research Network
ISSN:
1556-5068
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Who ensures, and by what means, that engineering education evolves to meet the ever-changing needs of our society? This and other papers presented by our research team at this conference offer our initial set of findings from an NSF-sponsored collaborative study on engineering education reform. Organized around the notion of higher education governance and the practice of educational reform, our open-ended study is based on semi-structured interviews at over three dozen universities and engineering professional societies and organizations, along with a handful of scholars engaged in engineering education research. Organized as a multi-site, multi-scale study, our goal is to document the differences in perspective and interest that exist across organizational levels and institutions, and to describe the coordination that occurs (or fails to occur) in engineering education given the distributed structure of the engineering profession. This paper offers all engineering educators and administrators a qualitative and retrospective analysis of ABET EC 2000 and its implementation. The paper opens with a historical background on the Engineers' Council for Professional Development (ECPD) and engineering accreditation; the rise of quantitative standards during the 1950s as a result of the push to implement an engineering science curriculum appropriate to the Cold War era; EC 2000 and its call for greater emphasis on professional skill sets amidst concerns about US manufacturing productivity and national competitiveness; the development of outcomes assessment and its implementation; and the successive negotiations about assessment practice and the training of both program evaluators and assessment coordinators for the degree programs undergoing evaluation. It was these negotiations and the evolving practice of assessment that resulted in the latest set of changes in ABET engineering accreditation criteria (“1-7” versus “a-k”). The origins of EC 2000 trace to the “Gang of Six,” a group of individuals loyal to ABET who used the pressure exerted by external organizations, along with a shared rhetoric of national competitiveness, to forge a common vision organized around an expanded emphasis on professional skill sets. Significantly, the Gang of Six knew that the regional accreditation agencies were already contemplating a shift toward outcomes assessment; several of its members also had backgrounds in industrial engineering. However, the resulting assessment protocol for EC 2000 remained ambiguous about whether the stated learning outcomes (Criterion 3) were something faculty had to demonstrate for all of their students, or whether EC 2000's main emphasis was continuous improvement. When it proved difficult to demonstrate learning outcomes on the part of all students, ABET itself began to place greater emphasis on total quality management and continuous process improvement (TQM/CPI). This gave institutions an opening to use increasingly limited and proximate measures of the “a-k” student outcomes as evidence of effort and improvement. In what social scientists would describe as “tactical” resistance to perceived oppressive structures, this enabled ABET coordinators and the faculty in charge of degree programs, many of whom had their own internal improvement processes, to refer to the a-k criteria as “difficult to achieve” and “ambiguous,” which they sometimes were.
Inconsistencies in evaluation outcomes enabled those most discontented with the a-k student outcomes to use ABET's own organizational processes to drive the latest revisions to EAC accreditation criteria, although the organization's own process for member and stakeholder input ultimately restored much of the professional skill set emphasis found in the original EC 2000 criteria. Other refinements were also made to the standard, including a new emphasis on diversity. This said, many within our interview population believe that EC 2000 had already achieved much of the change it set out to achieve, especially with regard to broader professional skills such as communication, teamwork, and design. Regular faculty review of curricula is now also a more routine part of the engineering education landscape. While programs vary in their engagement with ABET, many are skeptical about whether the new criteria will produce further improvements to their programs, arguing that their own internal processes are now the primary drivers of change.
  2. CYBERSECURITY AND LOCAL GOVERNMENT Learn to secure your local government's networks with this one-of-a-kind resource. In Cybersecurity and Local Government, a distinguished team of researchers delivers an insightful exploration of cybersecurity at the level of local government. The book makes a compelling argument that every local government official, elected or otherwise, must be reasonably knowledgeable about cybersecurity concepts and provide appropriate support for them within their governments. It also lays out a straightforward roadmap to achieving those objectives, from an overview of cybersecurity definitions to descriptions of the most common security challenges faced by local governments. The accomplished authors specifically address the recent surge in ransomware attacks and how they might affect local governments, along with advice on how to avoid and respond to these threats. They also discuss cybersecurity law, cybersecurity policies that local governments should adopt, the future of cybersecurity, the challenges posed by the Internet of Things, and much more. Throughout, the authors provide relevant field examples, case studies of actual local governments, and examples of policies to guide readers in their own application of the concepts discussed within. Cybersecurity and Local Government also offers: A thorough introduction to cybersecurity generally, including definitions of key cybersecurity terms and a high-level overview of the subject for non-technologists. A comprehensive exploration of critical information for local elected and top appointed officials, including the typical frequencies and types of cyberattacks. Practical discussions of the current state of local government cybersecurity, with a review of relevant literature from 2000 to 2021. In-depth examinations of operational cybersecurity policies, procedures, and practices, with recommended best practices. Perfect for local elected and top appointed officials and staff as well as local citizens, Cybersecurity and Local Government will also earn a place in the libraries of those studying or working in local government with an interest in cybersecurity.
  3. To mitigate IPv4 exhaustion, IPv6 provides expanded address space, and NAT allows a single public IPv4 address to suffice for many devices assigned private IPv4 address space. Even though NAT has greatly extended the shelf life of IPv4, some networks need more private IPv4 space than is officially allocated by IANA, due to their size and/or network management practices. Some of these networks resort to using squat space, a term the network operations community uses for large public IPv4 address blocks allocated to organizations but historically never announced to the Internet. While squatting on IP addresses is an open secret, it introduces ethical, legal, and technical problems. In this work we examine billions of traceroutes to identify thousands of squatting organizations. We examine how they use squat space and what happened when the US Department of Defense suddenly started announcing what had traditionally been squat space. In addition to shining light on a dirty secret of operational practices, our paper shows that squatting distorts common Internet measurement methodologies, which we argue must be re-examined to account for squat space.
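To make the measurement idea concrete, here is a minimal sketch (not the paper's methodology) of how a traceroute hop address could be classified as private, squat, or ordinary public space. The SQUAT_CANDIDATES list and the classify_hop helper are hypothetical; a real study would derive candidate prefixes from IANA allocation records and BGP announcement data rather than hard-coding them.

# A hypothetical sketch of hop-address classification. The squat list below is
# illustrative only: a few large blocks long allocated to the US DoD but
# historically absent from the global routing table, not an authoritative list.
import ipaddress

SQUAT_CANDIDATES = [
    ipaddress.ip_network("11.0.0.0/8"),
    ipaddress.ip_network("22.0.0.0/8"),
    ipaddress.ip_network("26.0.0.0/8"),
]

def classify_hop(addr: str) -> str:
    """Label a traceroute hop address as 'private', 'squat', or 'public'."""
    ip = ipaddress.ip_address(addr)
    if ip.is_private:  # RFC 1918 and other reserved ranges
        return "private"
    if any(ip in net for net in SQUAT_CANDIDATES):
        return "squat"
    return "public"

for hop in ["192.168.1.1", "11.5.4.3", "8.8.8.8"]:
    print(hop, "->", classify_hop(hop))

A study like the one described would run such a check over billions of hop addresses, then track how the classification shifts once previously unannounced blocks, such as the DoD space, begin appearing in BGP.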
  4. Purpose
The authors draw on a co-auto-ethnographic study of Hurricane Harvey in which both authors were citizen responders and disaster researchers. In practice, large-scale disaster helps temporarily foster an ideal of community, which is then appropriated by emergency management institutions. The authors present this work in the service of more empirically informed mitigation of the societal ills that produce major causes of disaster.
Design/methodology/approach
This paper is a co-auto-ethnographic study of Hurricane Harvey in which both authors were citizen responders and disaster researchers.
Findings
The authors provide a critical, theoretical argument that citizen-based response fosters an ephemeral utopia not usually experienced in everyday life. Disasters present the possibility of an ideal of community. These phenomena allow people to live their better selves through citizen response, in direct contrast to the modern experience, in which the sense of community is mostly fabricated, if not almost eradicated. Modern institutions serve as sources of domination built on technology, continuity of infrastructure, and self-sufficiency; when disasters handicap society, unpredictability breaks the illusions of modernity, and a need arises to re-engage with those around us in meaningful and exciting ways.
Research limitations/implications
This work produces theory rather than testing it. It is subject to all the limitations of interpretive work that focuses on meaning and critique rather than on associations or causality.
Practical implications
The authors suggest that large-scale disasters will continue to overwhelm management institutions no matter how much preparedness and planning occurs. Based on this research, they offer an alternative to the institutional status quo: let the citizenry do what they already do, while institutions focus on mitigating the social ills that lead to disaster. This is particularly urgent given the increasing risk of events exacerbated by anthropogenic causes.
Social implications
The advancement of disaster research must look to more radical perspectives on human response in disaster and what this means for the formation of communities and society itself. It is the collective task of those invested in the management of crises to defer to the potential of publics rather than disdain and appropriate them. The authors also suggest that meaningful mitigation of social ills that recognizes and emphasizes difference will be the only way to manage future large-scale events.
Originality/value
The authors' work departs from the more practical utility of disaster work into a critical and highly theoretical realm using novel research methods.
  5. The arrival of undersea cables along the coasts of Africa over the last decade, combined with increased investment in national fiber backbones, has expedited the development of national research and education networks (NRENs) across the African continent. According to a World Bank report, more than 15 NRENs now operate in Africa, with a dozen more in an advanced planning stage. In addition, governments and NGOs have recently invested in using this new infrastructure to help connect researchers around the world with their colleagues in Africa. As part of its International Research Network Connections (IRNC) program, the US National Science Foundation is funding transatlantic bandwidth, targeted training, and proactive application engagement in support of science collaborations in Africa. Similarly, the European Commission, via GEANT's AfricaConnect2 project, is supporting the development of high-capacity internet networks and services for research and education across Africa. This increased support is helping improve connectivity for existing science collaborations while also enabling new collaborations to take advantage of the growing research and education infrastructure. This session will highlight how the global R&E networking community is working together to strengthen and support NRENs and research in Africa. Speakers will include IRNC PIs, representatives from GEANT, and African REN partners (UbuntuNet Alliance, WACREN, ASREN). Speakers will provide infrastructure updates, share lessons learned from human capacity building workshops, report on researcher engagement, and answer questions about current and future efforts. We will also highlight some of the challenges that African NRENs and researchers working in Africa face, and lead a discussion on how we can work together to begin addressing some of these challenges.