Search for: All records

Creators/Authors contains: "Watkins, Elizabeth"


  1. In widely used sociological descriptions of how accountability is structured through institutions, an “actor” (e.g., the developer) is accountable to a “forum” (e.g., regulatory agencies) empowered to pass judgements on and demand changes from the actor or enforce sanctions. However, questions about structuring accountability persist: why and how is a forum compelled to keep making demands of the actor when such demands are called for? To whom is a forum accountable in the performance of its responsibilities, and how can its practices and decisions be contested? In the context of algorithmic accountability, we contend that a robust accountability regime requires a triadic relationship, wherein the forum is also accountable to another entity: the public(s). Typically, as is the case with environmental impact assessments, public(s) make demands upon the forum's judgements and procedures through the courts, thereby establishing a minimum standard of due diligence. However, core challenges relating to: (1) lack of documentation, (2) difficulties in claiming standing, and (3) struggles around admissibility of expert evidence on and achieving consensus over the workings of algorithmic systems in adversarial proceedings prevent the public from approaching the courts when faced with algorithmic harms. In this paper, we demonstrate that the courts are the primary route—and the primary roadblock—in the pursuit of redress for algorithmic harms. Courts often find algorithmic harms non-cognizable and rarely require developers to address material claims of harm. To address the core challenges of taking algorithms to court, we develop a relational approach to algorithmic accountability that emphasizes not what the actors do nor the results of their actions, but rather how interlocking relationships of accountability are constituted in a triadic relationship between actors, forums, and public(s). As is the case in other regulatory domains, we believe that impact assessments (and similar accountability documentation) can provide the grounds for contestation between these parties, but only when that triad is structured such that the public(s) are able to cohere around shared experiences and interests, contest the outcomes of algorithmic systems that affect their lives, and make demands upon the other parties. Where courts now find algorithmic harms non-cognizable, an impact assessment regime can potentially create procedural rights to protect substantive rights of the public(s). This would require algorithmic accountability policies currently under consideration to provide the public(s) with adequate standing in courts, and opportunities to access and contest the actor's documentation and the forum's judgments. 
    Free, publicly-accessible full text available June 12, 2024
  2. We investigate the privacy practices of labor organizers in the computing technology industry and explore the changes in these practices as a response to remote work. Our study is situated at the intersection of two pivotal shifts in workplace dynamics: (a) the increase in online workplace communications due to remote work, and (b) the resurgence of the labor movement and an increase in collective action in workplaces, especially in the tech industry, where this phenomenon has been dubbed the tech worker movement. The shift of work-related communications to online digital platforms in response to an increase in remote work is creating new opportunities for and risks to the privacy of workers. These risks are especially significant for organizers of collective action, with several well-publicized instances of retaliation against labor organizers by companies. Through a series of qualitative interviews with 29 tech workers involved in collective action, we investigate how labor organizers assess and mitigate risks to privacy while engaging in these actions. Among the most common risks that organizers experienced are retaliation from their employer, lateral worker conflict, emotional burnout, and the possibility of information about the collective effort leaking to management. Depending on the nature and source of the risk, organizers use a blend of digital security practices and community-based mechanisms. We find that digital security practices are more relevant when the threat comes from management, while community management and moderation are central to protecting organizers from lateral worker conflict. Since labor organizing is a collective rather than individual project, individual privacy and collective privacy are intertwined, sometimes in conflict and often mutually constitutive. Notions of privacy that solely center individuals are often incompatible with the needs of organizers, who noted that safety in numbers could only be achieved when workers presented a united front to management. Based on our interviews, we identify key topics for future research, such as the growing prevalence of surveillance software and the needs of international and gig worker organizers. We conclude with design recommendations that can help create safer, more secure, and more private tools to better address the risks that organizers face.
  3. ABSTRACT

    Galactic bars can drive cold gas inflows towards the centres of galaxies. The gas transport happens primarily through the so-called bar dust lanes, which connect the galactic disc at kpc scales to the nuclear rings at hundreds of pc scales much like two gigantic galactic rivers. Once in the ring, the gas can fuel star formation activity, galactic outflows, and central supermassive black holes. Measuring the mass inflow rates is therefore important to understanding the mass/energy budget and evolution of galactic nuclei. In this work, we use CO datacubes from the PHANGS-ALMA survey and a simple geometrical method to measure the bar-driven mass inflow rate on to the nuclear ring of the barred galaxy NGC 1097. The method assumes that the gas velocity in the bar lanes is parallel to the lanes in the frame co-rotating with the bar, and allows one to derive the inflow rates from sufficiently sensitive and resolved position–position–velocity diagrams if the bar pattern speed and galaxy orientations are known. We find an inflow rate of $\dot{M}=(3.0 \pm 2.1)\, \rm M_\odot \, yr^{-1}$ averaged over a time span of 40 Myr, which varies by a factor of a few over time-scales of ∼10 Myr. Most of the inflow appears to be consumed by star formation in the ring, which is currently occurring at a star formation rate (SFR) of $\simeq\!1.8\!-\!2 \, \rm M_\odot \, yr^{-1}$, suggesting that the inflow is causally controlling the SFR in the ring as a function of time.

     
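    The geometric inflow measurement summarized above reduces to order-of-magnitude arithmetic: a gas mass in the bar lane, divided by the time it takes that gas to traverse the lane at its lane-parallel speed. The sketch below illustrates this conversion; the function name and the input numbers are illustrative assumptions, not the values or pipeline used in the paper.

        # Sketch: bar-lane mass inflow rate from assumed lane properties.
        # Mdot ~ M_lane * v_parallel / L_lane (the lane empties once per
        # crossing time), with unit conversions to Msun/yr.
        KPC_TO_KM = 3.086e16   # kilometres per kiloparsec
        YR_TO_S = 3.156e7      # seconds per year

        def inflow_rate(mass_in_lane_msun, v_parallel_kms, lane_length_kpc):
            """Return the implied inflow rate in Msun/yr."""
            crossing_time_s = lane_length_kpc * KPC_TO_KM / v_parallel_kms
            return mass_in_lane_msun / (crossing_time_s / YR_TO_S)

        # Hypothetical, roughly plausible numbers for a strong bar lane:
        # ~3e7 Msun of gas moving at ~200 km/s along a ~3 kpc lane gives
        # ~2 Msun/yr, the same order as the measured 3.0 +/- 2.1 Msun/yr.
        print(inflow_rate(3e7, 200.0, 3.0))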
  4.
    Algorithmic impact assessments (AIAs) are increasingly being proposed as a mechanism for algorithmic accountability. These assessments are seen as potentially useful for anticipating, avoiding, and mitigating the negative consequences of algorithmic decision-making systems (ADS). At the same time, what an AIA would entail remains under-specified. While promising, AIAs raise as many questions as they answer. Choices about the methods, scope, and purpose of impact assessments structure the possible governance outcomes. Decisions about what type of effects count as an impact, when impacts are assessed, whose interests are considered, who is invited to participate, who conducts the assessment, the public availability of the assessment, and what the outputs of the assessment might be all shape the forms of accountability that AIA proponents seek to encourage. These considerations remain open, and will determine whether and how AIAs can function as a viable governance mechanism in the broader algorithmic accountability toolkit, especially with regard to furthering the public interest. Because AIAs are still an incipient governance strategy, approaching them as social constructions that do not require a single or universal approach offers a chance to produce interventions that emerge from careful deliberation.
  5.
    Algorithmic impact assessments (AIAs) are an emergent form of accountability for entities that build and deploy automated decision-support systems. These are modeled after impact assessments in other domains. Our study of the history of impact assessments shows that "impacts" are an evaluative construct that enable institutions to identify and ameliorate harms experienced because of a policy decision or system. Every domain has different expectations and norms about what constitutes impacts and harms, how potential harms are rendered as the impacts of a particular undertaking, who is responsible for conducting that assessment, and who has the authority to act on the impact assessment to demand changes to that undertaking. By examining proposals for AIAs in relation to other domains, we find that there is a distinct risk of constructing algorithmic impacts as organizationally understandable metrics that are nonetheless inappropriately distant from the harms experienced by people, and which fall short of building the relationships required for effective accountability. To address this challenge of algorithmic accountability, and as impact assessments become a commonplace process for evaluating harms, the FAccT community should A) understand impacts as objects constructed for evaluative purposes, B) attempt to construct impacts as close as possible to actual harms, and C) recognize that accountability governance requires the input of various types of expertise and affected communities. We conclude with lessons for assembling cross-expertise consensus for the co-construction of impacts and to build robust accountability relationships. 
  6. Abstract

    We measure the molecular gas environment near recent (<100 yr old) supernovae (SNe) using ∼1″ or ≤150 pc resolution CO (2–1) maps from the PHANGS–Atacama Large Millimeter/submillimeter Array (ALMA) survey of nearby star-forming galaxies. This is arguably the first such study to approach the scales of individual massive molecular clouds ($M_{\rm mol} \gtrsim 10^{5.3}\,\rm M_\odot$). Using the Open Supernova Catalog, we identify 63 SNe within the PHANGS–ALMA footprint. We detect CO (2–1) emission near ∼60% of the sample at 150 pc resolution, compared to ∼35% of map pixels with CO (2–1) emission, and up to ∼95% of the SNe at 1 kpc resolution, compared to ∼80% of map pixels with CO (2–1) emission. We expect that the ∼60% of SNe within the same 150 pc beam as a giant molecular cloud will likely interact with these clouds in the future, consistent with the observation of widespread SN–molecular gas interaction in the Milky Way, while the other ∼40% of SNe without strong CO (2–1) detections will deposit their energy in the diffuse interstellar medium, perhaps helping drive large-scale turbulence or galactic outflows. Broken down by type, we detect CO (2–1) emission at the sites of ∼85% of our 9 stripped-envelope SNe (SESNe), ∼40% of our 34 Type II SNe, and ∼35% of our 13 Type Ia SNe, indicating that SESNe are most closely associated with the brightest CO (2–1) emitting regions in our sample. Our results confirm that SN explosions are not restricted to only the densest gas, and instead exert feedback across a wide range of molecular gas densities.

     
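    The detection-fraction comparison above (CO emission at SN sites versus CO emission over all map pixels) amounts to simple masking statistics. Below is a minimal sketch under assumed inputs: a 2-D CO (2–1) intensity map, integer pixel coordinates of the SNe, and a single intensity threshold; the actual analysis works with matched beams and noise-based masking rather than a fixed cut.

        import numpy as np

        def detection_fractions(co_map, sn_xy, threshold):
            """co_map: 2-D intensity map (NaN outside coverage);
            sn_xy: (N, 2) integer pixel coordinates as (x, y) pairs."""
            detected = np.isfinite(co_map) & (co_map > threshold)
            pixel_fraction = detected.sum() / np.isfinite(co_map).sum()
            x, y = sn_xy[:, 0], sn_xy[:, 1]
            sn_fraction = detected[y, x].mean()   # fraction of SNe on detected pixels
            return sn_fraction, pixel_fraction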
  7. Abstract

    We use PHANGS–James Webb Space Telescope (JWST) data to identify and classify 1271 compact 21 μm sources in four nearby galaxies using MIRI F2100W data. We identify sources using a dendrogram-based algorithm, and we measure the background-subtracted flux densities for JWST bands from 2 to 21 μm. Using the spectral energy distribution (SED) in JWST and HST bands plus ALMA and MUSE/VLT observations, we classify the sources by eye. Then we use this classification to define regions in color–color space and so establish a quantitative framework for classifying sources. We identify 1085 sources as belonging to the ISM of the target galaxies, with the remainder being dusty stars or background galaxies. These 21 μm sources are strongly spatially associated with H II regions (>92% of sources), while 74% of the sources are coincident with a stellar association defined in the HST data. Using SED fitting, we find that the stellar masses of the 21 μm sources span a range of $10^2$–$10^4\,\rm M_\odot$ with mass-weighted ages down to 2 Myr. There is a tight correlation between attenuation-corrected Hα and 21 μm luminosity for $L_{\nu,\rm F2100W} > 10^{19}\,\rm W\,Hz^{-1}$. Young embedded source candidates selected at 21 μm are found below this threshold and have masses $< 10^3\,\rm M_\odot$.

     
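    The dendrogram-based source identification mentioned above can be sketched with the astrodendro package; the file name and threshold values below are placeholders for illustration, not the parameters used for the published catalogue.

        import numpy as np
        from astropy.io import fits
        from astrodendro import Dendrogram

        image = fits.getdata("f2100w_example.fits")   # hypothetical 21 micron image
        noise = np.nanstd(image)                      # crude noise estimate, for illustration only

        # Leaves of the dendrogram are the local peaks, i.e. compact source candidates.
        dend = Dendrogram.compute(image,
                                  min_value=3 * noise,  # faintest structure to consider
                                  min_delta=noise,      # contrast required for a separate peak
                                  min_npix=9)           # minimum footprint in pixels
        print(f"{len(dend.leaves)} compact 21 micron source candidates")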
  8. Abstract

    We compare mid-infrared (mid-IR), extinction-corrected Hα, and CO (2–1) emission at 70–160 pc resolution in the first four PHANGS–JWST targets. We report correlation strengths, intensity ratios, and power-law fits relating emission in JWST's F770W, F1000W, F1130W, and F2100W bands to CO and Hα. At these scales, CO and Hα each correlate strongly with mid-IR emission, and these correlations are each stronger than the one relating CO to Hα emission. This reflects that mid-IR emission simultaneously acts as a dust column density tracer, leading to a good match with the molecular-gas-tracing CO, and as a heating tracer, leading to a good match with Hα. By combining mid-IR, CO, and Hα at scales where the overall correlation between cold gas and star formation begins to break down, we are able to separate these two effects. We model the mid-IR above $I_\nu = 0.5$ MJy sr$^{-1}$ at F770W, a cut designed to select regions where the molecular gas dominates the interstellar medium (ISM) mass. This bright emission can be described to first order by a model that combines a CO-tracing component and an Hα-tracing component. The best-fitting models imply that ∼50% of the mid-IR flux arises from molecular gas heated by the diffuse interstellar radiation field, with the remaining ∼50% associated with bright, dusty star-forming regions. We discuss differences between the F770W, F1000W, and F1130W bands and the continuum-dominated F2100W band and suggest next steps for using the mid-IR as an ISM tracer.
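    The first-order decomposition described above, bright F770W emission modelled as a CO-tracing plus an Hα-tracing component, can be posed as a linear least-squares fit. The sketch below assumes matched-resolution intensity arrays and is meant only to show the form of the model, not the paper's actual fitting procedure.

        import numpy as np

        def two_component_fit(i_f770w, i_co, i_halpha, cut=0.5):
            """Fit I_F770W ~ a*I_CO + b*I_Halpha for pixels brighter than `cut` (MJy/sr)."""
            good = (np.isfinite(i_f770w) & np.isfinite(i_co) & np.isfinite(i_halpha)
                    & (i_f770w > cut))
            design = np.column_stack([i_co[good], i_halpha[good]])
            (a, b), *_ = np.linalg.lstsq(design, i_f770w[good], rcond=None)
            return a, b   # scalings of the CO-tracing and Halpha-tracing components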
  9. Abstract

    We present maps tracing the fraction of dust in the form of polycyclic aromatic hydrocarbons (PAHs) in IC 5332, NGC 628, NGC 1365, and NGC 7496 from JWST/MIRI observations. We trace the PAH fraction by combining the F770W (7.7 μm) and F1130W (11.3 μm) filters to track ionized and neutral PAH emission, respectively, and comparing the PAH emission to F2100W, which traces small, hot dust grains. We find average $R_{\rm PAH}$ = (F770W + F1130W)/F2100W values of 3.3, 4.7, 5.1, and 3.6 in IC 5332, NGC 628, NGC 1365, and NGC 7496, respectively. We find that H II regions traced by MUSE Hα show a systematically low PAH fraction. The PAH fraction remains relatively constant across other galactic environments, with slight variations. We use CO + H I + Hα to trace the interstellar gas phase and find that the PAH fraction decreases above a value of $I_{\rm H\alpha}/\Sigma_{\rm HI+H_2} \sim 10^{37.5}\,\rm erg\,s^{-1}\,kpc^{-2}\,(M_\odot\,pc^{-2})^{-1}$ in all four galaxies. Radial profiles also show a decreasing PAH fraction with increasing radius, correlated with lower metallicity, in line with previous results showing a strong metallicity dependence of the PAH fraction. Our results suggest that the process of PAH destruction in ionized gas operates similarly across the four targets.

     
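    The band ratio defined above is straightforward to compute from matched-resolution intensity maps; a minimal sketch (array names assumed) follows.

        import numpy as np

        def r_pah(f770w, f1130w, f2100w, min_f2100w=0.0):
            """R_PAH = (F770W + F1130W) / F2100W, with faint 21 micron pixels masked."""
            with np.errstate(divide="ignore", invalid="ignore"):
                ratio = (f770w + f1130w) / f2100w
            return np.where(f2100w > min_f2100w, ratio, np.nan)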
  10. ABSTRACT

    Connecting the gas in H ii regions to the underlying source of the ionizing radiation can help us constrain the physical processes of stellar feedback and how H ii regions evolve over time. With PHANGS–MUSE, we detect nearly 24 000 H ii regions across 19 galaxies and measure the physical properties of the ionized gas (e.g. metallicity, ionization parameter, and density). We use catalogues of multiscale stellar associations from PHANGS–HST to obtain constraints on the age of the ionizing sources. We construct a matched catalogue of 4177 H ii regions that are clearly linked to a single ionizing association. A weak anticorrelation is observed between the association ages and the $\mathrm{H}\, \alpha$ equivalent width $\mathrm{EW}(\mathrm{H}\, \alpha)$, the $\mathrm{H}\, \alpha/\mathrm{FUV}$ flux ratio, and the ionization parameter, log q. As all three are expected to decrease as the stellar population ages, this could indicate that we observe an evolutionary sequence. This interpretation is further supported by correlations between all three properties. Interpreting these as evolutionary tracers, we find younger nebulae to be more attenuated by dust and closer to giant molecular clouds, in line with recent models of feedback-regulated star formation. We also observe strong correlations between the local metallicity variations and all three proposed age tracers, suggestive of star formation preferentially occurring in locations of locally enhanced metallicity. Overall, $\mathrm{EW}(\mathrm{H}\, \alpha)$ and log q show the most consistent trends and appear to be the most reliable tracers of the age of an H ii region.

     
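    The weak anticorrelation reported above between association ages and $\mathrm{EW}(\mathrm{H}\, \alpha)$ is the kind of monotonic trend a rank statistic quantifies; below is a minimal sketch under assumed inputs (matched arrays of ages and equivalent widths), not the paper's statistical methodology.

        import numpy as np
        from scipy.stats import spearmanr

        def age_tracer_correlation(age_myr, ew_halpha):
            """Spearman rank correlation between association age and EW(Halpha)."""
            good = np.isfinite(age_myr) & np.isfinite(ew_halpha)
            rho, pvalue = spearmanr(age_myr[good], ew_halpha[good])
            return rho, pvalue   # rho < 0 expected if EW(Halpha) declines with age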