Title: An Analysis of the Partnership between Retailers and Low-credibility News Publishers
In this paper, we provide a large-scale analysis of the display ad ecosystem that supports low-credibility and traditional news sites, with a particular focus on the relationship between retailers and news producers. We study this relationship from both the retailer and news producer perspectives. First, focusing on the retailers, our work reveals high-profile retailers that are frequently advertised on low-credibility news sites, including those that are more likely to be advertised on low-credibility news sites than traditional news sites. Additionally, despite high-profile retailers having more resources and incentive to dissociate with low-credibility news publishers, we surprisingly do not observe a strong relationship between retailer popularity and advertising intensity on low-credibility news sites. We also do not observe a significant difference across different market sectors. Second, turning to the publishers, we characterize how different retailers are contributing to the ad revenue stream of low-credibility news sites. We observe that retailers who are among the top-10K websites on the Internet account for a quarter of all ad traffic on low-credibility news sites. Nevertheless, we show that low-credibility news sites are already becoming less reliant on popular retailers over time, highlighting the dynamic nature of the low-credibility news ad ecosystem.
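The abstract's headline statistic, that top-10K retailers account for a quarter of ad traffic on low-credibility sites, boils down to a share computation over ad impressions. A minimal sketch of that computation follows; the function name, data layout, and all domains and ranks are illustrative assumptions, not the paper's actual pipeline or data:

```python
# Hypothetical sketch: what share of ad impressions on low-credibility sites
# comes from retailers ranked inside the top-10K websites? All data below is
# illustrative, not from the paper.

def top_retailer_share(ad_impressions, popularity_rank, cutoff=10_000):
    """Fraction of ad impressions whose retailer ranks within `cutoff`.

    Retailers absent from `popularity_rank` are treated as unranked
    (effectively infinite rank), so they never count as popular.
    """
    total = len(ad_impressions)
    if total == 0:
        return 0.0
    popular = sum(
        1 for imp in ad_impressions
        if popularity_rank.get(imp["retailer"], float("inf")) <= cutoff
    )
    return popular / total

# Toy example: four impressions, one retailer inside the top-10K.
ranks = {"megastore.example": 120, "nichehop.example": 480_000}
imps = [{"retailer": "megastore.example"}] + [{"retailer": "nichehop.example"}] * 3
print(top_retailer_share(imps, ranks))  # 0.25
```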
Award ID(s):
1934925 1934494
NSF-PAR ID:
10279599
Author(s) / Creator(s):
Date Published:
Journal Name:
Journal of Quantitative Description: Digital Media
Volume:
1
ISSN:
2673-8813
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. In studies of misinformation, the distinction between high- and low-credibility publishers is fundamental. However, there is much that we do not know about the relationship between the subject matter and timing of content produced by the two types of publishers. By analyzing the content of several million unique articles published over 28 months, we show that high- and low-credibility publishers operate in distinct news ecosystems. Bursts of news coverage generated by the two types of publishers tend to cover different subject matter at different times, even though fluctuations in their overall news production tend to be highly correlated. Regardless of the mechanism, temporally convergent coverage among low-credibility publishers has troubling implications for American news consumers.
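The related work above notes that while the two publisher types cover different subject matter, fluctuations in their overall news production are highly correlated. That kind of claim is typically backed by a correlation over daily article-count series; a minimal sketch follows, with all counts invented for illustration:

```python
# Illustrative sketch: Pearson correlation between daily article-count series
# for high- and low-credibility publishers. The counts are made up; the paper's
# analysis covers millions of articles over 28 months.
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

high_cred = [120, 150, 90, 200, 170, 95, 110]  # articles/day, high-credibility
low_cred = [40, 55, 30, 80, 60, 35, 42]        # articles/day, low-credibility
r = pearson(high_cred, low_cred)               # close to 1.0 for these toy series
```

A value of `r` near 1 would indicate that overall production volumes rise and fall together, even when the underlying stories differ.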
  2.
    Monetizing websites and web apps through online advertising is widespread in the web ecosystem, creating a billion-dollar market. This has led to the emergence of a vast network of tertiary ad providers and ad syndication to facilitate this growing market. Nowadays, the online advertising ecosystem forces publishers to integrate ads from these third-party domains. On the one hand, this raises several privacy and security concerns that have been actively studied in recent years. On the other hand, the ability of today's browsers to load dynamic web pages with complex animations and JavaScript has also transformed online advertising. This can have a significant impact on webpage performance. The latter is a critical metric for optimization since it ultimately impacts user satisfaction. Unfortunately, the literature offers few studies on the performance impact of online advertising, which we argue is as important as privacy and security. In this paper, we conduct an in-depth, first-of-its-kind performance evaluation of web ads. Unlike prior efforts that rely primarily on adblockers, we perform a fine-grained analysis of the web browser's page-loading process to demystify the performance cost of web ads. We aim to characterize the cost of every component of an ad, so that the publisher, ad syndicate, and advertiser can improve the ad's performance with detailed guidance. For this purpose, we develop a tool, adPerf, for the Chrome browser that classifies page-loading workloads into ad-related and main-content at the granularity of browser activities. Our evaluations show that online advertising accounts for more than 15% of the browser's page-loading workload, and approximately 88% of that is spent on JavaScript. On smartphones, this additional cost of ads is 7% lower since mobile pages include fewer and better-optimized ads. We also track the sources and delivery chain of web ads and analyze performance by the origin of the ad content. We observe that two well-known third-party ad domains contribute 35% of the ads' performance cost and, surprisingly, that top news websites implicitly include unknown third-party ads which in some cases account for more than 37% of the ads' performance cost.
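The abstract describes classifying page-loading work into ad-related and main-content categories. adPerf does this at the granularity of browser activities; a much simpler, hedged approximation is to classify network requests by whether their host matches a known ad-serving domain. The filter list and example URLs below are assumptions for illustration only:

```python
# Simplified sketch: classify page-load requests as ad-related vs. main content
# by matching hostnames against an (assumed, tiny) ad-domain list. Real tools,
# including the paper's adPerf, work at the level of browser activities instead.
from urllib.parse import urlparse

AD_DOMAINS = {"doubleclick.net", "adnxs.example", "adserver.example"}

def is_ad_request(url):
    """True if the URL's host is, or is a subdomain of, a listed ad domain."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in AD_DOMAINS)

requests = [
    "https://news.example/article.html",
    "https://securepubads.doubleclick.net/gampad/ads?x=1",
    "https://cdn.news.example/style.css",
]
ad_share = sum(map(is_ad_request, requests)) / len(requests)  # 1/3 here
```

Request counts are only a crude proxy for performance cost; attributing CPU time (e.g., the JavaScript share reported above) requires instrumenting the browser itself.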
  3. De Cristofaro, Emiliano ; Nakov, Preslav (Ed.)
    Google’s reviewed claims feature was an early attempt to incorporate additional credibility signals from fact-checking onto the search results page. The feature, which appeared when users searched for the name of a subset of news publishers, was criticized by dozens of publishers for its errors and alleged anti-conservative bias. By conducting an audit of news publisher search results and focusing on the critiques of publishers, we find that there is a lack of consensus between fact-checking ecosystem stakeholders that may be important to address in future iterations of public-facing fact-checking tools. In particular, we find that a lack of transparency, coupled with a lack of consensus on what makes a fact-check relevant to a news article, led to the breakdown of reviewed claims.
  4. In an increasingly information-dense web, how do we ensure that we do not fall for unreliable information? To design better web literacy practices for assessing online information, we need to understand how people perceive the credibility of unfamiliar websites under time constraints. Would they be able to rate real news websites as more credible and fake news websites as less credible? We investigated this research question through an experimental study with 42 participants (mean age = 28.3) who were asked to rate the credibility of various “real news” (n = 14) and “fake news” (n = 14) websites under different time conditions (6s, 12s, 20s) and under different advertising treatments (with or without ads). Participants did not visit the websites to make their credibility assessments; instead, they interacted with images of website screen captures, which were modified to remove any mention of website names, to avoid the effect of name recognition. Participants rated the credibility of each website on a scale from 1 to 7 and, in follow-up interviews, provided justifications for their credibility scores. Through hypothesis testing, we find that participants, despite limited time exposure to each website (between 6 and 20 seconds), are quite good at the task of distinguishing between real and fake news websites, with real news websites being rated overall as more credible than fake news websites. Our results agree with the well-known theory of “first impressions” from psychology, which has established the human ability to infer character traits from faces. That is, participants can quickly infer meaningful visual and content cues from a website that help them make the right credibility evaluation.
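The study above tests whether mean credibility ratings differ between real and fake news websites via hypothesis testing. The abstract does not name the specific test used; one distribution-free way to run such a comparison is a permutation test on the difference in mean ratings, sketched below with invented 1-7 ratings:

```python
# Hedged sketch: one-sided permutation test for whether real-news sites receive
# higher mean credibility ratings than fake-news sites. The ratings are
# illustrative; the abstract does not specify which test the authors used.
import random
from statistics import mean

def permutation_test(real, fake, n_iter=10_000, seed=0):
    """Return (observed mean difference, one-sided permutation p-value)."""
    rng = random.Random(seed)
    observed = mean(real) - mean(fake)
    pooled = real + fake
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # random relabeling under the null hypothesis
        diff = mean(pooled[:len(real)]) - mean(pooled[len(real):])
        if diff >= observed:
            count += 1
    return observed, count / n_iter

real_ratings = [6, 5, 6, 7, 5, 6, 4, 6]  # hypothetical 1-7 ratings, real news
fake_ratings = [3, 4, 2, 5, 3, 4, 3, 2]  # hypothetical 1-7 ratings, fake news
obs, p = permutation_test(real_ratings, fake_ratings)
```

A small `p` would indicate that a mean difference as large as `obs` is unlikely under random labeling, i.e., that the two site types really are rated differently.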
  5. This paper studies an inventory management problem faced by an upstream supplier that is in a collaborative agreement, such as vendor-managed inventory (VMI), with a retailer. A VMI partnership provides the supplier an opportunity to manage inventory for the supply chain in exchange for point-of-sales (POS)- and inventory-level information from the retailer. However, retailers typically possess superior local market information and, as has been the case in recent years, are able to capture and analyze customer purchasing behavior beyond the traditional POS data. Such analyses provide the retailer access to market signals that are otherwise hard to capture using POS information. We show and quantify the implication of the financial obligations of each party in VMI that renders communication of such important market signals noncredible. To help institute a sound VMI collaboration, we propose learn and screen, a dynamic inventory mechanism, for the supplier to effectively manage inventory and information in the supply chain. The proposed mechanism combines the ability of the supplier to learn about market conditions from POS data (over multiple selling periods) and dynamically determine when to screen the retailer and acquire his private demand information. Inventory decisions in the proposed mechanism serve a strategic purpose in addition to their classic role of satisfying customer demand. We show that our proposed dynamic mechanism significantly improves the supplier’s expected profit and increases the efficiency of the overall supply chain operations under a VMI agreement. In addition, we determine the market conditions in which a strategic approach to VMI results in significant profit improvements for both firms, particularly when the retailer has high market power (i.e., when the supplier highly depends on the retailer) and when the supplier has relatively less knowledge about the end customer/market compared with the retailer.
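The "learn" step of the mechanism above has the supplier refine its view of market conditions from POS data over multiple selling periods. One standard way to model such learning, chosen here purely for illustration since the abstract does not specify the authors' model, is a normal-normal conjugate Bayesian update of a demand-rate belief:

```python
# Illustrative sketch of the "learn from POS data" idea: a normal-normal
# conjugate update of the supplier's belief about per-period demand. The
# mechanism in the paper also includes a screening step not modeled here,
# and all numbers below are assumptions.

def update_demand_belief(prior_mean, prior_var, obs, obs_var):
    """One-period Bayesian update of a Normal demand belief, given a POS
    observation with known observation variance."""
    precision = 1 / prior_var + 1 / obs_var
    post_var = 1 / precision
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

belief_mean, belief_var = 100.0, 400.0   # prior: demand ~ Normal(100, 400)
for sales in [118, 125, 130]:            # POS observations over selling periods
    belief_mean, belief_var = update_demand_belief(
        belief_mean, belief_var, sales, obs_var=225.0
    )
# Belief mean drifts upward toward the observations; variance shrinks each period.
```

As uncertainty (`belief_var`) shrinks, the supplier can better judge when paying the cost of screening the retailer's private demand information is worthwhile.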