Memory and truth: correcting errors with true feedback versus overwriting correct answers with errors
- Award ID(s):
- 1824193
- PAR ID:
- 10309371
- Date Published:
- Journal Name:
- Cognitive Research: Principles and Implications
- Volume:
- 4
- Issue:
- 1
- ISSN:
- 2365-7464
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
In the Correlation Clustering problem, we are given a weighted graph $$G$$ with its edges labelled as "similar" or "dissimilar" by a binary classifier. The goal is to produce a clustering that minimizes the weight of "disagreements": the sum of the weights of "similar" edges across clusters and "dissimilar" edges within clusters. We study the correlation clustering problem under the following assumption: every "similar" edge $$e$$ has weight $$w_e \in [\alpha w, w]$$ and every "dissimilar" edge $$e$$ has weight $$w_e \geq \alpha w$$ (where $$\alpha \leq 1$$ and $$w > 0$$ is a scaling parameter). We give a $$(3 + 2 \log_e (1/\alpha))$$ approximation algorithm for this problem. This assumption captures well the scenario when classification errors are asymmetric. Additionally, we show an asymptotically matching Linear Programming integrality gap of $$\Omega(\log 1/\alpha)$$.
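The "disagreement" objective described above can be sketched directly. This is a minimal illustration of the cost function only, not the paper's approximation algorithm; the edge list and clustering below are made-up examples.

```python
def disagreement_cost(edges, cluster_of):
    """Weight of disagreements for a candidate clustering.

    edges: list of (u, v, weight, label) with label in {"similar", "dissimilar"}.
    cluster_of: dict mapping each vertex to a cluster id.
    """
    cost = 0.0
    for u, v, w, label in edges:
        same_cluster = cluster_of[u] == cluster_of[v]
        # "Similar" edges across clusters and "dissimilar" edges
        # within clusters both count as disagreements.
        if label == "similar" and not same_cluster:
            cost += w
        elif label == "dissimilar" and same_cluster:
            cost += w
    return cost

# Illustrative instance: a-c is "similar" but the clustering splits it.
edges = [
    ("a", "b", 1.0, "similar"),
    ("b", "c", 0.5, "dissimilar"),
    ("a", "c", 1.0, "similar"),
]
clustering = {"a": 0, "b": 0, "c": 1}
print(disagreement_cost(edges, clustering))  # → 1.0
```

Under the paper's assumption, every weight in such an instance would additionally satisfy the $$[\alpha w, w]$$ (similar) and $$\geq \alpha w$$ (dissimilar) bounds.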
-
The present paper studies density deconvolution in the presence of small Berkson errors, in particular, when the variances of the errors tend to zero as the sample size grows. It is known that when Berkson errors are present, in some cases, the unknown density estimator can be obtained by simple averaging without using kernels. However, this may not be the case when Berkson errors are asymptotically small. By treating the former case as a kernel estimator with zero bandwidth, we obtain the optimal expressions for the bandwidth. We show that the density of Berkson errors acts as a regularizer, so that the kernel estimator is unnecessary when the variance of Berkson errors lies above some threshold that depends on the shapes of the densities in the model and the number of observations.
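The "simple averaging without using kernels" idea can be sketched as follows, assuming a Berkson model X = W + U with observed W and Gaussian error U: the target density is the average of the error density shifted to each observation, so the error density itself plays the role of the kernel. This is a generic illustration of that averaging estimator, not the paper's small-variance analysis; the Gaussian error choice is an assumption for the example.

```python
import math

def gaussian_pdf(x, sigma):
    """Density of a N(0, sigma^2) Berkson error at x."""
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def averaged_density(x, observations, sigma):
    # Berkson model X = W + U, U ~ N(0, sigma^2):
    # f_X(x) = E[f_U(x - W)] is estimated by averaging f_U(x - W_i)
    # over the observed W_i -- no separate kernel or bandwidth needed.
    return sum(gaussian_pdf(x - w, sigma) for w in observations) / len(observations)
```

When the error variance sigma^2 shrinks with the sample size, this averaging loses its smoothing effect, which is the regime where the abstract's bandwidth analysis applies.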
-
Morin, P; Suri, S (Ed.) We provide several algorithms for sorting an array of n comparable distinct elements subject to probabilistic comparison errors in external memory. In this model, which has been extensively studied in internal-memory settings, the comparison of two elements returns the wrong answer according to a fixed probability, p < 1/2, and otherwise returns the correct answer. The dislocation of an element is the distance between its position in a given (current or output) array and its position in a sorted array. There are various existing algorithms that can be utilized for sorting or near-sorting elements subject to probabilistic comparison errors, but these algorithms do not translate into efficient external-memory algorithms, because they all make heavy use of noisy binary searching. In this paper, we provide new efficient methods in the external-memory model for sorting with comparison errors. Our algorithms achieve an optimal number of I/Os, in both cache-aware and cache-oblivious settings.
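The comparison model and the dislocation measure from this abstract can be sketched as below. The majority-vote comparator is a standard error-reduction trick included for illustration only; it is not the I/O-efficient algorithm the paper develops, and all names here are hypothetical.

```python
import random

def noisy_less(a, b, p, rng):
    """Return whether a < b, flipping the answer with probability p < 1/2."""
    truth = a < b
    return (not truth) if rng.random() < p else truth

def majority_less(a, b, p, rng, trials=31):
    """Reduce the comparator's error rate by repeating it and taking
    a majority vote (a standard trick, not the paper's method)."""
    votes = sum(noisy_less(a, b, p, rng) for _ in range(trials))
    return votes > trials // 2

def max_dislocation(arr):
    """Largest distance of any element from its position in the sorted array."""
    pos = {v: i for i, v in enumerate(sorted(arr))}
    return max(abs(i - pos[v]) for i, v in enumerate(arr))

rng = random.Random(0)
print(max_dislocation([2, 1, 3, 4]))        # → 1
print(majority_less(1, 2, 0.0, rng))        # with p = 0 the comparator is exact → True
```

In external memory, the difficulty the abstract points to is that repeating comparisons via noisy binary search scatters accesses across the array, costing many I/Os; the paper's contribution is avoiding that.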