Title: CWF: Consolidating Weak Features in High-quality Mesh Simplification
In mesh simplification, common requirements like accuracy, triangle quality, and feature alignment are often considered as a trade-off. Existing algorithms concentrate on just one or a few specific aspects of these requirements. For example, the well-known Quadric Error Metrics (QEM) approach [Garland and Heckbert 1997] prioritizes accuracy and can preserve strong feature lines/points as well, but falls short in ensuring high triangle quality and may degrade weak features that are not as distinctive as strong ones. In this paper, we propose a smooth functional that simultaneously considers all of these requirements. The functional comprises a normal anisotropy term and a Centroidal Voronoi Tessellation (CVT) [Du et al. 1999] energy term, with the variables being a set of movable points lying on the surface. The former inherits the spirit of QEM but operates in a continuous setting, while the latter encourages even point distribution, allowing various surface metrics. We further introduce a decaying weight to automatically balance the two terms. We selected 100 CAD models from the ABC dataset [Koch et al. 2019], along with 21 organic models, to compare the existing mesh simplification algorithms with ours. Experimental results reveal an important observation: the introduction of a decaying weight effectively reduces the conflict between the two terms and enables the alignment of weak features. This distinctive feature sets our approach apart from most existing mesh simplification methods and demonstrates significant potential in shape understanding. Please refer to the teaser figure for illustration.
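The abstract describes the functional only at a high level, so the following is a minimal, hypothetical Python sketch of how an objective of this form (a normal-anisotropy term plus a CVT term, blended by a decaying weight) could be evaluated. The point-to-site assignment, the exact form of both terms, the function names (`cvt_energy`, `normal_anisotropy_energy`, `cwf_objective`), and the decay schedule (`lam0`, `decay`) are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def assign_to_nearest(sites, surface_pts):
    """Assign each dense surface sample to its nearest movable point (site)."""
    d = np.linalg.norm(surface_pts[:, None, :] - sites[None, :, :], axis=2)
    return np.argmin(d, axis=1)

def cvt_energy(sites, surface_pts, assign):
    """CVT-style term: squared distance of each sample to its assigned site."""
    return np.sum((surface_pts - sites[assign]) ** 2)

def normal_anisotropy_energy(sites, surface_pts, normals, assign):
    """QEM-flavored term in a continuous spirit: penalize the site-to-sample
    offset measured along each sample's normal (a point-to-plane distance)."""
    offsets = surface_pts - sites[assign]
    return np.sum(np.einsum('ij,ij->i', offsets, normals) ** 2)

def cwf_objective(sites, surface_pts, normals, it, lam0=1.0, decay=0.9):
    """Combined objective with a weight that decays over outer iterations."""
    assign = assign_to_nearest(sites, surface_pts)
    lam = lam0 * decay ** it  # illustrative decay schedule, not the paper's
    return (lam * normal_anisotropy_energy(sites, surface_pts, normals, assign)
            + cvt_energy(sites, surface_pts, assign))
```

Which term the decaying weight scales, and how the decay is driven, follow the paper; the schedule above is only a placeholder to show how the balance shifts between the two terms over iterations.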
Award ID(s):
2007661
PAR ID:
10558847
Author(s) / Creator(s):
Publisher / Repository:
Association for Computing Machinery
Date Published:
Journal Name:
ACM Transactions on Graphics
Volume:
43
Issue:
4
ISSN:
0730-0301
Page Range / eLocation ID:
1 to 14
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. There are numerous large-scale applications requiring mesh adaptivity, e.g., computational fluid dynamics and weather prediction. Parallel processing is needed for simulations involving large-scale adaptive meshes. In this paper, we propose a parallel variational mesh quality improvement algorithm for use with distributed memory machines. Our method parallelizes the serial variational mesh quality improvement method by Huang and Kamenski. Their approach uses the Moving Mesh PDE method to adapt the mesh by minimizing an energy functional for mesh equidistribution and alignment. This leads to a system of ordinary differential equations (ODEs) that determines where to move the interior mesh nodes. An efficient solution is obtained by solving the ODEs on subregions of the mesh with overlapped communication and computation. Strong and weak scaling experiments on up to 128 cores for meshes with up to 160M elements demonstrate excellent results.
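As a rough illustration of the serial building block being parallelized, here is a hypothetical Python sketch of explicit time integration of an ODE system dx/dt = -grad E(x) over the interior mesh nodes; the energy gradient, the time step, and the handling of subregions (with halo exchange overlapped with computation in the parallel version) are stand-ins, not the authors' code.

```python
import numpy as np

def integrate_interior_nodes(coords, interior_ids, grad_energy, dt=1e-3, steps=50):
    """Explicit Euler steps of dx/dt = -grad E(x), moving only interior nodes.

    In the parallel method, each rank would run this on its own mesh subregion
    and exchange updated positions of shared/ghost nodes, overlapping that
    communication with the local computation.
    """
    x = coords.copy()
    for _ in range(steps):
        g = grad_energy(x)                        # gradient of the mesh energy functional
        x[interior_ids] -= dt * g[interior_ids]   # boundary nodes stay fixed
    return x
```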
  2. A conditional sampling oracle for a probability distribution D returns samples from the conditional distribution of D restricted to a specified subset of the domain. A recent line of work (Chakraborty et al. 2013 and Canonne et al. 2014) has shown that, with access to such a conditional sampling oracle, only a polylogarithmic or even constant number of samples is needed to solve distribution testing problems like identity and uniformity. This significantly improves over the standard sampling model, where polynomially many samples are necessary. Inspired by these results, we introduce a computational model based on conditional sampling to develop sublinear algorithms with exponentially faster runtimes compared to standard sublinear algorithms. We focus on geometric optimization problems over points in high-dimensional Euclidean space. Access to these points is provided via a conditional sampling oracle that takes as input a succinct representation of a subset of the domain and outputs a uniformly random point in that subset. We study two well-studied problems: k-means clustering and estimating the weight of the minimum spanning tree. In contrast to prior algorithms for the classic model, our algorithms have time, space, and sample complexity that is polynomial in the dimension and polylogarithmic in the number of points. Finally, we comment on the applicability of the model and compare with existing ones like streaming, parallel, and distributed computational models.
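For concreteness, here is a hypothetical Python sketch of what such a conditional sampling oracle could look like for a finite point set, with the "succinct subset representation" taken to be an axis-aligned box; the interface and names are assumptions for illustration, not the paper's model definition.

```python
import numpy as np

def conditional_sample(points, box_lo, box_hi, rng=None):
    """Return a uniformly random point of `points` lying in the box [box_lo, box_hi]."""
    rng = np.random.default_rng() if rng is None else rng
    inside = np.all((points >= box_lo) & (points <= box_hi), axis=1)
    idx = np.flatnonzero(inside)
    if idx.size == 0:
        return None                    # oracle reports an empty restriction
    return points[rng.choice(idx)]     # uniform over the restricted subset
```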
  3. Rothblum, Guy N (Ed.)
    We study the problem of auditing classifiers for statistical subgroup fairness. Kearns et al. [Kearns et al., 2018] showed that the problem of auditing combinatorial subgroup fairness is as hard as agnostic learning. Essentially all work on remedying statistical measures of discrimination against subgroups assumes access to an oracle for this problem, despite the fact that no efficient algorithms are known for it. If we assume the data distribution is Gaussian, or even merely log-concave, then a recent line of work has discovered efficient agnostic learning algorithms for halfspaces. Unfortunately, the reduction of Kearns et al. was formulated in terms of weak, "distribution-free" learning, and thus did not establish a connection for families such as log-concave distributions. In this work, we give positive and negative results on auditing for Gaussian distributions: On the positive side, we present an alternative approach to leverage these advances in agnostic learning and thereby obtain the first polynomial-time approximation scheme (PTAS) for auditing nontrivial combinatorial subgroup fairness: we show how to audit statistical notions of fairness over homogeneous halfspace subgroups when the features are Gaussian. On the negative side, we find that under cryptographic assumptions, no polynomial-time algorithm can guarantee any nontrivial auditing, even under Gaussian feature distributions, for general halfspace subgroups.
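To make the auditing task concrete, the following hypothetical Python sketch evaluates one statistical-parity-style quantity that an auditor might compute for a single homogeneous halfspace subgroup {x : <w, x> >= 0}; the specific fairness statistic audited in the paper is not reproduced here, so the formula below is an illustrative assumption.

```python
import numpy as np

def halfspace_subgroup_violation(X, preds, w):
    """Mass-weighted gap between the subgroup's positive rate and the overall positive rate."""
    in_group = X @ w >= 0            # homogeneous halfspace subgroup
    if not in_group.any():
        return 0.0
    gap = preds[in_group].mean() - preds.mean()
    return in_group.mean() * abs(gap)
```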
  4. Root-root interactions alter the architectural organization of individual root systems and, therefore, affect nutrient foraging (O’Brien et al., 2005). Past reports have shown both detrimental and beneficial effects on crop yield as crops avoid or prefer belowground competition (Li et al., 2006; O’Brien et al., 2005). With little research done on root-root interactions, there is still much to discover about the root phenotypes and functions that arise from them. Quantifying architectural traits of root system interactions would provide insight for researchers into the benefit of a cooperation vs. competition trade-off belowground. We have begun to develop a soil-filled mesocosm system to perform a series of preliminary studies using 3D imaging to develop metrics of root-root interaction in common beans (Phaseolus vulgaris). Common beans have a relatively fast-growing and sparse adventitious and basal root system, making them a suitable organism for this imaging study. Our second revision of the mesocosm focused on improving and fine-tuning a mesh system that provides better support for the root architecture during the soil removal process. We use a lightweight, low-visibility plastic mesh, originally used as bird netting, to allow image capture from all sides. Traits that we aim to extract include root growth angle, rooting depth, and root volume relative to neighbors, because these spatial qualities determine the soil areas in which the root system will be foraging. Our data will allow for the quantification and association of root plasticity in the presence of belowground competition.
  5. The unsigned p-Willmore functional introduced in the work of Mondino [2011] generalizes important geometric functionals, which measure the area and Willmore energy of immersed surfaces. Presently, techniques from the work of Dziuk [2008] are adapted to compute the first variation of this functional as a weak-form system of equations, which are subsequently used to develop a model for the p-Willmore flow of closed surfaces in R^3. This model is amenable to constraints on surface area and enclosed volume and is shown to decrease the p-Willmore energy monotonically. In addition, a penalty-based regularization procedure is formulated to prevent artificial mesh degeneration along the flow; inspired by a conformality condition derived in the work of Kamberov et al. [1996], this procedure encourages angle preservation in a closed and oriented surface immersion as it evolves. Following this, a finite-element discretization of both procedures is discussed, an algorithm for running the flow is given, and an application to mesh editing is presented.
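As a small illustration of the quantity being minimized, the following hypothetical Python sketch evaluates a discrete p-Willmore-type energy on a triangle mesh, assuming per-vertex mean curvature values and vertex areas have already been computed (e.g., from a cotangent-Laplacian discretization); it is not the authors' finite-element formulation.

```python
import numpy as np

def p_willmore_energy(mean_curvature, vertex_areas, p=2):
    """Discrete p-Willmore-type energy: sum_i |H_i|^p * A_i over mesh vertices."""
    return np.sum(np.abs(mean_curvature) ** p * vertex_areas)
```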