Weak Adaptation Learning: Addressing Cross-domain Data Insufficiency with Weak Annotator
More Like this
-
The weak gravity conjecture holds that in a theory of quantum gravity any gauge force must mediate interactions stronger than gravity for some particles. This statement has surprisingly deep and extensive connections to many different areas of physics and mathematics. Several variations on the basic conjecture have been proposed, including statements that are much stronger but are nonetheless satisfied by all known consistent quantum gravity theories. These related conjectures and the evidence for their validity in the string theory landscape are reviewed. Also reviewed are a variety of arguments for these conjectures, which tend to fall into two categories: qualitative arguments that claim the conjecture is plausible based on general principles, and quantitative arguments for various special cases or analogs of the conjecture. The implications of these conjectures for particle physics, cosmology, general relativity, and mathematics are also outlined. Finally, important directions for future research are highlighted.
-
Weak supervision (WS) frameworks are a popular way to bypass hand-labeling large datasets for training data-hungry models. These approaches synthesize multiple noisy but cheaply acquired estimates of labels into a set of high-quality pseudo-labels for downstream training. However, the synthesis technique is specific to a particular kind of label, such as binary labels or sequences, and each new label type requires manually designing a new synthesis algorithm. Instead, we propose a universal technique that enables weak supervision over any label type while still offering desirable properties, including practical flexibility, computational efficiency, and theoretical guarantees. We apply this technique to important problems previously not tackled by WS frameworks, including learning to rank, regression, and learning in hyperbolic space. Theoretically, our synthesis approach produces consistent estimators for learning some challenging but important generalizations of the exponential family model. Experimentally, we validate our framework and show improvement over baselines in diverse settings, including real-world learning-to-rank and regression problems, along with learning on hyperbolic manifolds.
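The abstract above describes synthesizing several noisy label estimates into one pseudo-label per example. A minimal sketch of that idea, assuming the simplest possible synthesis rule (majority vote over binary labeling functions, with abstentions) rather than the universal technique the abstract proposes; all names below are hypothetical illustrations:

```python
# Minimal sketch of weak-supervision label synthesis via majority vote.
# Real WS frameworks fit a model of each source's accuracy instead of
# treating all sources equally; this only illustrates the pipeline shape.
from collections import Counter

ABSTAIN = -1  # convention: a weak source may decline to vote


def majority_vote(votes):
    """Combine noisy label estimates for one example into a pseudo-label.

    votes: labels proposed by the weak sources; ABSTAIN entries are ignored.
    Returns the most common non-abstain label, or None if every source abstains.
    """
    counted = Counter(v for v in votes if v != ABSTAIN)
    if not counted:
        return None
    return counted.most_common(1)[0][0]


# Each row: one example's votes from three hypothetical weak labelers.
vote_matrix = [
    [1, 1, 0],
    [0, ABSTAIN, 0],
    [ABSTAIN, ABSTAIN, ABSTAIN],
]
pseudo_labels = [majority_vote(row) for row in vote_matrix]
print(pseudo_labels)  # [1, 0, None]
```

The resulting pseudo-labels (with abstain-only examples dropped) would then train a downstream model as if they were ground truth; label types beyond binary classes, such as rankings or real values, are exactly where this naive vote breaks down and a more general synthesis method is needed.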