
This content will become publicly available on April 25, 2023

Title: Representing Mixtures of Word Embeddings with Mixtures of Topic Embeddings
Journal Name: International Conference on Learning Representations
Sponsoring Org: National Science Foundation
More Like this
  1. Cross-lingual language tasks typically require a substantial amount of annotated data or parallel translation data. We explore whether language representations that capture relationships among languages can be learned and subsequently leveraged in cross-lingual tasks without the use of parallel data. We generate dense embeddings for 29 languages using a denoising autoencoder, and evaluate the embeddings using the World Atlas of Language Structures (WALS) and two extrinsic tasks in a zero-shot setting: cross-lingual dependency parsing and cross-lingual natural language inference.
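The denoising-autoencoder approach described above can be illustrated with a minimal sketch: corrupt each input vector with noise, and train an encoder/decoder pair to reconstruct the clean input, so the hidden layer yields a dense embedding per language. The toy feature matrix, layer sizes, and Gaussian corruption here are illustrative assumptions, not the architecture or data of the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "language feature" matrix: one row per language (29 languages,
# 16 features). Purely illustrative stand-in for real typological data.
X = rng.normal(size=(29, 16))

n_in, n_hidden = X.shape[1], 4
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))  # encoder weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_in))  # decoder weights
b2 = np.zeros(n_in)

lr = 0.01
for step in range(500):
    # Denoising: corrupt the input, but reconstruct the *clean* input.
    X_noisy = X + rng.normal(scale=0.3, size=X.shape)
    H = np.tanh(X_noisy @ W1 + b1)   # dense embedding (encoder)
    X_hat = H @ W2 + b2              # reconstruction (decoder)
    err = X_hat - X
    # Backpropagate the mean-squared reconstruction error.
    dX_hat = 2 * err / err.size
    dW2, db2 = H.T @ dX_hat, dX_hat.sum(0)
    dH = (dX_hat @ W2.T) * (1 - H ** 2)   # tanh derivative
    dW1, db1 = X_noisy.T @ dH, dH.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# After training, encode the clean inputs: one dense embedding per language.
embeddings = np.tanh(X @ W1 + b1)
print(embeddings.shape)  # (29, 4)
```

The embeddings could then be compared against WALS features or fed to a downstream cross-lingual model, as the abstract describes for its zero-shot evaluations.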