Low-Latency In-Band Integration of Multiple Low-Power Wide-Area Networks
- Award ID(s): 2006467
- Publication Date:
- NSF-PAR ID: 10296347
- Journal Name: 2021 IEEE 27th Real-Time and Embedded Technology and Applications Symposium (RTAS)
- Page Range or eLocation-ID: 333–346
- Sponsoring Org: National Science Foundation
More Like this
This paper describes a systematic study of an approach to low-resource Farsi-Spanish Neural Machine Translation (NMT) that leverages monolingual data for joint learning of forward and backward translation models. As is standard for NMT systems, training begins with two pre-trained translation models that are iteratively updated to reduce translation cost. In each iteration, each translation model translates monolingual texts from one language into the other, generating a synthetic parallel dataset for the opposite translation model. Two new translation models are then learned from the bilingual data together with the synthetic texts. The key feature distinguishing this approach from standard NMT is the iterative learning process, which improves both translation models simultaneously and produces a higher-quality synthetic training dataset at each iteration. The empirical results demonstrate that this approach outperforms the baselines.
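
To make the training loop concrete, below is a minimal Python sketch of iterative back-translation as described in the abstract, under stated assumptions: `TranslationModel`, its `translate`/`train` methods, and all dataset variables are hypothetical placeholders standing in for a real NMT toolkit, not an interface from the paper.

```python
# Hedged sketch of the iterative joint back-translation loop described above.
# TranslationModel and its methods are hypothetical stubs, not the paper's API.

from typing import List, Tuple

class TranslationModel:
    """Stand-in for a pre-trained NMT model (e.g., Farsi -> Spanish)."""
    def __init__(self, name: str):
        self.name = name

    def translate(self, sentences: List[str]) -> List[str]:
        # A real system would run beam-search decoding; this stub echoes input.
        return [f"<{self.name}:{s}>" for s in sentences]

    def train(self, pairs: List[Tuple[str, str]]) -> None:
        # A real system would update parameters to reduce translation cost.
        pass

def iterative_back_translation(
    fwd: TranslationModel,            # Farsi -> Spanish
    bwd: TranslationModel,            # Spanish -> Farsi
    bitext: List[Tuple[str, str]],    # (farsi, spanish) bilingual pairs
    mono_fa: List[str],               # monolingual Farsi
    mono_es: List[str],               # monolingual Spanish
    iterations: int = 3,
) -> Tuple[TranslationModel, TranslationModel]:
    for _ in range(iterations):
        # Each model translates monolingual text to build synthetic
        # parallel data for the *other* direction.
        synth_for_bwd = list(zip(fwd.translate(mono_fa), mono_fa))  # (es', fa)
        synth_for_fwd = list(zip(bwd.translate(mono_es), mono_es))  # (fa', es)
        # Retrain both models on the bilingual data plus the synthetic texts.
        fwd.train(bitext + synth_for_fwd)
        bwd.train([(es, fa) for fa, es in bitext] + synth_for_bwd)
    return fwd, bwd
```

Each pass regenerates the synthetic corpora with the freshly updated models, which is the mechanism by which synthetic data quality can improve from one iteration to the next.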