<?xml version="1.0" encoding="UTF-8"?><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/"><records count="1" morepages="false" start="1" end="1"><record rownumber="1"><dc:product_type>Conference Paper</dc:product_type><dc:title>Multi-source Domain Adaptation for Semantic Segmentation</dc:title><dc:creator>Zhao, S.; Li, B.; Yue, X.; Gu, Y.; Xu, P.; Hu, R.; Chai, H.; Keutzer, K.</dc:creator><dc:corporate_author/><dc:editor/><dc:description>Simulation-to-real domain adaptation for semantic segmentation has been actively
studied for various applications such as autonomous driving. Existing methods
mainly focus on a single-source setting, which cannot easily handle the more practical
scenario of multiple sources with different distributions. In this paper, we propose
to investigate multi-source domain adaptation for semantic segmentation. Specifically,
we design a novel framework, termed Multi-source Adversarial Domain
Aggregation Network (MADAN), which can be trained in an end-to-end manner.
First, we generate an adapted domain for each source with dynamic semantic
consistency while aligning at the pixel-level cycle-consistently towards the target.
Second, we propose a sub-domain aggregation discriminator and a cross-domain cycle
discriminator to aggregate the different adapted domains more closely. Finally,
feature-level alignment is performed between the aggregated domain and target
domain while training the segmentation network. Extensive experiments adapting from
the synthetic GTA and SYNTHIA datasets to the real Cityscapes and BDDS datasets
demonstrate that the proposed MADAN model outperforms state-of-the-art approaches. Our
source code is released at: https://github.com/Luodian/MADAN.</dc:description><dc:publisher/><dc:date>2019-10-15</dc:date><dc:nsf_par_id>10197948</dc:nsf_par_id><dc:journal_name>Advances in neural information processing systems</dc:journal_name><dc:journal_volume/><dc:journal_issue/><dc:page_range_or_elocation>7287--7300</dc:page_range_or_elocation><dc:issn>1049-5258</dc:issn><dc:isbn/><dc:doi>https://doi.org/</dc:doi><dcq:identifierAwardId>1645964</dcq:identifierAwardId><dc:subject/><dc:version_number/><dc:location/><dc:rights/><dc:institution/><dc:sponsoring_org>National Science Foundation</dc:sponsoring_org></record></records></rdf:RDF>