<?xml version="1.0" encoding="UTF-8"?><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/"><records count="1" morepages="false" start="1" end="1"><record rownumber="1"><dc:product_type>Conference Paper</dc:product_type><dc:title>CorrGAN: Simultaneous Learning of Speech Enhancement and Perceptual Quality Loss Functions</dc:title><dc:creator>Zadorozhnyy, Vasily; Amizadeh, Saeed; Ye, Qiang; Koishida, Kazuhito</dc:creator><dc:corporate_author/><dc:editor/><dc:description>Deep-learning models have enabled effective end-to-end systems in the field of Speech Enhancement (SE). Most of these methods are trained using a fixed reconstruction loss in a supervised setting. Often, these losses do not perfectly represent the desired perceptual quality metrics, resulting in sub-optimal performance. Recently, there have been efforts to learn the behavior of those metrics directly via neural networks for training SE models. However, accurately estimating the true metric function introduces statistical complexity into training because it attempts to capture the exact value of the metric. We propose an adversarial training strategy based on statistical correlation that avoids the complexity of estimating the SE metric while learning to mimic its overall behavior. 
We call this framework CorrGAN and show that it significantly improves over the standard losses of state-of-the-art (SOTA) baselines, achieving SOTA performance on the VoiceBank+DEMAND dataset.</dc:description><dc:publisher>IEEE</dc:publisher><dc:date>2025-04-06</dc:date><dc:nsf_par_id>10616198</dc:nsf_par_id><dc:journal_name/><dc:journal_volume/><dc:journal_issue/><dc:page_range_or_elocation>1 to 5</dc:page_range_or_elocation><dc:issn/><dc:isbn>979-8-3503-6874-1</dc:isbn><dc:doi>https://doi.org/10.1109/ICASSP49660.2025.10887633</dc:doi><dcq:identifierAwardId>2208314; 2327113; 2433190</dcq:identifierAwardId><dc:subject/><dc:version_number/><dc:location>Hyderabad, India</dc:location><dc:rights/><dc:institution/><dc:sponsoring_org>National Science Foundation</dc:sponsoring_org></record></records></rdf:RDF>