<?xml version="1.0" encoding="UTF-8"?><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/"><records count="1" morepages="false" start="1" end="1"><record rownumber="1"><dc:product_type>Conference Paper</dc:product_type><dc:title>A Fast and Accurate One-Stage Approach to Visual Grounding</dc:title><dc:creator>Yang, Z; Gong, B; Wang, L; Huang, W; Yu, D; Luo, J.</dc:creator><dc:corporate_author/><dc:editor/><dc:description>We propose a simple, fast, and accurate one-stage approach to visual grounding, inspired by the following insight. The performance of existing propose-and-rank two-stage methods is capped by the quality of the region candidates they propose in the first stage: if none of the candidates covers the ground-truth region, there is no hope
in the second stage of ranking the right region to the top. To
avoid this pitfall, we propose a one-stage model that enables end-to-end joint optimization. The main idea is as
straightforward as fusing a text query’s embedding into the
YOLOv3 object detector, augmented by spatial features so
as to account for spatial mentions in the query. Despite its simplicity, this one-stage approach shows great potential
in terms of both accuracy and speed on phrase localization and referring expression comprehension, according
to our experiments. Given these results, along with careful
investigations into some popular region proposals, we advocate a paradigm shift in visual grounding from the conventional two-stage methods to the one-stage framework.</dc:description><dc:publisher/><dc:date>2019-10-01</dc:date><dc:nsf_par_id>10169166</dc:nsf_par_id><dc:journal_name>International Conference on Computer Vision</dc:journal_name><dc:journal_volume/><dc:journal_issue/><dc:page_range_or_elocation/><dc:issn/><dc:isbn/><dc:doi>https://doi.org/10.1109/ICCV.2019.00478</dc:doi><dcq:identifierAwardId>1813709; 1722847</dcq:identifierAwardId><dc:subject/><dc:version_number/><dc:location/><dc:rights/><dc:institution/><dc:sponsoring_org>National Science Foundation</dc:sponsoring_org></record></records></rdf:RDF>