<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/">
  <records count="1" morepages="false" start="1" end="1">
    <record rownumber="1">
      <dc:product_type>Journal Article</dc:product_type>
      <dc:title>Event-based dual photography for transparent scene reconstruction</dc:title>
      <dc:creator>Liu, Xiaomeng; Rego, Joshua D.; Jayasuriya, Suren; Koppal, Sanjeev J.</dc:creator>
      <dc:corporate_author/>
      <dc:editor/>
      <dc:description>Light transport contains all light information between a light source and an image sensor. As an important application of light transport, dual photography has been a popular research topic, but it is challenged by long acquisition times, low signal-to-noise ratios, and the storage and processing of a large number of measurements. In this Letter, we propose a novel hardware setup that combines a flying-spot micro-electromechanical system (MEMS) modulated projector with an event camera to implement dual photography for 3D scanning in both line-of-sight (LoS) and non-line-of-sight (NLoS) scenes containing a transparent object. In particular, we achieved depth extraction from the LoS scenes and 3D reconstruction of the object in an NLoS scene using event light transport.</dc:description>
      <dc:publisher/>
      <dc:date>2023-01-01</dc:date>
      <dc:nsf_par_id>10433523</dc:nsf_par_id>
      <dc:journal_name>Optics Letters</dc:journal_name>
      <dc:journal_volume>48</dc:journal_volume>
      <dc:journal_issue>5</dc:journal_issue>
      <dc:page_range_or_elocation>1304</dc:page_range_or_elocation>
      <dc:issn>0146-9592</dc:issn>
      <dc:isbn/>
      <dc:doi>https://doi.org/10.1364/OL.483047</dc:doi>
      <dcq:identifierAwardId>1909192</dcq:identifierAwardId>
      <dc:subject/>
      <dc:version_number/>
      <dc:location/>
      <dc:rights/>
      <dc:institution/>
      <dc:sponsoring_org>National Science Foundation</dc:sponsoring_org>
    </record>
  </records>
</rdf:RDF>