<?xml version="1.0" encoding="UTF-8"?><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/"><records count="1" morepages="false" start="1" end="1"><record rownumber="1"><dc:product_type>Journal Article</dc:product_type><dc:title>Demonstration of accelerating machine learning inference queries with correlative proxy models</dc:title><dc:creator>Yang, Zhihui; Huang, Yicong; Wang, Zuozhi; Gao, Feng; Lu, Yao; Li, Chen; Wang, X. Sean</dc:creator><dc:corporate_author/><dc:editor/><dc:description>We will demonstrate a prototype query-processing engine that utilizes correlations among predicates to accelerate machine learning (ML) inference queries on unstructured data. Expensive operators such as feature extractors and classifiers are deployed as user-defined functions (UDFs), which are opaque to classic query optimization techniques such as predicate push-down. Recent optimization schemes (e.g., Probabilistic Predicates, or PPs) build a cheap proxy model for each predicate offline and inject these proxy models ahead of the expensive ML UDFs, assuming the predicates in a query are independent. Input records that do not satisfy the query predicates are filtered early by the proxy models and thus bypass the ML UDFs. Enforcing the independence assumption, however, may result in sub-optimal plans. We instead use correlative proxy models to better exploit predicate correlations and accelerate ML queries. We will demonstrate our query optimizer, CORE, which builds proxy models online, allocates parameters to each model, and reorders them.
We will also show end-to-end query processing with or without proxy models.</dc:description><dc:publisher/><dc:date>2022-08-01</dc:date><dc:nsf_par_id>10442815</dc:nsf_par_id><dc:journal_name>Proceedings of the VLDB Endowment</dc:journal_name><dc:journal_volume>15</dc:journal_volume><dc:journal_issue>12</dc:journal_issue><dc:page_range_or_elocation>3734 to 3737</dc:page_range_or_elocation><dc:issn>2150-8097</dc:issn><dc:isbn/><dc:doi>https://doi.org/10.14778/3554821.3554887</dc:doi><dcq:identifierAwardId>2107150</dcq:identifierAwardId><dc:subject/><dc:version_number/><dc:location/><dc:rights/><dc:institution/><dc:sponsoring_org>National Science Foundation</dc:sponsoring_org></record></records></rdf:RDF>