<?xml version="1.0" encoding="UTF-8"?><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/"><records count="1" morepages="false" start="1" end="1"><record rownumber="1"><dc:product_type>Conference Paper</dc:product_type><dc:title>Rango: Adaptive Retrieval-Augmented Proving for Automated Software Verification</dc:title><dc:creator>Thompson, Kyle; Saavedra, Nuno; Carrott, Pedro; Fisher, Kevin; Sanchez-Stern, Alex; Brun, Yuriy; Ferreira, Joao F; Lerner, Sorin; First, Emily</dc:creator><dc:corporate_author/><dc:editor/><dc:description>Formal verification using proof assistants, such as
Coq, enables the creation of high-quality software. However,
the verification process requires significant expertise and manual
effort to write proofs. Recent work has explored automating
proof synthesis using machine learning and large language
models (LLMs). This work has shown that identifying relevant
premises, such as lemmas and definitions, can aid synthesis. We
present Rango, a fully automated proof synthesis tool for Coq
that automatically identifies relevant premises and also similar
proofs from the current project and uses them during synthesis.
Rango uses retrieval augmentation at every step of the proof to
automatically determine which proofs and premises to include
in the context of its fine-tuned LLM. In this way, Rango adapts
to the project and to the evolving state of the proof. We create
a new dataset, CoqStoq, of 2,226 open-source Coq projects and
196,929 theorems from GitHub, which includes both training data
and a curated evaluation benchmark of well-maintained projects.
On this benchmark, Rango synthesizes proofs for 32.0% of the
theorems, which is 29% more theorems than the prior state-of-the-art
tool Tactician. Our evaluation also shows that Rango's addition of
relevant proofs to its context leads to a 47% increase in the
number of theorems proven.</dc:description><dc:publisher>Proceedings of the IEEE/ACM International Conference on Software Engineering</dc:publisher><dc:date>2025-04-28</dc:date><dc:nsf_par_id>10662952</dc:nsf_par_id><dc:journal_name/><dc:journal_volume/><dc:journal_issue/><dc:page_range_or_elocation/><dc:issn/><dc:isbn/><dc:doi>https://doi.org/</dc:doi><dcq:identifierAwardId>1955457; 2220892</dcq:identifierAwardId><dc:subject/><dc:version_number/><dc:location/><dc:rights/><dc:institution/><dc:sponsoring_org>National Science Foundation</dc:sponsoring_org></record></records></rdf:RDF>