<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/"
         xmlns:dcq="http://purl.org/dc/terms/">
  <records count="1" morepages="false" start="1" end="1">
    <record rownumber="1">
      <dc:product_type>Conference Paper</dc:product_type>
      <dc:title>Excerpt of Auritus: An Open-Source Optimization Toolkit for Training and Development of Human Movement Models and Filters Using Earables</dc:title>
      <dc:creator>Saha, Swapnil Sayan; Sandha, Sandeep Singh; Pei, Siyou; Jain, Vivek; Wang, Ziqi; Li, Yuchen; Sarker, Ankur; Srivastava, Mani</dc:creator>
      <dc:corporate_author/>
      <dc:editor/>
      <dc:description>Auritus is an extendable, open-source optimization toolkit designed to enhance and replicate earable applications. Auritus serves two primary functions. First, it handles data collection, pre-processing, and labeling for creating customized earable datasets using graphical tools. The system includes an open-source dataset with 2.43 million inertial samples of head and full-body movements, covering 34 head poses and 9 activities from 45 volunteers. Second, Auritus provides a tightly integrated hardware-in-the-loop (HIL) optimizer and TinyML interface for developing lightweight, real-time machine-learning (ML) models for activity detection and filters for head-pose tracking. Auritus recognizes activities with 91% leave-one-out test accuracy (98% test accuracy) using real-time models as small as 6-13 kB. Our models are 98-740× smaller and 3-6% more accurate than the state of the art. We also estimate head pose with absolute errors as low as 5 degrees using 20 kB filters, achieving up to 1.6× precision improvement over existing techniques. Auritus is available at https://github.com/nesl/auritus.</dc:description>
      <dc:publisher/>
      <dc:date>2022-09-11</dc:date>
      <dc:nsf_par_id>10411940</dc:nsf_par_id>
      <dc:journal_name>Excerpt of Auritus: An Open-Source Optimization Toolkit for Training and Development of Human Movement Models and Filters Using Earables</dc:journal_name>
      <dc:journal_volume/>
      <dc:journal_issue/>
      <dc:page_range_or_elocation>252 to 253</dc:page_range_or_elocation>
      <dc:issn/>
      <dc:isbn/>
      <dc:doi>https://doi.org/10.1145/3544793.3563423</dc:doi>
      <dcq:identifierAwardId>1822935</dcq:identifierAwardId>
      <dc:subject/>
      <dc:version_number/>
      <dc:location/>
      <dc:rights/>
      <dc:institution/>
      <dc:sponsoring_org>National Science Foundation</dc:sponsoring_org>
    </record>
  </records>
</rdf:RDF>