<?xml version="1.0" encoding="UTF-8"?><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcq="http://purl.org/dc/terms/"><records count="1" morepages="false" start="1" end="1"><record rownumber="1"><dc:product_type>Journal Article</dc:product_type><dc:title>On Learning Ising Models under Huber's Contamination Model</dc:title><dc:creator>Prasad, Adarsh; Srinivasan, Vishwak; Balakrishnan, Sivaraman; Ravikumar, Pradeep</dc:creator><dc:corporate_author/><dc:editor/><dc:description>We study the problem of learning Ising models in a setting where some of the
samples from the underlying distribution can be arbitrarily corrupted. In such
a setup, we aim to design statistically optimal estimators in a high-dimensional
scaling in which the number of nodes p, the number of edges k, and the maximal
node degree d are allowed to grow to infinity as a function of the sample size n.
Our analysis is based on exploiting moments of the underlying distribution, coupled
with novel reductions to univariate estimation. Our proposed estimators achieve an
optimal, dimension-independent dependence on the fraction of corrupted data in the
contaminated setting, while simultaneously achieving high-probability error
guarantees with optimal sample complexity. We corroborate our theoretical results
by simulations.</dc:description><dc:publisher/><dc:date>2020-01-01</dc:date><dc:nsf_par_id>10222690</dc:nsf_par_id><dc:journal_name>Advances in neural information processing systems</dc:journal_name><dc:journal_volume>33</dc:journal_volume><dc:journal_issue/><dc:page_range_or_elocation/><dc:issn>1049-5258</dc:issn><dc:isbn/><dc:doi>https://doi.org/</dc:doi><dcq:identifierAwardId>1955532</dcq:identifierAwardId><dc:subject/><dc:version_number/><dc:location/><dc:rights/><dc:institution/><dc:sponsoring_org>National Science Foundation</dc:sponsoring_org></record></records></rdf:RDF>