


Award ID contains: 1702555


  1. The estimation of a meaningful affinity graph has become a crucial task for data representation, since the underlying structure is not readily available in many applications. In this paper, a topology inference framework, called Bayesian Topology Learning, is proposed to estimate the underlying graph topology from a given set of noisy measurements of signals. It is assumed that the graph signals are generated from Gaussian Markov Random Field processes. First, using a factor analysis model, the noisy measured data is represented in a latent space and its posterior probability density function is found. Thereafter, by utilizing the minimum mean square error estimator and the Expectation Maximization (EM) procedure, a filter is proposed to recover the signal from noisy measurements, and an optimization problem is formulated to estimate the underlying graph topology. The experimental results show that the proposed method outperforms the current state-of-the-art algorithms under different performance measures.
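The premise behind this approach can be sketched numerically: for signals drawn from a Gaussian Markov Random Field, the support of the precision matrix encodes the graph, so even a naive regularized inverse of the sample covariance hints at the topology. The sketch below is an illustrative baseline only, not the paper's EM/MMSE method; the 4-node chain graph, sample count, and shrinkage value are assumptions.

```python
import numpy as np

def estimate_precision(X, shrinkage=0.1):
    """Naive GMRF topology hint: regularized inverse of the sample covariance.

    X is (n_samples, n_nodes); off-diagonal magnitudes of the returned
    precision matrix suggest candidate edges.
    """
    p = X.shape[1]
    S = np.cov(X, rowvar=False) + shrinkage * np.eye(p)  # keep S invertible
    return np.linalg.inv(S)

rng = np.random.default_rng(0)
# Hypothetical ground truth: a 4-node chain graph as a precision matrix
Theta = np.eye(4) + 0.4 * (np.eye(4, k=1) + np.eye(4, k=-1))
X = rng.multivariate_normal(np.zeros(4), np.linalg.inv(Theta), size=5000)
Theta_hat = estimate_precision(X)
# Chain edges such as (0, 1) should dominate the non-edge (0, 3)
print(np.round(Theta_hat, 2))
```

In practice one would add a sparsity penalty and, as the abstract describes, model the measurement noise explicitly; this baseline only illustrates the covariance-to-topology link.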
  2. One of the difficulties of implementing and analyzing algorithms that achieve information-theoretic limits is adapting asymptotic results to the finite block-length regime. Results on secrecy for both regimes use Shannon entropy and mutual information as security metrics. In this paper, we show that Shannon entropy does not necessarily have equal utility for wireless authentication in finite block-length regimes, with a focus on the fingerprint embedding framework. We then apply a new security performance metric to the framework that is linked to min-entropy rather than Shannon entropy and is similar to the cheating probability used in the literature. The metric is based on an adversary's ability to correctly guess the secret key over many observations using maximum-likelihood decoding. We demonstrate the effect that system parameters such as the lengths of the key and the identification tag have on an adversary's chance of a successful attack. We find that, given a large key, it is better to use it all at once than to use part of it and renew the key with the remaining bits after a certain number of transmissions.
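The distinction between the two entropies can be made concrete: min-entropy depends only on the adversary's single best guess, so a distribution can carry high Shannon entropy while still being guessed correctly half the time. The key distributions below are hypothetical examples, not taken from the paper.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def min_entropy(p):
    """Min-entropy in bits: determined solely by the best single guess."""
    return -np.log2(np.max(p))

# Two key distributions: similar-looking "randomness" by the Shannon
# measure, very different single-guess vulnerability
uniform = np.full(8, 1 / 8)                    # 3 bits under either metric
skewed = np.array([0.5] + [0.5 / 127] * 127)   # one key value is likely

for name, p in [("uniform", uniform), ("skewed", skewed)]:
    print(name, shannon_entropy(p), min_entropy(p), "P_guess =", np.max(p))
```

For the skewed distribution, Shannon entropy is about 4.5 bits, yet the adversary's guessing probability is 0.5, which is the kind of gap that motivates a min-entropy-based metric.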
  3. We apply artificial noise to the fingerprint embedding authentication framework to improve information-theoretic authentication for the MISO channel. Instead of optimizing for secrecy capacity, we examine the trade-off between message rate, authentication, and key security. In this case, key security aims to limit an adversary’s ability to obtain the key using a maximum likelihood decoder. 
  4. The rate regions of many variations of the standard point-to-point and wiretap channels have been thoroughly explored. Secrecy capacity characterizes the rate loss required to ensure that the adversary gains no information about the transmissions. Authentication, despite being an important counterpart to secrecy, has no standard metric. While some results have taken an information-theoretic approach to authentication coding, the full rate region and its accompanying trade-offs have yet to be characterized. In this paper, we provide an inner bound on the rates achievable under an average authentication and reliability constraint. The bound is established by combining and analyzing two existing authentication schemes, for both noisy and noiseless channels. We find that our coding scheme improves upon existing schemes.
  5. In Internet of Things (IoT) applications requiring parameter estimation, sensors often transmit quantized observations to a fusion center over a wireless medium, where the observations are susceptible to unauthorized eavesdropping. The fusion center uses the received data to estimate the desired parameters. To secure such networks, several low-complexity encryption approaches have been proposed. In this paper, we generalize those approaches and analyze their estimation and secrecy capabilities. We show that the dimension of the unknown parameter that can be efficiently estimated with an unbiased estimator under these approaches is upper bounded. Assuming an unauthorized eavesdropper who knows the low-complexity encryption process but not the encryption key, we show that successful eavesdropping with unbiased estimators and independent observations is impossible for these approaches, even with a large number of observations. Numerical results validating our analysis are presented.
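The flavor of such schemes can be illustrated with a toy simulation: the legitimate fusion center, which holds the key, recovers the parameter, while an eavesdropper who knows the scheme but not the key learns nothing. The 1-bit quantizer, XOR key stream, standard Gaussian noise model, and parameter value below are assumptions for illustration, not the exact scheme analyzed in the paper.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
theta = 0.7                     # unknown parameter (hypothetical value)
N = 20000                       # number of sensor observations
bits = (theta + rng.normal(0.0, 1.0, N) > 0).astype(int)  # 1-bit quantizer

key = rng.integers(0, 2, N)     # shared binary key stream (assumed scheme)
tx = bits ^ key                 # "encrypted" transmitted bits

# Fusion center knows the key: undo it, then invert the quantizer,
# since P(bit = 1) = Phi(theta) for standard Gaussian noise.
theta_fc = NormalDist().inv_cdf(float(np.mean(tx ^ key)))

# Eavesdropper without the key sees bits XORed with fair coin flips,
# so P(tx = 1) = 1/2 regardless of theta: the estimate is uninformative.
theta_eve = NormalDist().inv_cdf(float(np.mean(tx)))
print(theta_fc, theta_eve)
```

The fusion center's estimate lands near 0.7, while the eavesdropper's estimate hovers near 0 no matter what theta is, mirroring the impossibility result described in the abstract.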