<?xml-model href='http://www.tei-c.org/release/xml/tei/custom/schema/relaxng/tei_all.rng' schematypens='http://relaxng.org/ns/structure/1.0'?><TEI xmlns="http://www.tei-c.org/ns/1.0">
	<teiHeader>
		<fileDesc>
			<titleStmt><title level='a'>Hilbert Space Geometry of Quadratic Covariance Bounds, Asilomar Conference on Signals, Systems, and Computers, Oct 30-Nov 1, 2018</title></titleStmt>
			<publicationStmt>
				<publisher></publisher>
				<date>10/30/2017</date>
			</publicationStmt>
			<sourceDesc>
				<bibl> 
					<idno type="par_id">10058158</idno>
					<idno type="doi"></idno>
					<title level='j'>Conference record - Asilomar Conference on Signals, Systems, &amp; Computers</title>
<idno type="issn">1058-6393</idno>
<biblScope unit="volume"></biblScope>
<biblScope unit="issue"></biblScope>					

					<author>W. Moran</author>
					<author>S. D. Howard</author>
				</bibl>
			</sourceDesc>
		</fileDesc>
		<profileDesc>
			<abstract><ab><![CDATA[In this paper, we study the geometry of quadratic covariance bounds on the estimation error covariance, in a properly defined Hilbert space of random variables. We show that a lower bound on the error covariance may be represented by the Grammian of the error score after projection onto the subspace spanned by the measurement scores. The Grammian is defined with respect to inner products in a Hilbert space of second order random variables. This geometric result holds for a large class of quadratic covariance bounds, including the Barankin, Cramér-Rao, and Bhattacharyya bounds, where each bound is characterized by its corresponding measurement scores. When the parameters consist of essential parameters and nuisance parameters, the Cramér-Rao covariance bound is the inverse of the Grammian of essential scores after projection onto the subspace orthogonal to the subspace spanned by the nuisance scores. In two examples, we show that for complex multivariate normal measurements with parameterized mean or covariance, there exist well-known Euclidean space geometries for the general Hilbert space geometry derived in this paper.]]></ab></abstract>
		</profileDesc>
	</teiHeader>
	<text><body xmlns="http://www.tei-c.org/ns/1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xlink="http://www.w3.org/1999/xlink">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>I. INTRODUCTION</head><p>In <ref type="bibr">[1]</ref> the authors showed that the Cram&#233;r Rao bound (CRB) <ref type="bibr">[2]</ref>, <ref type="bibr">[3]</ref> on the variance of an unbiased estimator of a parameter &#952; i in the measurement model y &#8764; N n (x(&#952;), &#963; 2 I), x &#8712; R n , &#952; &#8712; R p , p &#8804; n could be written as</p><p>where g i = &#8706;x(&#952;) &#8706;&#952;i characterizes the sensitivity of the mean to the i th parameter, G = [g 1 , . . . , g p ], G i consists of all columns of G except g i , P &#8869; Gi = I -P Gi , and P Gi is the orthogonal projection onto the subspace G i . The denominator in <ref type="bibr">(1)</ref> is the Euclidean inner product g i , P &#8869; Gi g i , and the geometry is shown in Fig. <ref type="figure">1</ref>. This result raises the question of whether there exists a more general version of the geometry illustrated in Fig. <ref type="figure">1</ref>. In fact, we were motivated to find a similar geometry for the case where &#952; parameterizes the covariance matrix in the multivariate normal model. With this motivation, our ambition in this paper is to illuminate the geometry of the Cram&#233;r-Rao bound in This work is supported in part by NSF under grant CCF-1712788, and by the Air Force Office of Scientific Research under award number FA9550-14-1-0185. Consequently the U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. We start with a two channel linear estimation problem and derive the geometry of a lower bound on the Grammian of the estimation error. Then, we exploit this estimation result to discuss the geometry of quadratic covariance bounds. The quadratic covariance bounds can be derived as bounds on the minimum error covariance when linearly estimating the centered error scores from centered measurement scores. 
Different classes of quadratic covariance bounds are characterized by their associated measurement scores. The conceptual framework is the Hilbert space of second order random variables, but when specialized to the Cram&#233;r-Rao bound in a multivariate normal model, the Hilbert space inner products reduce to Euclidean inner products.</p></div>
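The equivalence in (1) between the inverse-Fisher form of the CRB and the projection form can be checked numerically. A minimal sketch with a synthetic sensitivity matrix G and noise variance σ² (illustrative values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma2 = 8, 3, 0.5
G = rng.standard_normal((n, p))     # columns g_i stand in for dx(theta)/dtheta_i

# CRB via the inverse Fisher matrix: sigma^2 * [(G^T G)^{-1}]_ii
crb = sigma2 * np.diag(np.linalg.inv(G.T @ G))

# CRB via the projection geometry of (1): sigma^2 / <g_i, Pperp_{G_i} g_i>
crb_proj = np.empty(p)
for i in range(p):
    Gi = np.delete(G, i, axis=1)                              # all columns except g_i
    proj = Gi @ np.linalg.solve(Gi.T @ Gi, Gi.T @ G[:, i])    # P_{G_i} g_i
    crb_proj[i] = sigma2 / (G[:, i] @ (G[:, i] - proj))
```

Both routes give the same per-parameter bound; the projection form makes explicit how correlation between g_i and the remaining columns inflates the bound.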
<div xmlns="http://www.tei-c.org/ns/1.0"><head>II. PRELIMINARIES</head><p>Let H be a Hilbert space, with the inner product &#8226;, &#8226; : H &#215; H &#8594; C, and the associated norm &#8226; . For any positive integer q denote the direct sum of q copies of H by H q . H q is a Hilbert space with an appropriate choice of inner product. For any ordered set of vectors u = (u 1 , . . . , u q ) &#8712; H q , the Gram matrix, or Grammian K(u) &#8712; C q&#215;q is defined to have elements</p><p>N o w ,g i v e na n o t h e r v=(v 1 , . . . , v p )&#8712; H p , w ed e fi n et h e c r o s sG r amm a t r i xK( u ; v )b e tw e e n t h e tw os e t sa s t h eq &#215;p m a t r i xw i t he l em e n t s K( u ; v ) i j = u i , v j .</p><p>(</p><p>T h es t a n d a r d i n n e rp r o d u c to nH q i sT r ( K( u ; v ) ) .</p><p>W i t h a n ym a t r i x T&#8712;C m&#215;P w e c a n a s s o c i a t e a l i n e a ro p e r a t o r L T : H p &#8594; H m g i v e nb y L T v= p j = 1 t 1 j v j , p j = 1 t 2 j v j , . . . ,</p><p>w h e r e{ t i j }a r e t h ee l em e n t so fT.</p><p>T h e G r am m a t r i xK( u )a n d t h ec r o s s G r am m a t r i xK( u ; v ) h a v e t h e f o l l o w i n gp r o p e r t i e s :</p><p>1 )K( u ) 0 .</p><p>2 )K( u ; v )= K H ( v ; u ) .</p><p>3 )F o rc 1 , c 2 &#8712;C, K( c 1 u 1 +c 2 u 2 ; v 1 ) =c 1 K( u 1 ; v 1 )+ c 2 K( u 2 ; v 1 ) .</p><p>4 )F o ra r b i t r a r yT 1 , T 2 &#8712;C m&#215;q a n dT 3 &#8712;C ,w eh a v e</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>I I I . T H ETWO-CHANN E LL IN EARE S T IMA T ION</head><p>EX P ER IM EN T C o n s i d e ratw oc h a n n e le s t im a t i o np r o b l em w h e r et h ee l em e n t so f u =(u 1 , . . . ,u q )a r et ob ee s t im a t e df r omt h e e l em e n t so fv =(v 1 , . . . , v p ) .F o rs im p l i c i t y ,l e tK uu = K</p><p>w h e r eI q i s t h e i d e n t i t y m a t r i xo fs i z e q .T h e r e f o r e ,d e fi n i n g</p><p>T h u s ,L KuvK 1 vv vi s t h eb e s t l i n e a re s t im a t o ro f t h ee l em e n t s o fu=(u 1 , . . . ,u q )f r om t h ee l em e n t so fv=(v 1 , . . . , v p ) .</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>T h a t i s ,d e fi n i n g t h ee s t im a t i o ne r r o r&#950;= u-L</head><p>w h e r e t h e l a s t i n e q u a l i t yc om e s f r om t h e f a c t s t h a tK( v ; e )= 0 , a n d (</p><p>vv vi s t h eo r t h o g o n a lp r o j e c t i o no f uo n t o t h e s u b s p a c e s p a n n e db yt h ee l em e n t so fv , w h i c h w e w r i t ea sP v u = L KuvK 1 vv v .F u r t h e rm o r e , K(</p><p>w h e r e t h es e c o n de q u a l i t yc om e sf r omp r o p e r t y4 i nS e c t i o n I I ,a n d</p><p>T h e r e f o r e ,w ec a nd e c om p o s e t h eG r amm i a nK uu a s</p><p>N o w ,c o n s i d e r t h es p e c i a lc a s e w h e r e K u v = [I q 0 q &#215;( p -q ) ] f o rs om e p&gt;q .F r om ( 1 2 )w eh a v e</p><p>w h e r e K -1 v v q q i s t h en o r t hw e s t q &#215;qb l o c ko fK</p><p>T h e r e f o r e , f r om ( 1 3 )a n d ( 1 4 ) ,w eh a v e</p><p>w h e r e P &#8869;</p><p>v 2 i st h eo r t h o g o n a lp r o j e c t i o no fv 1 =(v 1 , . . . , v q )o n t ot h e s u b s p a c es p a n n e db yt h ee l em e n t so fv 2 =(v q + 1 , . . . , v p ) . E q u a t i o n ( 1 5 ) i so u rm a i n r e s u l t .T h e e l em e n t so fK a r eH </p><p>T h ec om p o s i t ec o v a r i a n c e m a t r i x f o r[ e T &#952; ( y ) , s T &#952; ( y ) ] T i s ( </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="8">) w h e r eT( &#952; )= K( s &#952; ( y ) ; e &#952; ( y ) )i s t h e s e n s i t i v i t ym a t r i x ,a n d J ( &#952; )= K( s &#952; ( y ) ) i s t h e i n f o rm a t i o nm a t r i x ,b o t hd e fi n e dw i t h r e s p e c t t o t h e i n n e rp r o d u c t i n ( 1 7 ) .B a s e do n t h e r e s u l t i n</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="9">) i sd e r i v e da l s o i n [ 7 ] ,w h e r e i t i sd em o n s t r a t e d t h a ts c o r e f u n c t i o n s w i t hz e r o m e a n w h i c ha r ef u n c t i o n so fs u f fi c i e n t s t a t i s t i c s f o r t h ep a r am e t e r sp r o v i d e t i g h t e rb o u n d so n t h e e r r o r c o v a r i a n c e m a t r i x t h a ns c o r e s t h a ta r en o tz e r o m e a n ,o ra r e n o t f u n c t i o n so fs u f fi c i e n ts t a t i s t i c s f o r t h ep</head><p>w h i c him p l i e st h a tt h ee l em e n t s o f e &#952; ( y )b e l o n gt ot h e s u b s p a c e s 1 , . . . , s m .</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>V . FI SH ERS COR EANDTH EC RAM&#201;R -R AOB OUND</head><p>A sa s p e c i a le x am p l eo f t h eg e om e t r i c a l i n t e r p r e t a t i o n i n ( 1 9 ) , w ec o n s i d e r t h eC r am&#233; r -R a ob o u n do n t h ee r r o rc o v a r i a n c eo f a nu n b i a s e de s t im a t o ro f t h ep a r am e t e r s&#952;= [&#952; 1 , . . . , &#952; q ] T &#8712; R q .T h eF i s h e rs c o r e i sd e fi n e da s</p><p>w h i c hh a sz e r o m e a n <ref type="bibr">[ 8 ]</ref> .T h u s , t h ec e n t e r e dF i s h e rs c o r e i s s ( y , &#952; )= &#963; &#952; ( y ) ,a n d t h eF i s h e r i n f o rm a t i o nm a t r i x i s J ( &#952; )= E[ s ( y , &#952; ) s H ( y , &#952; ) ] .F r om t h ep r o p e r t i e so f t h e s c o r e f u n c t i o n i n( 2 1 ) ,t h es e n s i t i v i t y m a t r i xi sT( &#952; ) =[ I q 0 q &#215;( p -q ) ] .</p><p>T h e r e f o r e , t h eg e n e r a l r e s u l to f ( 1 9 )s p e c i a l i z e s t o</p><p>w h e r e J -1 ( &#952; ) q q i s t h e q&#215;qn o r t hw e s tb l o c ko f i n v e r s eo f t h eF i s h e r i n f o rm a t i o nm a t r i x J -1 ( &#952; ) .B u t f r om ( 1 5 ) , t h i sm a y b ew r i t t e na s C o v &#952; { &#285; ( y ) } J -1 ( &#952; ) q q = K( P &#8869; s 2 s 1 )</p><p>w h e r e s 1 =(s 1 , . . . , s q ) , s 2 =(s q + 1 , . . . 
, s p )a n dP &#8869; s 2 s 1 = s 1 -P s 2 s 1 ,a n dP s 2 s 1 i st h eo r t h o g o n a lp r o j e c t i o no ft h e e l em e n t so fs 1 o n t o t h es u b s p a c es p a n n e db y t h ee l em e n t so f s 2 .T h em a t r i x K( P &#8869; s 2 s 1 )i s t h eG r amm i a no fe s s e n t i a ls c o r e s s 1 ,a f t e rp r o j e c t i o no n t ot h es u b s p a c es p a n n e db yn u i s a n c e s c o r e s s 2 .T h ee l em e n t so f K a r eH i l b e r t s p a c e i n n e rp r o d u c t s d e fi n e db y ( 1 7 ) .T h eC r am &#233; r -R a ob o u n do n t h ee r r o rv a r i a n c e o fa nu n b i a s e de s t im a t o ro f t h ep a r am e t e r&#952; 1 i s v a r &#952; { &#952; 1 ( y ) }&#8805; J -1 ( &#952; ) 1 1</p><p>where s 2 = (s 2 , . . . , s p ). Importantly, the denominator in ( <ref type="formula">24</ref>) is a Hilbert space inner product defined by (17).</p><p>We demonstrate two examples for which there exists a Euclidean space geometry counterpart for the Hilbert space geometry of the Cram&#233;r-Rao bound.</p><p>Example 1: Complex multivariate normal measurements with parameterized mean.</p><p>Assume the measurement y is a proper random vector distributed as CN n (x(&#952;), C), x &#8712; C n , and &#952; &#8712; R p . Let q i = C -1/2 &#8706;x(&#952;) &#8706;&#952;i and define g i = (q T i , q H i ) T . The (i, j) th element of the the Fisher information matrix is the Euclidean inner product of the g i and g j <ref type="bibr">[1]</ref>, <ref type="bibr">[8]</ref>. That is</p><p>Define G 1 = [g 1 , . . . , g q ], G 2 = [g q+1 , . . . , g p ]. From (23), the Cram&#233;r-Rao bound on the error covariance of an unbiased estimator &#952;(y) of &#952; = [&#952; 1 , . . . , &#952; q ] may be written as</p><p>where</p><p>is the orthogonal projection matrix onto the subspace spanned by the columns of G 2 . 
In this case the Hilbert space inner products of (23) are computed as Euclidean inner products in C^{2n}.</p><p>Example 2: Complex multivariate normal measurements with parameterized covariance. Assume the measurement y is a proper random vector distributed as CN_n(m, R(θ)), m ∈ C^n, and θ ∈ R^p. Let D_i = R^{-1/2}(θ) (∂R(θ)/∂θ_i) R^{-1/2}(θ). The (i, j)-th element of the Fisher information matrix may be written as an inner product of D_i and D_j <ref type="bibr">[8]</ref>. That is</p><p>Define D_1 = (D_1, . . . , D_q), D_2 = (D_{q+1}, . . . , D_p). Again, from (23), the Cramér-Rao bound on the error covariance of an unbiased estimator θ̂(y) of θ = [θ_1, . . . , θ_q] may be written as</p><p>where the orthogonal projection P⊥_{D_2} D_1 and K(P⊥_{D_2} D_1) in (28) are defined with respect to the inner product in (27). Again, the Hilbert space inner products of (23) are replaced by the Euclidean inner products in C^{n×n} defined in (27).</p></div>
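Example 2's geometry can be sketched numerically with synthetic Hermitian matrices standing in for the D_i (hypothetical stand-ins, not derived from a particular R(θ)): the CRB on θ_1 obtained from the inverse Fisher matrix matches the reciprocal of the squared norm of D_1 after projecting out span(D_2, . . . , D_p) under the trace inner product.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 4, 3
# Hermitian stand-ins for D_i = R^{-1/2}(theta) (dR/dtheta_i) R^{-1/2}(theta)
A = rng.standard_normal((p, n, n)) + 1j * rng.standard_normal((p, n, n))
D = A + A.conj().transpose(0, 2, 1)

# Fisher information via the trace inner product: J_ij = tr(D_i D_j^H),
# computed by flattening, since tr(X Y^H) equals vec(X) . conj(vec(Y))
d = D.reshape(p, -1)
J = (d @ d.conj().T).real

crb_inv = np.linalg.inv(J)[0, 0]     # CRB on theta_1 from the inverse Fisher matrix

# same bound from the projection geometry: 1 / ||Pperp_{D_2} D_1||^2
d1, d2 = d[0], d[1:]
coef = np.linalg.solve(d2 @ d2.conj().T, d2.conj() @ d1)   # projection coefficients
r = d1 - d2.T @ coef                 # residual of D_1 after projecting onto span(D_2)
crb_proj = 1.0 / np.vdot(r, r).real
```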
<div xmlns="http://www.tei-c.org/ns/1.0"><head>VI. CONCLUSION</head><p>A general class of quadratic covariance bounds on estimation error covariance may be represented as the Grammian of the error score after projection onto the space orthogonal to the subspace spanned by the measurement scores. This is the Hilbert space picture, as the Grammian is defined with respect to inner products in a Hilbert space of second order random variables. This geometric result may be applied to a large class of quadratic covariance bounds such as Barankin, Cram&#233;r-Rao, and Bhattacharyya bounds, by considering their corresponding measurement scores. In the case of Fisher score, the bound is determined by the inverse of the Grammian of essential scores after projection onto the subspace orthogonal to the subspace spanned by the nuisance scores, a result that clarifies the influence of nuisance parameters on parameter estimation.</p></div></body>
		</text>
</TEI>
