www.elsevier.com/locate/laa
Abstract
We consider the problem of estimating a function of the multivariate mean, where to a good
approximation this function can be written as the sum of a linear and a quadratic form. For
a natural class of estimators we obtain the mean and variance. Since normality is not assumed,
the variance turns out to be a rather involved quantity, depending on third and fourth moment
matrices.
© 2002 Elsevier Science Inc. All rights reserved.
AMS classification: 15A63; 62H12
Keywords: Multivariate analysis; Nonlinear function of the mean; Estimation; Third and fourth moment
matrices
1. Introduction
H. Neudecker, G. Trenkler / Linear Algebra and its Applications 354 (2002) 223–229
with some suitably chosen nonstochastic quantities f_0, f and F. The vector vec X is
obtained from the matrix X by stacking its columns one underneath the other, i.e.

vec X = (x_1′, . . . , x_n′)′.
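As a quick illustration (our addition, not part of the original paper): numpy flattens arrays row-major by default, so the column-stacking vec operator corresponds to order="F":

```python
import numpy as np

# vec X stacks the columns of X one underneath the other.
X = np.array([[1, 2, 3],
              [4, 5, 6]])        # p = 2, n = 3; columns x_1, x_2, x_3

vec_X = X.flatten(order="F")     # column-major flattening = vec X
print(vec_X)                     # [1 4 2 5 3 6]
```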
Since E(vec X) = 1_n ⊗ μ and Cov(vec X) = I_n ⊗ Σ, where 1_n is the n × 1 vector
of ones, we immediately obtain the mean E(T) (cf. [4, p. 13]). However, when calculating
Var(T) we need a relationship between the third and fourth moment matrices of
the x_i and of vec X, respectively. This relationship will be established in the following
section.
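The mean of such a linear-plus-quadratic statistic follows from the standard identity E(x′Fx) = (Ex)′F(Ex) + tr(F Cov x), which holds without any normality assumption. The sketch below (our addition; the discrete distribution and the quantities f_0, f, F are arbitrary choices) verifies it exactly, with no simulation error, by summing over a finite support:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small discrete distribution for x (4 support points in R^3, exact expectations),
# illustrating E(T) = f0 + f'E(x) + E(x)'F E(x) + tr(F Cov(x)) for T = f0 + f'x + x'F x.
points = rng.normal(size=(4, 3))            # support points (rows)
probs  = np.array([0.1, 0.2, 0.3, 0.4])     # their probabilities

mu    = probs @ points                                       # E(x)
Sigma = (points - mu).T @ (probs[:, None] * (points - mu))   # Cov(x)

f0 = 1.5
f  = rng.normal(size=3)
F  = rng.normal(size=(3, 3))

# exact E(T) by summing over the support
ET_exact = sum(p * (f0 + f @ x + x @ F @ x) for p, x in zip(probs, points))

# mean via the quadratic-form identity
ET_formula = f0 + f @ mu + mu @ F @ mu + np.trace(F @ Sigma)

print(np.isclose(ET_exact, ET_formula))     # True
```

Replacing x by vec X and using E(vec X) = 1_n ⊗ μ, Cov(vec X) = I_n ⊗ Σ gives the mean of T.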
2. Results
$$
zz' \otimes zz' = (I_n \otimes K_{p,np})
\begin{pmatrix}
zz' \otimes z_1 z_1' & \cdots & zz' \otimes z_1 z_n' \\
\vdots & & \vdots \\
zz' \otimes z_n z_1' & \cdots & zz' \otimes z_n z_n'
\end{pmatrix}
(I_n \otimes K_{np,p}). \tag{2.14}
$$
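Identity (2.14) is purely algebraic, so it can be spot-checked for a single arbitrary z. The sketch below (our addition) builds the commutation matrix K_{m,n}, defined by K_{m,n} vec A = vec A′, explicitly as a permutation matrix:

```python
import numpy as np

def comm(m, n):
    """Commutation matrix K_{m,n}: K @ vec(A) = vec(A.T) for an m-by-n A (vec = column stacking)."""
    K = np.zeros((m * n, m * n))
    for a in range(m):
        for b in range(n):
            K[a * n + b, b * m + a] = 1.0
    return K

rng = np.random.default_rng(1)
n, p = 3, 2
z = rng.normal(size=n * p)                 # z = (z_1', ..., z_n')'
zi = z.reshape(n, p)                       # row i is z_i'
zz = np.outer(z, z)

# middle block matrix of (2.14): its (i, j)-block is zz' (x) z_i z_j'
B = np.block([[np.kron(zz, np.outer(zi[i], zi[j])) for j in range(n)]
              for i in range(n)])

lhs = np.kron(zz, zz)
rhs = np.kron(np.eye(n), comm(p, n * p)) @ B @ np.kron(np.eye(n), comm(n * p, p))
print(np.allclose(lhs, rhs))               # True
```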
In addition, for i ≠ j we get a matrix whose (i, j)-block equals (vec Σ)(vec Σ)′, whose
(j, i)-block equals K_{pp}(Σ ⊗ Σ), and whose remaining blocks are zero, i.e.

$$
E(zz' \otimes z_i z_j') = E_{ij} \otimes (\operatorname{vec}\Sigma)(\operatorname{vec}\Sigma)' + E_{ji} \otimes K_{pp}(\Sigma \otimes \Sigma), \tag{2.17}
$$

where E_{ij} denotes the n × n matrix with a one in position (i, j) and zeros elsewhere.
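Note that (2.17) uses only the independence and zero mean of the z_i, not normality, so it can be verified exactly by enumerating a small discrete distribution. The following sketch (our addition; the support points and probabilities are arbitrary) does this for n = 3, p = 2 and the block pair (i, j) = (1, 2) (zero-based 0 and 1 in the code):

```python
import numpy as np
from itertools import product

def comm(m, n):
    """Commutation matrix K_{m,n}: K @ vec(A) = vec(A.T) for an m-by-n matrix A."""
    K = np.zeros((m * n, m * n))
    for a in range(m):
        for b in range(n):
            K[a * n + b, b * m + a] = 1.0
    return K

n, p = 3, 2
i, j = 0, 1                                   # any pair with i != j

# small discrete distribution for the i.i.d. mean-zero z_k (exact expectations)
v = np.array([[1.0, 2.0], [-1.0, 0.5], [0.0, -2.0]])   # support points
q = np.array([0.2, 0.5, 0.3])                          # probabilities
v = v - q @ v                                 # center so that E(z_k) = 0
Sigma = v.T @ (q[:, None] * v)                # E(z_k z_k')

# exact E(zz' (x) z_i z_j') by enumerating the joint support
E = np.zeros((n * p * p, n * p * p))
for idx in product(range(len(q)), repeat=n):
    prob = np.prod(q[list(idx)])
    z = np.concatenate([v[k] for k in idx])   # z = (z_1', ..., z_n')'
    E += prob * np.kron(np.outer(z, z), np.outer(v[idx[i]], v[idx[j]]))

# right-hand side of (2.17)
Eij = np.zeros((n, n)); Eij[i, j] = 1.0
Eji = Eij.T
vecS = Sigma.flatten(order="F")
rhs = np.kron(Eij, np.outer(vecS, vecS)) + np.kron(Eji, comm(p, p) @ np.kron(Sigma, Sigma))
print(np.allclose(E, rhs))                    # True
```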
This implies

(ii) $\Psi_*^N = (I_{n^2p^2} + K_{np,np})(I_n \otimes \Sigma \otimes I_n \otimes \Sigma) + [\operatorname{vec}(I_n \otimes \Sigma)][\operatorname{vec}(I_n \otimes \Sigma)]'$.
Remark 1. The preceding result is in accordance with Theorem 4.3(iv) in [2] when x is
replaced by z. Then z ∼ N(0, I_n ⊗ Σ), E(z ⊗ z) = vec(I_n ⊗ Σ) and Cov(z ⊗ z) =
(I_{n²p²} + K_{np,np})(I_n ⊗ Σ ⊗ I_n ⊗ Σ). On the other hand, Cov(z ⊗ z) = E[(z ⊗ z)(z ⊗ z)′] − E(z ⊗ z)E(z ⊗ z)′ = Ψ_*^N − [vec(I_n ⊗ Σ)][vec(I_n ⊗ Σ)]′, which coincides
with identity (ii) of the corollary after solving for Ψ_*^N.
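This normal-theory computation can be checked without simulation: for z ∼ N(0, V), Isserlis' theorem gives E(z_a z_b z_c z_d) = V_ab V_cd + V_ac V_bd + V_ad V_bc, so E[(z ⊗ z)(z ⊗ z)′] can be assembled entry by entry and compared with (I + K_{np,np})(V ⊗ V) + (vec V)(vec V)′ for V = I_n ⊗ Σ. A sketch (our addition, with an arbitrary positive definite Σ):

```python
import numpy as np

def comm(m, n):
    """Commutation matrix K_{m,n}: K @ vec(A) = vec(A.T) for an m-by-n matrix A."""
    K = np.zeros((m * n, m * n))
    for a in range(m):
        for b in range(n):
            K[a * n + b, b * m + a] = 1.0
    return K

rng = np.random.default_rng(3)
n, p = 2, 2
m = n * p
B = rng.normal(size=(p, p))
Sigma = B @ B.T                                 # positive definite Sigma
V = np.kron(np.eye(n), Sigma)                   # V = I_n (x) Sigma = Cov(z)

# E[(z(x)z)(z(x)z)'] for z ~ N(0, V), exactly, from Isserlis' theorem:
# E[z_a z_b z_c z_d] = V_ab V_cd + V_ac V_bd + V_ad V_bc
E4 = np.zeros((m * m, m * m))
for a in range(m):
    for b in range(m):
        for c in range(m):
            for d in range(m):
                E4[a * m + b, c * m + d] = (V[a, b] * V[c, d]
                                            + V[a, c] * V[b, d]
                                            + V[a, d] * V[b, c])

vecV = V.flatten(order="F")
rhs = (np.eye(m * m) + comm(m, m)) @ np.kron(V, V) + np.outer(vecV, vecV)
print(np.allclose(E4, rhs))                     # True
```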
$$
\Psi_* = \Psi_*^N + (I_n \otimes K_{pn} \otimes I_p)\bigl[\tilde{K}_{nn} \otimes (\Psi - \Psi^N)\bigr](I_n \otimes K_{np} \otimes I_p). \tag{2.27}
$$
Remark 3. Frauendorf and Trenkler [1] considered the question under which conditions the statistics

$$
T_1 = a_0 + a'\bar{x} + \bar{x}' A \bar{x}, \qquad \bar{x} = \frac{1}{n} X 1_n, \tag{2.28}
$$

and

$$
T_2 = a_0 + a'\bar{x} + \frac{1}{n}\sum_{i=1}^{n} x_i' A x_i \tag{2.29}
$$

coincide. Observe that both T_1 and T_2 are special cases of the statistic T of (1.1). In the
above-mentioned paper it was shown that T_1 and T_2 differ in general, and

$$
E(T_1) = f(\mu) + \frac{1}{n}\operatorname{tr}(A\Sigma). \tag{2.30}
$$
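Equation (2.30) can again be verified exactly on a finite support. The sketch below (our addition; the support points, a_0, a and A are arbitrary choices) enumerates the joint distribution of the i.i.d. columns x_1, …, x_n, computes E(T_1) directly, and compares it with f(μ) + (1/n) tr(AΣ), where f(μ) = a_0 + a′μ + μ′Aμ:

```python
import numpy as np
from itertools import product

# Exact check of E(T1) = f(mu) + tr(A Sigma)/n, using a small discrete
# distribution for the i.i.d. columns x_i (p = 2, n = 3).
v = np.array([[1.0, 0.5], [-2.0, 1.0], [0.5, -1.0]])   # support points
q = np.array([0.3, 0.3, 0.4])                          # probabilities
mu = q @ v                                             # E(x_i)
Sigma = (v - mu).T @ (q[:, None] * (v - mu))           # Cov(x_i)

n = 3
a0, a = 0.7, np.array([1.0, -1.0])
A = np.array([[2.0, 1.0], [0.0, 1.0]])

ET1 = 0.0
for idx in product(range(len(q)), repeat=n):
    prob = np.prod(q[list(idx)])
    xbar = np.mean(v[list(idx)], axis=0)               # xbar = X 1_n / n
    ET1 += prob * (a0 + a @ xbar + xbar @ A @ xbar)

f_mu = a0 + a @ mu + mu @ A @ mu
print(np.isclose(ET1, f_mu + np.trace(A @ Sigma) / n))  # True
```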
Acknowledgement
The authors would like to thank the referee for his helpful comments.
References
[1] E. Frauendorf, G. Trenkler, On the exchangeability of transformations and the arithmetic mean,
J. Statist. Comput. Simulation 61 (1998) 305–307.
[2] J.R. Magnus, H. Neudecker, The commutation matrix: some properties and applications, Ann. Statist.
7 (1979) 381–394.
[3] H. Neudecker, T. Wansbeek, Some results on commutation matrices, with statistical applications,
Canad. J. Statist. 11 (1983) 221–231.
[4] G.A.F. Seber, Linear Regression Analysis, John Wiley, New York, 1977.