Talk:Second order efficiency


    The coverage of the Fisher-Rao theorem is good, but the references and discussion of the Rao theorem are quite incomplete. In fact, this theorem has been extended well beyond curved exponential families, using Bayesian methods. Full details can be found in my monograph on Higher Order Asymptotics (IMS monograph, 1994). A brief discussion appears below.

    Ghosh and Subramaniyam (Sankhya, Ser. A, 1974) show that Bayes estimates can be approximated up to second order by a function of the mle alone; the derivatives at the mle are not required. (Such results do not hold for Bayes tests.) It is conjectured there that this fact can be used to prove Rao's theorem under general regularity conditions and for more general loss functions than squared error. This program is implemented in Ghosh, Sinha and Wieand (Annals of Statistics, 1980). A slightly different proof is offered in Ghosh, Sinha and Joshi (Proceedings of the Third Purdue Symposium, 1982). Essentially the same result was obtained by Takeuchi and Akahira, and by Bickel, Goetze and van Zwet. References to all the above papers are available in my monograph cited above.

    Finally, Rao's second order efficiency is usually called third order efficiency by other authors. If one considers the asymptotic expansion of the expected squared error loss of an estimator up to O(1/n^2), one gets two terms in powers of 1/n, hence the term second order. If one approaches through Edgeworth expansions, one has three terms in powers of 1/sqrt(n), hence third order. Second order efficiency in this latter sense is different from Rao's.
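    The term-counting above can be sketched schematically (the coefficients \(a_1, a_2\) and \(b_1, b_2, b_3\) below are generic placeholders, not quantities from any specific paper). Expanding the normalized expected squared error in powers of \(1/n\) gives two correction terms up to \(O(1/n^2)\):

    \[
    n\,E_\theta\big(\hat\theta_n - \theta\big)^2 \;=\; a_1 \;+\; \frac{a_2}{n} \;+\; O\!\left(\frac{1}{n^2}\right),
    \]

    so the \(a_2/n\) term defines "second order" in Rao's usage. An Edgeworth-type expansion of the distribution of \(\sqrt{n}(\hat\theta_n - \theta)\) instead proceeds in powers of \(n^{-1/2}\):

    \[
    b_1 \;+\; \frac{b_2}{\sqrt{n}} \;+\; \frac{b_3}{n} \;+\; o\!\left(\frac{1}{n}\right),
    \]

    giving three terms to the same order in \(1/n\), which is why the same refinement is called "third order" by authors working from Edgeworth expansions.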
