
Consistency, Efficiency and Robustness of Conditional Disparity Methods

Giles Hooker (Submitted on 14 Jul 2013 (v1), last revised 16 Apr 2014 (this version, v2))

This paper considers extensions of minimum-disparity estimators to the problem of estimating parameters in a regression model that is conditionally specified; i.e., where the model gives the distribution of a response y conditional on covariates x but does not specify the distribution of x. The consistency and asymptotic normality of such estimators are demonstrated for a broad class of models that incorporates both discrete and continuous response and covariate values and is based on a wide set of choices for kernel-based conditional density estimation. The paper also establishes the robustness of these estimators for a wide class of disparities. As has been observed in Tamura and Boos (1986), kernel density estimates in more than one dimension can result in an asymptotic bias that is larger than n^{-1/2} in minimum disparity estimators; we characterize a similar bias in our results and show that in specialized cases it can be eliminated by appropriately centering the kernel density estimate. In order to demonstrate these results, we establish a set of L1-consistency results for kernel-based estimates of centered conditional densities.
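To make the setup concrete, the following is a minimal sketch (not the paper's estimator or its asymptotic analysis) of conditional minimum-disparity estimation using the Hellinger distance: a kernel estimate of the conditional density f(y|x) is matched to a normal regression model f_theta(y|x) by maximizing their average Hellinger affinity over the observed covariates. The simulated model, bandwidths, integration grid, and optimizer are illustrative assumptions.

```python
# Minimal sketch of conditional minimum-Hellinger-distance estimation
# for a normal linear regression model y | x ~ N(b0 + b1*x, sigma^2).
# All tuning choices below (bandwidths, grid, optimizer) are assumptions.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated data from the conditionally specified model.
n = 400
x = rng.uniform(-2, 2, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)

hx, hy = 0.3, 0.3                                  # kernel bandwidths (assumed)
ygrid = np.linspace(y.min() - 1, y.max() + 1, 200)  # grid for integrating over y
dy = ygrid[1] - ygrid[0]

def fhat_cond(x0):
    """Kernel estimate of the conditional density f(y | x = x0) on ygrid."""
    wx = norm.pdf((x0 - x) / hx)                   # covariate kernel weights
    wy = norm.pdf((ygrid[:, None] - y) / hy) / hy  # response kernels on the grid
    return wy @ wx / wx.sum()                      # weighted average of kernels

# Precompute the conditional density estimate at each observed covariate.
fhat = np.array([fhat_cond(xi) for xi in x])       # shape (n, len(ygrid))

def neg_affinity(theta):
    """Negative Hellinger affinity between the kernel and model conditional
    densities, averaged over observed covariates; minimizing this is
    equivalent to minimizing the squared Hellinger disparity."""
    b0, b1, log_sigma = theta
    sigma = np.exp(log_sigma)
    mu = b0 + b1 * x                               # model conditional means
    fmod = norm.pdf(ygrid[None, :], loc=mu[:, None], scale=sigma)
    affinity = np.sqrt(fhat * fmod).sum(axis=1) * dy
    return -affinity.mean()

fit = minimize(neg_affinity, x0=np.zeros(3), method="Nelder-Mead")
b0_hat, b1_hat, sigma_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(b0_hat, b1_hat, sigma_hat)
```

Other disparities (e.g., negative exponential disparity) can be substituted by replacing the affinity above with the corresponding divergence between fhat and fmod; the paper's centering of the kernel density estimate is not shown here.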

Subjects: Statistics Theory (math.ST)
Cite as: arXiv:1307.3730 [math.ST] (or arXiv:1307.3730v2 [math.ST] for this version)