Nuclear Science and Engineering / Volume 165 / Number 1 / May 2010 / Pages 1-17
Technical Paper / dx.doi.org/10.13182/NSE09-37A
When n measurements and/or computations of the same (unknown) quantity yield data points x_1, ..., x_n that are discrepant, i.e., mutually inconsistent within their stated uncertainties (for example, the probability that two equally precise measurements would be separated by more than 2σ is erfc(1) ≈ 0.157), it is much more likely that apparently discrepant data actually indicate the presence of unrecognized errors.
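As a quick numerical check of the quoted figure (a minimal sketch, independent of the paper's derivation): the difference of two independent measurements with common standard deviation σ is normally distributed with standard deviation σ√2, so the probability of a separation exceeding 2σ is erfc(1).

```python
import math

# Difference of two independent measurements with equal std sigma is N(0, 2*sigma^2).
# P(|x1 - x2| > 2*sigma) = P(|Z| > 2*sigma / (sigma*sqrt(2))) = P(|Z| > sqrt(2))
#                        = erfc(sqrt(2) / sqrt(2)) = erfc(1)
z = 2.0 / math.sqrt(2.0)               # separation in units of the difference's std dev
tail = math.erfc(z / math.sqrt(2.0))   # two-sided standard normal tail P(|Z| > z)
print(tail)                            # 0.15729...
print(math.erfc(1.0))                  # identical: erfc(1) ≈ 0.157, as quoted above
```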
This work addresses the treatment of unrecognized errors by applying the maximum entropy principle, under quadratic loss, to the discrepant data. Novel results are obtained for the posterior distribution of the unknown mean value (i.e., the unknown location parameter) of the data, and also for the marginal posterior distribution of the unrecognized errors. These results are considerably more rigorous, more accurate, and more widely applicable than extant recipes for handling discrepant data.
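For orientation only, the following is a sketch of one widely used extant recipe of the kind the abstract alludes to: the inverse-variance weighted mean with chi-square (Birge-ratio) error inflation. The choice of this particular recipe and the data values are assumptions for illustration; this is not the maximum-entropy treatment developed in the paper.

```python
import math

# Hypothetical discrepant measurements of the same quantity: values and quoted 1-sigma errors.
x     = [10.1, 10.3, 9.8, 11.2]
sigma = [0.1, 0.2, 0.15, 0.1]

w = [1.0 / s**2 for s in sigma]                     # inverse-variance weights
mean = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
err = 1.0 / math.sqrt(sum(w))                       # uncertainty from quoted errors only

# Birge ratio: sqrt(chi^2 per degree of freedom); values > 1 signal scatter in excess
# of the quoted uncertainties, i.e., possible unrecognized errors.
chi2 = sum(((xi - mean) / si)**2 for xi, si in zip(x, sigma))
birge = math.sqrt(chi2 / (len(x) - 1))

print(f"weighted mean = {mean:.3f} +/- {err:.3f}")
print(f"Birge ratio   = {birge:.2f}")
if birge > 1.0:
    # Conventional recipe: inflate the quoted-error uncertainty by the Birge ratio.
    print(f"inflated error = {birge * err:.3f}")
```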