Estimators of divergence criteria for two normal distributions with a Bayesian approach

Document Type: Original Scientific Paper

Authors

Department of Statistics, Payame Noor University, Tehran, Iran

Abstract

The use of statistical distributions for modeling, and in particular the evaluation of the similarity between two probability distributions through various divergence measures, has recently attracted the attention of many researchers in the context of machine learning. Given the importance of the topic, this article introduces several divergence criteria, including the Kullback-Leibler divergence, total variation divergence, alpha divergence, and power divergence, and derives these divergence measures for two normal distributions. The parameters are estimated using both maximum likelihood and Bayesian methods. In the Bayesian approach, a conjugate distribution is used as the prior, taking into account the behavior of the parameters. Finally, the estimation methods for the two normal distributions are evaluated based on the mean square error criterion.
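As an illustration of the kind of computation involved, the Kullback-Leibler divergence between two univariate normal distributions has the well-known closed form KL(N(mu1, sigma1^2) || N(mu2, sigma2^2)) = log(sigma2/sigma1) + (sigma1^2 + (mu1 - mu2)^2)/(2 sigma2^2) - 1/2. The short Python sketch below plugs maximum likelihood estimates (sample mean and MLE variance) into this formula; the sample sizes and population parameters are invented for illustration only, and the paper's conjugate-prior Bayesian estimator and the other divergence criteria are not reproduced here.

import numpy as np

def kl_normal(mu1, var1, mu2, var2):
    # Closed-form KL divergence KL( N(mu1, var1) || N(mu2, var2) )
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

# Hypothetical samples from two normal populations (parameters chosen for illustration).
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=100)
y = rng.normal(loc=0.5, scale=1.5, size=100)

# Maximum likelihood estimates: sample mean and MLE (ddof=0) variance.
mu_x, var_x = x.mean(), x.var()
mu_y, var_y = y.mean(), y.var()

print("ML plug-in KL estimate:", kl_normal(mu_x, var_x, mu_y, var_y))

In a simulation study such as the one described in the abstract, estimates of this kind would be computed over many replications and compared, for example against Bayesian plug-in estimates, via their mean square error.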
