Indranil Ghosh
In modeling complicated real-life scenarios, one objective is to capture the dependence being observed. Consequently, conditional specification is a worthy alternative to joint-distribution models. Since their inception, divergence measures have been instrumental in determining the closeness between two probability distributions, especially when joint distributions are specified through their conditional distributions. Conditional specification of distributions is a developing area with several applications. This work gives an overview of a variety of divergence measures, including, but not limited to, the Kullback-Leibler divergence, the power-divergence statistic, and the Hellinger distance, along with some newly developed divergence measures, and their role in addressing various compatibility conditions: searching for a most-nearly compatible joint distribution in the finite discrete case, and assessing compatibility when additional information is available in the form of marginal and/or conditional summaries. Finally, we provide numerical examples to illustrate each of these scenarios.
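As a minimal illustrative sketch only, and not the computations developed in the paper, the three classical divergence measures named above can be evaluated for two finite discrete distributions as follows; the function names and the choice of Cressie-Read index lam = 2/3 are assumptions made for this example.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for finite discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def hellinger_distance(p, q):
    """Hellinger distance H(p, q) = (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

def power_divergence(p, q, lam=2.0 / 3.0):
    """Cressie-Read power divergence I_lambda(p : q); lam -> 0 recovers KL."""
    # lam = 2/3 is an illustrative default, not a value taken from the paper.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if abs(lam) < 1e-12:
        return kl_divergence(p, q)
    mask = p > 0
    ratio = p[mask] / q[mask]
    return float(np.sum(p[mask] * (ratio ** lam - 1.0)) / (lam * (lam + 1.0)))

if __name__ == "__main__":
    # Two arbitrary discrete distributions on three points, for illustration only.
    p = [0.2, 0.5, 0.3]
    q = [0.3, 0.4, 0.3]
    print("KL       :", kl_divergence(p, q))
    print("Hellinger:", hellinger_distance(p, q))
    print("Power    :", power_divergence(p, q))
```

In the compatibility setting discussed in the paper, measures of this kind are applied to candidate joint distributions built from given conditional (and possibly marginal) specifications, with smaller divergence indicating a more nearly compatible specification.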