%0 Journal Article
%T Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions
%+ Lab-STICC_ENSTAB_MOM_PIM
%+ Centre d'études et de recherche en informatique et communications (CEDRIC)
%A Bouhlel, Nizar
%A Dziri, Ali
%< Peer-reviewed
%@ 1070-9908
%J IEEE Signal Processing Letters
%I Institute of Electrical and Electronics Engineers
%V 26
%N 7
%P 1021-1025
%8 2019-07
%D 2019
%R 10.1109/LSP.2019.2915000
%K Kullback-Leibler divergence (KLD)
%K Multivariate generalized Gaussian distribution
%K Lauricella function
%Z Engineering Sciences [physics]
%Z Engineering Sciences [physics]/Signal and Image processing
%X The Kullback-Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool in many signal and image processing applications. Until now, the KLD between MGGDs has had no known explicit form, so in practice it is either estimated by expensive Monte-Carlo stochastic integration or approximated. The main contribution of this letter is a closed-form expression for the KLD between two zero-mean MGGDs. Based on the Lauricella series, a simple way of computing the KLD numerically is presented. Finally, we show that the Monte-Carlo approximation of the KLD converges to its theoretical value as the number of samples tends to infinity.
%G English
%L hal-02304988
%U https://ensta-bretagne.hal.science/hal-02304988
%~ UNIV-BREST
%~ INSTITUT-TELECOM
%~ ENSTA-BRETAGNE
%~ CNRS
%~ UNIV-UBS
%~ CNAM
%~ ENSTA-BRETAGNE-STIC
%~ ENIB
%~ LAB-STICC
%~ CEDRIC-CNAM
%~ INSTITUTS-TELECOM
%~ TEST3-HALCNRS
%~ HESAM-CNAM
%~ HESAM
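
For readers who want to reproduce the Monte-Carlo estimate that the abstract refers to, below is a minimal Python sketch. It assumes the common zero-mean MGGD parameterization f(x) proportional to |Sigma|^(-1/2) exp(-((x' Sigma^(-1) x)/m)^beta / 2), with shape beta and scale m; the helper names (mggd_logpdf, mggd_sample) and the parameter values are illustrative, not taken from the letter.

    import numpy as np
    from scipy.special import gammaln

    def mggd_logpdf(x, Sigma, beta, m):
        # Log-density of a zero-mean MGGD with scatter matrix Sigma, shape beta
        # and scale m: log f(x) = log c - ((x' Sigma^-1 x) / m)^beta / 2.
        p = Sigma.shape[0]
        y = np.einsum('ij,jk,ik->i', x, np.linalg.inv(Sigma), x)  # quadratic forms
        log_c = (gammaln(p / 2) + np.log(beta)
                 - (p / 2) * np.log(np.pi) - gammaln(p / (2 * beta))
                 - (p / (2 * beta)) * np.log(2) - (p / 2) * np.log(m)
                 - 0.5 * np.linalg.slogdet(Sigma)[1])
        return log_c - 0.5 * (y / m) ** beta

    def mggd_sample(n, Sigma, beta, m, rng):
        # Stochastic representation: x = sqrt(y) * L u, with u uniform on the
        # unit sphere, y = m (2 tau)^(1/beta), tau ~ Gamma(p/(2 beta), 1),
        # and L L' = Sigma.
        p = Sigma.shape[0]
        u = rng.standard_normal((n, p))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        tau = rng.gamma(p / (2 * beta), 1.0, size=n)
        y = m * (2.0 * tau) ** (1.0 / beta)
        L = np.linalg.cholesky(Sigma)
        return np.sqrt(y)[:, None] * (u @ L.T)

    # Monte-Carlo estimate of KL(p1 || p2) = E_{x ~ p1}[log p1(x) - log p2(x)].
    rng = np.random.default_rng(0)
    Sigma1 = np.eye(2)
    Sigma2 = np.array([[2.0, 0.5], [0.5, 1.0]])
    x = mggd_sample(1_000_000, Sigma1, beta=0.8, m=1.0, rng=rng)
    kld = np.mean(mggd_logpdf(x, Sigma1, 0.8, 1.0) - mggd_logpdf(x, Sigma2, 1.2, 1.0))
    print(kld)

Setting beta = 1 and m = 1 recovers the multivariate Gaussian, so the estimator can be sanity-checked against the standard closed-form Gaussian KLD before being used for general beta.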