Pantelimon G. Popescu, Sever S. Dragomir,
Emil I. Slusanschi, Octavian N. Stanasila
Abstract:
Entropy, conditional entropy, and mutual information for discrete-valued
random variables play important roles in information theory.
The purpose of this paper is to present new bounds for the relative
entropy D(p||q) of two probability distributions, and then to apply
them to simple entropy and mutual information. The upper bound
obtained for the relative entropy is a refinement of a bound previously
presented in the literature.
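The central quantity of the paper, the relative entropy (Kullback-Leibler divergence) of two discrete distributions, is D(p||q) = &Sigma;<sub>i</sub> p<sub>i</sub> log(p<sub>i</sub>/q<sub>i</sub>). As a reminder of the definition only (the function name and the natural-log convention below are illustrative assumptions, not taken from the paper), a minimal sketch:

```python
import math

def relative_entropy(p, q):
    """Relative entropy D(p||q) in nats for two discrete probability
    distributions given as sequences of probabilities.
    Terms with p_i = 0 contribute 0 by the usual convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(p||p) = 0, and D(p||q) >= 0 by Gibbs' inequality.
d = relative_entropy([0.5, 0.3, 0.2], [1/3, 1/3, 1/3])
```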
Submitted June 19, 2016. Published August 30, 2016.
Math Subject Classifications: 26B25, 94A17.
Key Words: Entropy; bounds; refinements; generalization.
Show me the PDF file (163 KB) or the TEX file for this article.
Pantelimon George Popescu Computer Science and Engineering Department Faculty of Automatic Control and Computers University Politehnica of Bucharest Splaiul Independenței 313, 060042, Bucharest (6), Romania email: pgpopescu@yahoo.com, Phone +40741533097, Fax +40214029333
Sever Silvestru Dragomir College of Engineering and Science Victoria University, PO Box 14428 Melbourne City, MC 8001, Australia email: sever.dragomir@vu.edu.au, Phone +61 3 9919 4437, Fax +61 3 9919 4050
Emil Ioan Slusanschi Computer Science and Engineering Department Faculty of Automatic Control and Computers University Politehnica of Bucharest Splaiul Independentei 313, 060042, Bucharest (6), Romania email: emil.slusanschi@cs.pub.ro, Phone +40741533097, Fax +40214029333
Octavian Nicolae Stanasila Computer Science and Engineering Department Faculty of Automatic Control and Computers University Politehnica of Bucharest Splaiul Independentei 313, 060042, Bucharest (6), Romania email: ostanasila@hotmail.com, Phone +40741533097, Fax +40214029333