COMPUTATION OF INFORMATION MEASURES

Author/​Artist
Zhan, Shuxin
Format
Senior thesis
Language
English
Description
26 pages

Details

Advisor(s)
Verdu, Sergio
Contributor(s)
Lieb, Elliott
Department
Princeton University. Department of Mathematics
Class year
2015
Summary note
For well-behaved distributions, mutual information can be computed using a simple identity involving the distribution's marginal and conditional entropies. However, when these entropies are ill-defined, more powerful methods are required. This thesis aims to calculate the mutual information of one such distribution, given by p(x) = 1/(x log²(x)). This is the first known attempt to approximate the mutual information of distributions of this kind. While I was able to numerically approximate the mutual information of this distribution and to find meaningful lower bounds, proving the existence of an upper bound remains an open problem.
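The "simple identity" mentioned above is, for discrete distributions, I(X;Y) = H(X) + H(Y) − H(X,Y), equivalently H(X) − H(X|Y). A minimal sketch of this computation for a finite joint pmf is shown below; the function names and example distribution are illustrative and not taken from the thesis:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector, with 0 log 0 := 0."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]                  # marginal of X
    py = [sum(col) for col in zip(*joint)]            # marginal of Y
    hxy = entropy([q for row in joint for q in row])  # joint entropy H(X,Y)
    return entropy(px) + entropy(py) - hxy

# A correlated binary pair: MI is strictly between 0 and 1 bit
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))
```

For heavy-tailed distributions like p(x) = 1/(x log²(x)), the marginal entropy H(X) diverges, so this identity cannot be applied directly, which is the difficulty the thesis addresses.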