Joint entropy


In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.

The joint Shannon entropy (in bits) of two discrete random variables X and Y is defined as

H(X, Y) = -\sum_{x} \sum_{y} P(x, y) \log_2 [P(x, y)]

where x and y are particular values of X and Y, respectively, P(x, y) is the joint probability of these values occurring together, and P(x, y) \log_2 [P(x, y)] is defined to be 0 if P(x, y) = 0.
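As an illustration, here is a minimal Python sketch (the function name and the probability-table layout are assumptions, not part of the original article) that computes the joint entropy in bits from a joint probability table, using the convention that zero-probability terms contribute 0:

import numpy as np

def joint_entropy(p_xy):
    # p_xy[i, j] holds the joint probability P(x_i, y_j); entries must be
    # non-negative and sum to 1. Cells with P(x, y) = 0 are skipped, which
    # matches the convention that 0 * log2(0) is taken to be 0.
    p = np.asarray(p_xy, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

# Example: two fair coins flipped independently, so each of the four
# outcomes has probability 1/4 and H(X, Y) = 2 bits.
p = np.full((2, 2), 0.25)
print(joint_entropy(p))   # prints 2.0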

