# Joint entropy H(X, Y) and mutual information

The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as

$$\mathrm{H}(X,Y) = -\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} P(x,y)\log_2\!\left[P(x,y)\right]$$

where $P(x,y)$ is the joint probability of the pair $(x, y)$ occurring, and $P(x,y)\log_2[P(x,y)]$ is taken to be 0 whenever $P(x,y)=0$.
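The definition above can be computed directly from a joint probability mass function. The following is a minimal sketch (the `joint_entropy` helper and the dictionary representation of the pmf are illustrative choices, not part of the original text):

```python
import math

def joint_entropy(pxy):
    """Joint Shannon entropy H(X, Y) in bits.

    `pxy` maps pairs (x, y) to probabilities P(x, y). Pairs with zero
    probability contribute nothing to the sum, matching the convention
    that 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in pxy.values() if p > 0)

# Fully dependent fair coins (Y = X): only two equally likely outcomes.
dependent = {(0, 0): 0.5, (1, 1): 0.5}
# Two independent fair coins: four equally likely outcomes.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

print(joint_entropy(dependent))    # 1.0 bit
print(joint_entropy(independent))  # 2.0 bits
```

Note that dependence between the variables lowers the joint entropy: the dependent pair carries 1 bit in total, while the independent pair carries the full 2 bits.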

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons, commonly called bits) obtained about one random variable through observing the other random variable. The concept of mutual information is intricately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
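One standard way to compute mutual information is via the entropy identity $I(X;Y) = \mathrm{H}(X) + \mathrm{H}(Y) - \mathrm{H}(X,Y)$. The sketch below assumes the same dictionary-based joint pmf representation as above and derives the marginals by summing over the other variable (the helper names are illustrative):

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def mutual_information(pxy):
    """I(X; Y) in bits, computed as H(X) + H(Y) - H(X, Y)."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        # Marginalize: P(x) = sum_y P(x, y), P(y) = sum_x P(x, y).
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px) + entropy(py) - entropy(pxy)

# Y = X for a fair coin: observing Y reveals X completely, so I(X;Y) = H(X).
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0 bit

# Independent fair coins: observing one says nothing about the other.
print(mutual_information({(0, 0): 0.25, (0, 1): 0.25,
                          (1, 0): 0.25, (1, 1): 0.25}))  # 0.0 bits
```

The two extremes illustrate the "mutual dependence" reading: MI is maximal when one variable determines the other, and zero exactly when the variables are independent.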