Definition. Let $X$ and $Y$ be two discrete random variables taking values in $\mathcal{A}_X$ and $\mathcal{A}_Y$ respectively. The joint entropy is defined as:
$$H(X,Y) = \mathbb{E}\!\left[\log_2 \frac{1}{p(X,Y)}\right] = \sum_{x,y} p(x,y)\,\log_2 \frac{1}{p(x,y)} \quad [\text{bits}]$$
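As a sketch, the sum above can be evaluated directly from a joint probability mass function. The function name `joint_entropy` and the dictionary representation of $p(x,y)$ are illustrative choices, not from the text; terms with $p(x,y)=0$ contribute nothing to the sum, by the usual convention $0\log_2\frac{1}{0}=0$.

```python
import math

def joint_entropy(pmf):
    """Joint entropy H(X, Y) in bits, given a joint pmf as a
    dict {(x, y): p(x, y)}. Zero-probability pairs contribute 0."""
    return sum(p * math.log2(1.0 / p) for p in pmf.values() if p > 0)

# Example: X and Y independent and uniform over {0, 1},
# so p(x, y) = 1/4 for each pair and H(X, Y) = log2(4) = 2 bits.
pmf = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(joint_entropy(pmf))  # → 2.0
```

For independent variables this recovers $H(X,Y) = H(X) + H(Y)$, as the example shows: each marginal is one bit, and the joint entropy is two.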