Image:Binary entropy plot.png



Information entropy of a Bernoulli trial X. If X can assume the values 0 and 1, the entropy of X is defined as H(X) = -Pr(X=0) log2 Pr(X=0) - Pr(X=1) log2 Pr(X=1). Its value is 0 if Pr(X=0)=1 or Pr(X=1)=1, and it reaches its maximum of 1 when Pr(X=0)=Pr(X=1)=1/2.
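As a minimal illustrative sketch (not part of the original image sources; the function name binary_entropy is chosen here for illustration), the plotted function can be computed in Python as follows:

 import math

 def binary_entropy(p):
     """Entropy in bits of a Bernoulli trial X with Pr(X=0) = p."""
     if p == 0 or p == 1:
         return 0.0  # entropy is 0 when the outcome is certain
     return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

 print(binary_entropy(0.5))  # 1.0, the maximum
 print(binary_entropy(1.0))  # 0.0

Evaluating this function over p in [0, 1] reproduces the curve shown in the plot.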


Copyright Brona. Original LaTeX/pstricks sources stored at User:Brona/Images/binary_entropy_plot.tex.

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts.
Subject to disclaimers.
