purpose

Physicists have long balked at colloquial uses of "entropy" (especially by philosophers). I believe some uses of entropy as resistance to systematic understanding have precise mathematical grounding in information theory. I am tired of arguing with my physics-major friends over this.

definitions

Shannon entropy H(X) for a discrete random variable X:

H(X) = −∑ₓ p(x) log₂ p(x)

measures the average number of bits needed to encode X's outcomes. For any compression scheme C mapping sequences of X ...
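To make the definition concrete, here is a minimal sketch in Python that estimates H(X) from observed samples via empirical frequencies; the function name shannon_entropy and the coin-flip data are my illustrations, not from the text above.

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Estimate Shannon entropy H(X) in bits from a list of observed outcomes."""
    counts = Counter(samples)
    n = len(samples)
    # H(X) = -sum over x of p(x) * log2 p(x); outcomes with zero count
    # never appear in `counts`, so the 0·log(0) = 0 convention holds.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin costs one bit per outcome on average:
print(shannon_entropy(["H", "T", "H", "T"]))  # 1.0
# A biased source is more predictable, hence cheaper to encode:
print(shannon_entropy(["H", "H", "H", "T"]))  # ≈ 0.811
```

The two prints match the formula directly: for p = (½, ½) the sum is 1 bit, while for p = (¾, ¼) it drops to about 0.811 bits, which is the "average bits needed" reading of H(X) in the definition.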