Claude Shannon, in a famous 1948 paper, defined the information communicated to a receiver as the reduction in the receiver's uncertainty about the possible states of the world. He showed how to compute the uncertainty (entropy) before and after the communication: the information communicated is the entropy (uncertainty) before the communication minus the entropy (uncertainty) afterwards. This definition of information has become the foundation of a vast theoretical and practical enterprise in physics and engineering.
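A minimal sketch of this calculation (the scenario of eight equally likely states is a made-up illustration, not from Shannon's paper): entropy is H = -Σ p log₂ p, and the information conveyed is the entropy before the message minus the entropy after.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical scenario: eight equally likely states of the world.
before = [1/8] * 8        # receiver's uncertainty before the message: 3 bits
# The message rules out all but two equally likely states.
after = [1/2, 1/2]        # uncertainty after the message: 1 bit

# Information communicated = entropy before - entropy after
info = entropy(before) - entropy(after)
print(info)  # 3.0 - 1.0 = 2.0 bits
```

A message that singled out one state exactly (entropy 0 afterwards) would convey the full 3 bits.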