Friday, April 20, 2007

Shannon & Hartley's Measures

Shannon's entropy measures the average information content of a message source: it is the minimum average number of bits per symbol needed to encode messages from that source without any loss of information. In data compression, entropy sets the theoretical lower bound on how small data can be made losslessly, so it tells you how well a compression algorithm can possibly do.
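As a rough sketch (names here are my own, not from the post), the entropy of a source with known symbol probabilities can be computed directly from Shannon's formula, H = -Σ p·log₂(p):

```python
import math

def shannon_entropy(probs):
    """Average information content, in bits per symbol, of a source
    whose symbols occur with the given probabilities (summing to 1).
    Terms with zero probability contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
```

A lower entropy means the source is more predictable, which is exactly why highly repetitive data compresses so well.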

Hartley's function is a special case of Shannon's entropy, so the two are closely related. Hartley's measure applies when every outcome in a set is equally likely; in that case the entropy reduces to the logarithm of the number of outcomes. The choice of logarithm base only changes the unit of measurement (base 2 gives bits, base 10 gives hartleys), while Shannon's measure is conventionally stated in base 2. Because real sources rarely have perfectly uniform distributions, Hartley's function fits only certain situations, and Shannon's measure is applicable to a much wider array of information.
