
Information theory



Information theory is a discipline in applied mathematics concerned with quantifying data so that information can be stored and transmitted over a communication channel without error. Information entropy is often used as the tool for this purpose and is usually expressed as the average number of bits required to store or transmit the information. For example, if the daily weather conditions can be described with an entropy of 3 bits, then each day's weather carries, on average, 3 bits of information.
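To make the idea of "average number of bits" concrete, here is a small Python sketch that computes Shannon entropy for a discrete distribution; the weather distributions below are made-up examples, chosen so that eight equally likely conditions give exactly the 3 bits mentioned above.

import math

def shannon_entropy(probabilities):
    """Average number of bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical example: 8 equally likely daily weather conditions.
uniform_weather = [1 / 8] * 8
print(shannon_entropy(uniform_weather))  # 3.0 bits per day

# A skewed distribution needs fewer bits on average.
skewed_weather = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(skewed_weather))   # 1.75 bits per day

The second distribution shows why entropy matters for compression: the more predictable the source, the fewer bits are needed on average to encode it.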

Basic applications of information theory include lossless data compression (ZIP files, for example), lossy data compression (MP3 files, for example), and channel coding (DSL and ADSL, for example). Information theory lies at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical and computer engineering. Its results have had a direct impact on space missions, the understanding of black holes, research in linguistics and human perception, computer networks, the Internet, and mobile phone networks.

Figure a. Binary entropy function of a Bernoulli trial

In particular, information theory is a branch of probability theory and statistics, dealing with the concepts of information and information entropy described above. Claude E. Shannon (1916-2001) is known as "the father of information theory".
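The binary entropy function referred to in the figure above gives the entropy of a single Bernoulli trial with success probability p, namely H(p) = -p log2(p) - (1 - p) log2(1 - p). A minimal sketch (the test values are illustrative only):

import math

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) variable; maximal (1 bit) at p = 0.5."""
    if p in (0.0, 1.0):
        return 0.0  # the outcome is certain, so no information is conveyed
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 bit (fair coin)
print(binary_entropy(0.1))  # about 0.469 bits (heavily biased coin)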

