Information theory
Information theory is a branch of applied mathematics, electrical engineering, bioinformatics, and computer science involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data, and on reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography, neurobiology,[1] the evolution[2] and function[3] of molecular codes, model selection[4] in ecology, thermal physics,[5] quantum computing, plagiarism detection,[6] and other forms of data analysis.[7] A key measure of information is known as entropy, which is usually expressed as the average number of bits needed to store or communicate one symbol in a message. Entropy quantifies the uncertainty involved in predicting the value of a random variable.
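The entropy measure described above can be illustrated with a short sketch. The snippet below (an illustrative example, not part of the original text; the function name `shannon_entropy` is our own) estimates the Shannon entropy of a sequence from its empirical symbol frequencies, using the standard formula H = -Σ p(x) log₂ p(x):

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Empirical Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin has two equally likely outcomes, so one bit of
# uncertainty per toss:
print(shannon_entropy("HTHT"))  # 1.0

# A biased source is more predictable, so its entropy is lower
# (here about 0.81 bits per symbol):
print(shannon_entropy("AAAB"))
```

Note that this computes the entropy of the observed frequency distribution; for short sequences it is only a rough estimate of the entropy of the underlying source.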