Information theory
{{subpages}}
'''Information theory''' was started in the 1940s when mathematicians were seeking to understand the fundamentals of information in the sense that historical figures sought to understand the nature of matter and energy. In the attempt to understand and define the nature of information and its place in the universe, two key approaches were taken: the syntactic approach of [[Claude Shannon]], and the semantic approach of Bar-Hillel and Carnap.
The Shannon approach was based on the fundamental notion that sequences of symbols carry more content in the communication between parties when they are more rare. This ultimately produced mathematical characterizations of information in terms of randomness, a quantity called entropy. Shannon's mathematical foundation was a number that could be calculated for the information content of a sequence of symbols in a language, given the frequencies with which symbols are used within that language. In particular, <math>\scriptstyle H = -K\sum_{i} P(i) \ln P(i)</math>, summed over all symbols ''i'' in the language, where ''P(i)'' is the probability of symbol ''i'' and ''K'' is a positive constant fixing the unit of measure. This forms the basis for most of the efficient coding approaches to communications and for most historical and current compression algorithms, as well as setting the theoretical limits for compression based on the selection of symbol sets.
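As a minimal sketch of how this number can be computed in practice, the following Python example estimates the entropy of a message from the observed frequencies of its symbols, assuming the common convention ''K'' = 1 with a base-2 logarithm so that the result is measured in bits per symbol (the function name and example messages are illustrative only, not part of Shannon's paper):

<pre>
from collections import Counter
from math import log2

def shannon_entropy(message):
    """Estimate entropy in bits per symbol from observed symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    # H = -K * sum_i P(i) * log P(i), with K = 1 and a base-2 logarithm (bits)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A message drawn from fewer, more predictable symbols has lower entropy,
# and so can in principle be compressed further.
print(shannon_entropy("aaaaaaab"))  # about 0.54 bits per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol
</pre>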
The other approach to information theory was that taken by Bar-Hillel and Carnap, who sought but never found a semantic theory of information. In essence, a semantic theory is a theory of meaning by which information can be usefully associated with cognitive symbols. For example, in a semantic theory, we should be able to compute the meaning of a sentence in a systematic manner, and presumably also translate between languages through a commensurable common language - the semantic theory language. Today, many researchers are working on this issue, and perhaps some day we will achieve this capability for some set of human languages; however, it may be too ambitious to achieve it for the general case of systematically understanding all languages in all forms.
==Seminal paper==
C. E. Shannon, A Mathematical Theory of Communication, ''Bell System Technical Journal'', 27(3), July 1948. [http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf]