Theories of Information

Theories of Information: At different times, people working in different contexts have proposed various theories of information. Some of them are listed below:

a) Communication: The communication process requires at least three elements: source, message and destination.

i) Mathematical Theory: According to Shannon and Weaver, the amount of information in a message is related to the size of the vocabulary. If one is restricted to “yes” or “no”, the recipient has a fifty percent chance of guessing the message correctly. If the vocabulary has ten signals, the recipient has less chance of guessing correctly, and so the amount of information in a message is increased.
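This idea can be sketched numerically. Shannon's measure assigns log₂(V) bits to one signal drawn from a vocabulary of V equally likely signals; the function name below is illustrative, and equiprobable signals are assumed for simplicity.

```python
import math

def information_bits(vocabulary_size: int) -> float:
    """Shannon information, in bits, of one signal drawn from a
    vocabulary of equally likely signals."""
    return math.log2(vocabulary_size)

# A yes/no vocabulary carries 1 bit: the recipient guesses right half the time.
print(information_bits(2))                 # 1.0
# Ten equally likely signals carry more information per message.
print(round(information_bits(10), 2))      # 3.32
```

As the vocabulary grows, guessing becomes harder and each message carries more bits, which is the point Shannon and Weaver make.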

ii) Semantic Theory: Fairthorne proposed the “phlogiston theory of information”. According to this theory, information is something that can be squeezed out like water from a sponge. The information obtained will obviously be affected by the prior state of knowledge of the recipient.

As an alternative to the above theory, Y. Bar-Hillel and R. Carnap came out with semantic information theory. They suggested that prior knowledge may increase the information obtained from a message (how precisely the transmitted symbols convey the intended meaning).

iii) Information for Decision Making: M. C. Yovits proposed that “information is data of value to decision making”. Information involves a reduction of uncertainty, which is what decision makers expect from an information system.

b) Commodity: Information is a commodity needed to do a job, and so acquiring, storing and retrieving objects identified as information are important for our work and daily life.

i) Zipf’s Law: George Kingsley Zipf proposed a relationship between the frequency of use of words and their distribution in books, reports, documents and other printed matter. Zipf’s law was published in the book The Psycho-Biology of Language: An Introduction to Dynamic Philology (Cambridge, Mass.: MIT Press, 1935). According to this law, if the number of different words occurring once in a given sample is taken as X, the numbers of different words occurring twice, three times, four times, … n times in the sample are respectively 1/2, 1/3, 1/4, … 1/n of X, up to, though not including, the few most frequently used words.
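A minimal sketch of the law as stated in the text (other formulations, such as the rank-frequency form rank × frequency ≈ constant, also circulate): given X words occurring exactly once, the expected count of words occurring k times is X/k. The function name and sample figure are illustrative.

```python
def zipf_counts(x_once: int, max_k: int) -> list[float]:
    """Expected number of distinct words occurring exactly k times
    (k = 1..max_k), given x_once words occurring once — the form of
    Zipf's law stated in the text above."""
    return [x_once / k for k in range(1, max_k + 1)]

# With 1000 words occurring once: ~500 occur twice, ~333 three times, 250 four times.
print(zipf_counts(1000, 4))
```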

ii) Bradford’s Law: Samuel Clement Bradford proposed another law in 1948. According to him, if scientific journals are arranged in order of decreasing productivity of articles on a given subject, they may be divided into a nucleus of periodicals more particularly devoted to the subject and several succeeding zones, each containing the same number of articles as the nucleus. The numbers of periodicals in the nucleus and the successive zones will then be as 1 : n : n², where n = 5; that is, the second zone has five times the number of journals in the nucleus, and the third zone has 5², or twenty-five times, the number of journals in the first or nucleus zone. This law was extended by many, notably B. C. Vickery, F. Leimkuhler and B. C. Brookes.
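The 1 : n : n² pattern can be sketched directly. Assuming a hypothetical nucleus of 8 journals and Bradford's multiplier n = 5, the journal counts per zone grow geometrically while each zone supplies roughly the same number of articles.

```python
def bradford_zones(nucleus: int, n: int = 5, zones: int = 3) -> list[int]:
    """Journals per zone under Bradford's 1 : n : n^2 ... pattern.
    Each zone yields roughly the same number of articles as the nucleus."""
    return [nucleus * n ** i for i in range(zones)]

# A nucleus of 8 journals with n = 5 gives zones of 8, 40 and 200 journals.
print(bradford_zones(8))     # [8, 40, 200]
```

The practical reading is that a searcher must scan ever more journals to harvest each additional batch of relevant articles.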

iii) Lotka’s Law: Alfred J. Lotka produced his law in 1926. Lotka’s law states that there is an inverse relationship between the number of papers contributed and the number of authors who contribute two, three or more papers. Lotka developed a general formula for the relationship between the frequency y of persons making x contributions: xⁿy = constant. Taking n = 2, he observed that the number of persons making 2 contributions is about one fourth (1/2²) of those making one, the number making 3 contributions is about 1/3² of those making one, and, in general, the number making x contributions is about 1/x² of those making one.
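With n = 2 the relation reduces to an inverse-square rule, which a short sketch can illustrate; the function name and the figure of 100 single-paper authors are assumptions for the example.

```python
def lotka_authors(single_authors: int, x: int) -> float:
    """Approximate number of authors making x contributions, given the
    number of single-contribution authors (Lotka's inverse-square law,
    i.e. x^2 * y = constant)."""
    return single_authors / x ** 2

# If 100 authors wrote exactly one paper:
print(lotka_authors(100, 2))          # 25.0  (~1/4 of those making one)
print(round(lotka_authors(100, 3)))   # 11    (~1/9 of those making one)
```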

iv) Law of Economics: According to the followers of this theory, information is analogous to energy (“meta-energy”). It is thus also a resource that can be handled as a utility: it can be packaged, stored and distributed in various forms. In this sense information has value, and the laws of economics can be applied to it.

c) State of Process: Information represents the state of an organism following the reception of energy from the environment in the form of a symbol or datum. Information processing reaches its highest known competence in the human being through the activities of the central nervous system, and electronic devices such as computers extend these capabilities.

d) Cognitive Process: Much of human behaviour can be seen as information processing. Thinking, memory, learning and perception are in fact functions of information processing. A. Turing proposed automata theory, and scholars have applied this theory to the study of behaviour. According to these studies, information can be considered a process intrinsic to the activities of all organisms, one that can be replicated by machines.