We often associate the word "information" with both facts and the communication of facts. However, according to "The Origins of Information Theory", information in this sense has little to do with the concept of information as used in information theory and computer science (Pierce, 1961). For example, Claude Shannon, who worked at Bell Telephone Laboratories, formulated the classic concept of information as an expression of the volume of information carried by a telephone network. The Shannon-Weaver model proposes that all communication must include six elements, namely source, encoder, message, channel, decoder, and receiver. As Shannon was researching in the field of information theory, his model was initially very technology-oriented. The model was published in 1949, and you will immediately see its similarity to the Lasswell Formula:
As you can see, the emphasis here is very much on the transmission and reception of information. The term "information" is understood rather differently from the way we would normally use it. Still, this model is often referred to as an "information model" of communication. Apart from its obvious technological bias, a further drawback from our point of view is the model's linearity: it looks at communication as a one-way process. That is remedied by the addition of the feedback loop that you can see in the developed version of the model:
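To make those six elements concrete, here is a minimal sketch in Python (my own illustration; the function names `transmit`, `channel` and `feedback` are assumptions, not part of the model itself):

```python
def channel(signal: bytes) -> bytes:
    """The channel: an ideal, noiseless one simply passes the signal through."""
    return signal

def transmit(message: str) -> str:
    """Source -> encoder -> channel -> decoder -> receiver."""
    signal = message.encode("utf-8")        # encoder: message -> signal
    received_signal = channel(signal)       # channel: may distort the signal
    return received_signal.decode("utf-8")  # decoder: signal -> message

def feedback(received: str, sent: str) -> bool:
    """The feedback loop of the developed model: the receiver
    confirms whether the message arrived intact."""
    return received == sent

sent = "HELLO"
received = transmit(sent)
print(received, feedback(received, sent))  # -> HELLO True
```

With a noiseless channel the message is reproduced exactly; distorting `channel` would be the place to model noise.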
While the model is fine for discussing the transmission of "information", problems arise when we try to apply it to human communication, because it assumes that meanings are somehow contained within the message.
Shannon thought that:
"the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages"
As in most of information theory and computer science, Shannon's concept is purely technical and does not take the content or meaning of information into account. His interest was in whether the character "A" sent is the same character "A" received. This is why his model suits computers, where binary 1s and 0s are all that define a message. By extension, while software decodes the data received, it is entirely up to the person to interpret the information on screen.
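Shannon's point about the character "A" can be illustrated with a short Python sketch (my example, not from the text): at the engineering level, success means every bit of the "A" survives, and a single flipped bit yields a different character entirely.

```python
def to_bits(char: str) -> str:
    """Encode a character as its 8-bit binary representation."""
    return format(ord(char), "08b")

def from_bits(bits: str) -> str:
    """Decode an 8-bit string back into a character."""
    return chr(int(bits, 2))

sent = to_bits("A")
print(sent)                  # '01000001'
print(from_bits(sent))       # 'A' -- exact reproduction: the engineering goal

# Flip just the last bit and the "same" message decodes differently:
corrupted = sent[:-1] + ("1" if sent[-1] == "0" else "0")
print(from_bits(corrupted))  # '@'
```

Whether the received "A" then means "excellent grade" or "first item" is, as Shannon said, irrelevant to the engineering problem.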
This information is subject to various other factors that can affect its meaning, and these are not covered by the model. There are two entirely distinct levels to the message. One level is what is on the computer screen, be it words, numbers, or images; that is information. The other level is how humans interpret what they see on the screen; that is knowledge. To a human being, an "A" sent and received may be interpreted as anything he or she deems reasonable. It sparks off a whole series of associations, depending on the context, earlier experiences, and the emotional well-being of the interpreter.
On the whole, I find it interesting that while we study information theory in order to improve data rates and accuracy in communication media (that is, to reduce entropy, or uncertainty), we still cannot escape the reality that one person's interpretation is never quite the same as another's. Even so, readings in the field of knowledge management have shown that setting up or creating a context (or providing metadata) for information helps the sender and receiver to frame what is sent or received. Such context can come in the form of a story (e.g. a conversation), or through the use of multimedia such as images, videos and sounds. We might never share identical meanings, but I believe that, given the current state of technology (e.g. story-telling through blogs), we can get really close.
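The entropy mentioned above has a precise form in Shannon's theory, H = -Σ p·log2(p); a quick Python sketch (my own illustration) shows how uncertainty falls as outcomes become more predictable:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally uncertain
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable
print(shannon_entropy([1.0]))       # 0.0 bits: a certain outcome carries no surprise
```

Note that this measures only statistical uncertainty over the set of possible messages; it says nothing about what any message means to its reader, which is exactly the gap discussed above.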
Pierce, J. R. (1961). The origins of information theory; A mathematical model; Encoding and binary digits. In Symbols, signals, and noise: The nature and process of communication (pp. 19-77). New York: Harper & Row.
Wiener, N. (1961). On learning and self-reproducing machines. In Cybernetics, or control and communication in the animal and the machine (2nd ed., pp. 169-180). Cambridge, MA: MIT Press.
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379-423, 623-656.
Shannon, C. E., & Weaver, W. (1949). The mathematical theory of communication. Urbana, IL: University of Illinois Press.