Wednesday, June 22, 2016

Information Theory Overview

Comment: Originally published August 2014. Some time ago I became interested in information theory, partly because of my career and mostly because I began seeing elements of the theory popping up everywhere: in movies, in theological commentaries, in warfighting, and more. I studied the theory off and on, purchasing books, watching movies, reading essays, and in general following wherever I caught a wisp of it. The interesting thing about truth is that it is self-evident and reveals itself in nature, so I did not have to look far. A curious thing about information, though, is noise: that which distracts, like a red herring, and there is plenty of noise out there. The point of this post is an overview of information theory. I would like to share the basics of the theory and relate them to the world around us. I will come back to this post to update and refine it with more citations.

Information Theory
by
JT Bogden, PMP

Information theory is a relatively new discipline rooted in probability theory. Like the core disciplines of mathematics and the sciences, information theory has a physical origin with broad-spectrum applications. The theory has captured the attention of researchers, spawning hundreds of research papers since its inception in the late 1940s and generating interest in deeper research involving biological systems, the genome, warfare, and many other topical arenas. Claude E. Shannon, Ph.D., the father of generalized information theory, developed the theory in 1948 and in essence theorized:

If the receiver is influenced by information from the transmitter, then reproducing the influencing information at the receiver is limited to a probabilistic outcome based on entropy.
Figure 1: Mathematical Formulation of Entropy (H) in a System: H = −Σᵢ pᵢ log₂(pᵢ)
Several terms in the thesis statement may be difficult to grasp, and the mathematical formulation in Figure 1 may be overwhelming for anyone wondering how entropy and information are linked. Entropy is the operative concept behind diminishing returns, or the rate at which a system dies, decays, or falls apart. Entropy follows the order formulated in Figure 1; the phenomenon is not random. Within the context of information theory, entropy is the minimum size of a message before its meaning or value is lost. The notion of probabilistic outcomes involves multiple possible results, each carrying a degree of uncertainty, a possibility that the result may or may not occur. For example, a roll of a die is limited to six possible outcomes. The probability of any one outcome occurring is 1 in 6, and the uncertainty is high: there is a 5 in 6 chance that any specific outcome will not occur. As for the mathematical formulation itself, I will leave it as general knowledge of what it looks like.
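Still, the formula in Figure 1 is easy to evaluate. Here is a minimal sketch in Python that applies it to the die example; the `entropy` helper is my own illustrative name, not part of any library.

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair six-sided die: six equally likely outcomes.
fair_die = [1/6] * 6
print(entropy(fair_die))    # ~2.585 bits, i.e., log2(6)

# A loaded die is more predictable, so its entropy is lower.
loaded_die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
print(entropy(loaded_die))  # ~2.161 bits
```

The more predictable the source, the fewer bits its messages carry, which is exactly the link between entropy and information.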

The thesis points toward a 'lossy' system and suggests a simple communication model, Figure 2.
Figure 2: Simple Information Theory Model
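To make the lossy model concrete, here is a toy sketch of my own (not from Shannon's paper) of a binary symmetric channel: each transmitted bit flips with some probability, so the receiver can reconstruct the transmitter's message only probabilistically.

```python
import random

def to_bits(text):
    """Encode text as a list of bits, 8 bits per character."""
    return [int(b) for ch in text for b in format(ord(ch), '08b')]

def from_bits(bits):
    """Decode a list of bits back into text."""
    chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return ''.join(chr(int(''.join(map(str, c)), 2)) for c in chunks)

def noisy_channel(bits, flip_probability):
    """Binary symmetric channel: each bit flips with the given probability."""
    return [b ^ 1 if random.random() < flip_probability else b for b in bits]

sent = "HELLO"
received = from_bits(noisy_channel(to_bits(sent), flip_probability=0.02))
print(sent, "->", received)  # e.g. HELLO -> HELLG (varies run to run)
```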
From the thesis, formula, and model, more complex theories and models spawn, coupling information theory to biology, quantum physics, electronic communications, crowds, and many other subjects. All fall back on entropy, the smallest a message can get before it loses its meaning. The big question is: so what? We will explore the 'so what' in the next section.

Information Theory Around Us

Most people fail to realize that information theory impacts us on an everyday basis. Aspects of the theory appear in movies, underpin all biological sensory capabilities, and surface in information networks in many ways. Many people philosophize that human and natural existence is purely information based. Let us explore information theory as it is exposed to most people, who have some familiarity with the sciences, movies, and religion. We begin with a survey of the sciences.

Atom-smashing experiments during the 1970s led to the discovery that the universe has boundary limits. Physicist Richard Feynman, the father of quantum computing, concluded that matter ceases to exist at around 10^-32 meters. When matter ceases to exist, so does space-time: matter has dimensions, and time's arrow spans dimensionality, so when matter no longer exists, neither do dimensionality and time. What remains are non-local waveforms, electromagnetic waves illustrated as strings that vibrate. The region where this occurs is the Planckian realm, where matter is quantized, or discrete, having the qualities of a bit of information. Matter and energy are interchangeable, based on the theory of relativity, Figure 3, and the wave-particle theory of light. The vibrating waveforms in the Planckian realm slam together, in a process of compactification that is not fully understood, forming particles that have discrete size and weight and possess a positive (+), neutral (0), or negative (-) charge. These particles then assemble, in a computational, algorithmic manner based on charge and tri-state logic, into more complex particles, building from the sub-atomic into the physical realm. In the physical realm, complex molecules form, such as DNA, from which biological life emerges.
Figure 3: Theory of Relativity Formula
Energy = Mass × (Speed of Light)², or E = mc²
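As a quick worked example of the mass-energy relationship (my own numbers, chosen only for illustration), one kilogram of matter corresponds to an enormous amount of energy:

```python
mass_kg = 1.0                 # one kilogram of matter
speed_of_light = 299_792_458  # meters per second
energy_joules = mass_kg * speed_of_light ** 2
print(f"{energy_joules:.3e} J")  # ~8.988e+16 joules
```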
DNA is somewhat unique, according to zoologist and science writer Dr. Matt Ridley. Not only did a computational information process arrive at the DNA molecule, but the molecule carries information in a four-letter alphabet (G, C, A, and T), two bits per base, which is used by nanites to endow biological life. Nanites are intelligent molecular machines, built from amino acids and proteins, that perform work. These molecular machines have claws, impellers, and other instruments; they communicate, travel, and perform work based on DNA's informational instructions. The information process continues as even more information is applied to the DNA strand, such as variations in the timing, sequencing, and duration under which a gene fires. By varying the timing, sequencing, and duration of a firing gene, specific features of the life form under gestation are managed. Dr. Ridley quips that the genome is not a blueprint for life but a pattern maker's template, with some sort of Genome Operating Device, a G.O.D. (Ridley, 2003). The point here is that some sort of intelligent communication is ongoing during the endowment of life and the development of the natural universe, all of which is the outcome of computational processes and information.
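In information-theoretic terms, a four-symbol alphabet carries at most two bits per symbol. A minimal sketch of that idea, using an illustrative bit assignment of my own choosing:

```python
import math

# Each base is one of four symbols, so it can be packed into two bits.
BASE_TO_BITS = {'A': '00', 'C': '01', 'G': '10', 'T': '11'}

sequence = "GATTACA"
packed = ''.join(BASE_TO_BITS[base] for base in sequence)
print(packed)  # 10001111000100 -> 14 bits for 7 bases

# Maximum entropy per base: log2(4) = 2 bits.
print(math.log2(len(BASE_TO_BITS)))  # 2.0
```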

During the 1960s, extensive research was conducted into the operation of human biological sensory processes in relation to information theory. This research concluded that the senses of sight, sound, touch, smell, and taste undergo an electrochemical process in which information is encoded into electrical waveforms using Fourier transforms. The base Fourier equations are somewhat imposing, Figure 4.
Figure 4: Fourier Transform Equations: F(ω) = ∫ f(t)e^(−iωt) dt and f(t) = (1/2π) ∫ F(ω)e^(iωt) dω
The equations are shown only so the reader can see what they look like; extensive mathematical philosophy and practical understanding of how these equations perform are necessary to appreciate them. In lay terms, the Fourier transform equations encode and extract information embedded in a waveform. These waveforms are constructed by the biological sensory devices: eyes, ears, nose, tongue, and skin. Once the information is encoded into a waveform, the human brain stores it holographically. Consider the operation of the eyes, which are attached as part of the brain. The reason for two eyes is that they act symbiotically: one eye acts as a data source while the other acts as a reference source. When the waveforms from these two sources collide, information is encoded in the resulting constructive and destructive interference patterns. These patterns are imprinted into the brain's material to be recalled on demand, as humans think in terms of images, not attributes. The human brain is said to be capable of storing up to 6 terabytes of information. The eye also has a curious tie to the quantum realm, detecting a photon of light coincidental with the smallest instant of time, the Planck time, which is on the order of 10^-43 seconds. This leads to the concept of quantum reality, the idea that human perception is limited to the boundaries of the natural universe.
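To illustrate the claim that a Fourier transform can embed and recover information in a waveform, here is a small sketch of my own using NumPy's FFT (a toy example, not drawn from the research described above): two tones are mixed into one signal, and the transform recovers their frequencies.

```python
import numpy as np

# Build a one-second signal sampled at 1000 Hz containing two tones.
sample_rate = 1000
t = np.arange(sample_rate) / sample_rate
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# The FFT extracts the frequency content embedded in the waveform.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)

# The two strongest components come back out at 50 Hz and 120 Hz.
peaks = freqs[np.argsort(spectrum)[-2:]]
print(sorted(peaks))  # 50.0 and 120.0
```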

The human experience is said to be a digital simulation, and the universe computationally organized. This lends credence to the creative license of writers and authors who imagine storylines such as The Matrix, Timeline, The Lawnmower Man, and countless others.

References:

Knuth, D. (2001). Things a computer scientist rarely talks about. Stanford, CA: CSLI Publications.

Moser, S., & Chen, P. (2012). A student's guide to coding and information theory. Cambridge, UK: Cambridge University Press.

Reza, F. (1994). An introduction to information theory. New York: Dover Publications.

Ridley, M. (2003). Nature via nurture. New York: HarperCollins Publishers.
