Wednesday, June 22, 2016

Organizational Computational Architecture

Commentary: This brief was originally written in September 2012 and was updated and reposted on December 21, 2013. Many people think of computational capability as something a box full of silicon and circuits on their desk performs. Moreover, they tend to think of computations as linear steps. Rarely does anyone think of the human mind-brain as a processor. If it is thought of in this way, professionals and industry may be able to take advantage of the combination of mind-brain and machine processing in interesting ways.

Organizational Computational Architecture
by 
JT Bogden, PMP

Organizational computational power is the cumulative processing capability of an organization, including both machines and - humans? Machine computational power is well accepted and is typically expressed in instructions processed per unit time, such as cycles per second (Hz, MHz, GHz) or millions of instructions per second (MIPS). But human computational power is something of a different genre - or is it?

Human computational power has been somewhat awkward to pin down because the study of the brain precedes computer and information science; the information science paradigms were never part of the early research. Historical attempts have centered on intelligence and emotional quotients, which relate to learning or adaptability rates of the mind-brain based on the eight characteristic factors described in the groundbreaking book 'Frames of Mind' by Howard Gardner. Looking at the brain alongside machine-based processing, however, reveals many interesting commonalities. The brain has up to 10 quadrillion synapses, or transistor-like switches, and processes at rates up to about 1,680 GHz. The brain's storage capacity varies from 1 TB to 10 TB, or about 5 TB on average. The brain serves the function of circuitry. The mind has been an enigma until it is thought of as software, or as possessing method. Arguably the character of the mind, or the essence of the mind, is the individual human being - or in some circles, the soul. Thus, the mind-brain combination parallels the software-hardware paradigm.

Diet and exercise are keys to optimizing the performance of the brain. Brain-food diets affect the function of the brain, and brain exercises affect the method of the mind. For the purpose of this discussion, assume all humans process information at an equal average rate with equal storage capacity. Human computational power in organizations can therefore be determined much like machine computational power, using a mathematical formula based on how the human brains are related to each other, resulting in a throughput metric for the organization.
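As a rough illustration of that idea, the sketch below rolls human and machine processing elements into a single organizational throughput figure. The per-element ratings and the simple additive model are assumptions made for illustration, not measurements of any real staff or systems.

```python
# A minimal sketch of an organizational throughput estimate.
# The per-element ratings and the additive model are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ProcessingElement:
    name: str
    kind: str          # "human" or "machine"
    rate_gips: float   # assumed effective rate, billions of instructions per second

def organizational_throughput(elements):
    """Sum the effective rates of all elements (a deliberately simple model)."""
    return sum(e.rate_gips for e in elements)

staff = [ProcessingElement(f"analyst-{i}", "human", 1.0) for i in range(5)]
servers = [ProcessingElement(f"server-{i}", "machine", 3.5) for i in range(2)]

print(f"Estimated throughput: {organizational_throughput(staff + servers):.1f} GIPS")
```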

Organizational Throughput

A sage organization seeks to understand the full computational power on hand, then seeks to align and enhance that power according to its operations. Approaches such as knowledge management have been employed but often mean very different things to different organizations. In general, knowledge management has become more or less a catalog of what is known and where to put or store the knowledge once it becomes known. Processing power, on the other hand, is the horsepower to solve problems and determine the knowledge. Thus, an operational view of processing power can be used to determine the horsepower on hand.

Elemental computational power is organized in various ways: parallel, series, arrayed, and/or distributed. Parallel processing improves throughput, while processors in series improve dependent process performance. Arrayed processors are organized non-linearly in order to enhance complex decision-making performance. Distributed processing offloads processing demand to localized processors, conserving bandwidth, time, and load. An organization must determine how to organize not only the computers but also the human staff in order to achieve optimal throughput, also known as the operational tempo.
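A small sketch of how the arrangement itself changes throughput: in this simplified model, parallel elements add their rates, while a series pipeline is limited by its slowest stage. The rates are arbitrary numbers chosen for illustration.

```python
# Illustrative throughput for parallel vs. series arrangements.
# Assumption: parallel rates add; a series pipeline is gated by its slowest stage.

def parallel_throughput(rates):
    return sum(rates)

def series_throughput(rates):
    return min(rates)  # the bottleneck stage sets the pace

rates = [2.0, 3.0, 5.0]  # arbitrary units of work per hour
print("Parallel:", parallel_throughput(rates))  # 10.0
print("Series:  ", series_throughput(rates))    # 2.0
```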

Ideally, an organization should have a virtual backplane against which everything sits. In the ethereal space of this backplane, staff and systems can be organized to achieve optimized throughput and adapt to emergent conditions without disrupting the physical plane. In the virtual backplane, information exchanges occur and can be redirected and rearranged without causing disruptions in the physical realm. This has the effect of creating the stability necessary for humans to become self-actualizing and empowered. Humans who dynamically perform their work without interference are empowered. Self-actualization is the achievement of one's full potential through creativity, independence, spontaneity, and a grasp of the real world. In short, humans, while acting like processors, are granted the freedom to use their talents in solving business problems.
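One way to picture the virtual backplane is as a routing layer that can be rewired without touching the physical participants. The sketch below is a minimal publish/subscribe router; the topic names and subscribers are hypothetical and stand in for whatever exchanges an organization actually carries.

```python
# A minimal sketch of a virtual backplane as a publish/subscribe router.
# Topic and subscriber names are hypothetical.
from collections import defaultdict

class VirtualBackplane:
    def __init__(self):
        self.routes = defaultdict(list)   # topic -> list of subscriber callables

    def subscribe(self, topic, handler):
        self.routes[topic].append(handler)

    def rewire(self, topic, handlers):
        """Redirect a topic to new handlers without disturbing the publishers."""
        self.routes[topic] = list(handlers)

    def publish(self, topic, message):
        for handler in self.routes[topic]:
            handler(message)

bus = VirtualBackplane()
bus.subscribe("orders", lambda m: print("analyst sees:", m))
bus.publish("orders", "new order #1")
bus.rewire("orders", [lambda m: print("automated system sees:", m)])
bus.publish("orders", "new order #2")
```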

Final Thoughts

Organizations have generally paid little attention to the human mind-brain in the past, with the exception of human resource testing. This testing is generally not comprehensive and is temporally bound to the hiring process. Human resources have generally sought people with specific character traits. For example, the US Postal exam is actually an IQ test seeking people who can perform repetitive tasks. Organizations can benefit by better managing the human computational element.

Organizations should encourage better diets and offer brain exercises, with incentives for heightened brain function. Employers should baseline staff brain function and then seek to improve it. Through tracked exercises and regular assessments, organizations can observe the computational posture over time. This should feed the organization's ability to cope with and manage emergent conditions or change, which results in organizational experience.

The formation of experience starts with data. Data becomes information with context, and complex relationships among information become knowledge. Wisdom is an outgrowth of experience and results in quantum leaps of judgment when information and/or knowledge is missing. Managing the transformation of data into knowledge is knowledge management, which in turn seeks to convey the organizational experience to individuals without retraining. The computational architecture in which data is ultimately transformed into knowledge is based on the organization of computational elements, both human and machine. Dynamically organized computational elements are ideal when loosely coupled in the system and easily re-organized with emergent conditions. This construct facilitates experiences which can be tracked and recorded.
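The data-to-information-to-knowledge progression can be sketched as a loosely coupled pipeline in which each stage adds context or relationships. The stage functions below are placeholders for illustration, not a claim about how any particular knowledge management system works.

```python
# A minimal sketch of data -> information -> knowledge as loosely coupled stages.
# The transformations are illustrative placeholders.

def add_context(datum, context):
    """Data becomes information when paired with context."""
    return {"value": datum, "context": context}

def relate(information_items):
    """Information becomes knowledge when relationships are established."""
    return {"items": information_items,
            "relationship": "observations from the same process"}

data = [42, 47, 51]
information = [add_context(d, context="daily defect count") for d in data]
knowledge = relate(information)
print(knowledge)
```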

As the organization conducts routine business, the experiential history can act as a pattern match and alert humans when experiential conditions emerge again. Humans can then assess the circumstances and avoid making errant decisions again, or attempt an alternate solution to avoid a recursive trap that costs time and money. Two episodes of Star Trek lend themselves well to this discussion: 'Cause and Effect' and 'Time Squared'.
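Before turning to those episodes, here is a minimal sketch of that pattern-matching idea. The episode records and the simple overlap score are illustrative assumptions, not a prescription for how an organization's matching logic would actually be built.

```python
# A minimal sketch of matching emerging conditions against recorded experience.
# The episode records and overlap threshold are illustrative assumptions.

past_episodes = [
    {"conditions": {"vendor delay", "budget overrun"}, "lesson": "add schedule buffer"},
    {"conditions": {"staff turnover", "missed handoff"}, "lesson": "document transitions"},
]

def match_history(current_conditions, episodes, threshold=0.5):
    """Alert when current conditions overlap strongly with a past episode."""
    alerts = []
    for ep in episodes:
        overlap = len(current_conditions & ep["conditions"]) / len(ep["conditions"])
        if overlap >= threshold:
            alerts.append((overlap, ep["lesson"]))
    return sorted(alerts, reverse=True)

print(match_history({"vendor delay", "scope change"}, past_episodes))
```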

In 'Cause and Effect', the Enterprise is trapped in a recursive time loop. With each iteration of the loop, the crew recalls more about the experience until they finally recall enough to break the loop; it takes them 17 days to break the pattern. In 'Time Squared', the Enterprise is destroyed six hours into the future, and Captain Picard is cast back in time and out of phase. The Enterprise recursively experiences the same event until Captain Picard breaks the loop by preventing his temporally displaced self from completing it, injecting an alternative path in the form of surprise. In both episodes, the common theme is repeating the same experience. However, in one episode a little something is retained each time until enough is learned to break the recursive experience. In the other, little is retained, but through evidence and observation a decision is made to choose an alternate path and prevent a repeat.

What if, through a combination of machine and human processing, recurrent events would not repeat in an organization? Classically, governmental organizations cycle people every three years, and the same issues repeat every three years with different people. History repeats itself unless the historical experience is known and somehow recalled. Perhaps machine and human computational systems, combined, can provide the experience and the decision making if designed to recognize and act on experience. If not, they can at least create the circumstances in which surprise is possible, breaking recurring trends.

References:

Gardner, H. (1993). Frames of mind: The theory of multiple intelligences (10th anniversary ed.). New York: Basic Books.

Information Theory Overview

Comment: Originally published August 2014. Some time ago, I became interested in information theory, partly due to my career and mostly because I began seeing elements of the theory popping up everywhere: in movies, theological commentaries, warfighting, and so on. I studied the theory off and on, purchasing books, watching movies, reading essays, and in general following it wherever I caught a wisp of it. The interesting thing about truth is that it is self-evident and reveals itself in nature, so I did not have to look far. A curious thing about information, though, is noise - that which distracts, like a red herring - and there is plenty of noise out there. Anyhow, the point of this post is an information theory overview. I would like to share basic information theory and relate it to the world around us. I will be coming back to this post, updating and refining it with more citations.

Information Theory
by
JT Bogden, PMP

Information theory is relatively new and is part of probability theory. Like the core disciplines of mathematics and the sciences, information theory has a physical origin with broad-spectrum applications. The theory has captured the attention of researchers, spawning hundreds of research papers since its inception in the late 1940s and generating interest in deeper research involving biological systems, the genome, warfare, and many other topical arenas. Claude E. Shannon, Ph.D., is the father of generalized information theory, which he developed in 1948, and he theorized:

If the receiver is influenced by information from the transmitter, then reproducing the influencing information at the receiver is limited to a probabilistic outcome based on entropy.
Figure 1: Mathematical Formulation of Entropy (H) in a system
There are several terms in the thesis statement that may be difficult to grasp, and the mathematical formulation in Figure 1 may be overwhelming for readers who wonder how entropy and information are linked. Entropy is an operative concept behind diminishing returns, or the rate at which a system dies, decays, or falls apart. Entropy operates under the order formulated in Figure 1; thus, the phenomenon is not random. Within the context of information theory, entropy is the minimum size of a message before its meaning or value is lost. The notion of probabilistic outcomes involves multiple possible results, each carrying a degree of uncertainty - a possibility that the result may or may not occur. For example, rolling a die is limited to six possible outcomes. The probability of any one outcome occurring is 1 in 6, and the uncertainty is high: the odds are 5 in 6 that any specific outcome will not occur. As for the mathematical formulation, I will just leave that for general knowledge of what it looks like.
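To make the die example concrete, here is a small sketch that computes entropy using the standard Shannon formulation, H = -Σ p(x) log₂ p(x), which is what Figure 1 depicts. A fair die works out to about 2.585 bits per roll; a loaded die, being less uncertain, works out lower.

```python
# Shannon entropy H = -sum(p * log2(p)) applied to a six-sided die.
import math

def entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_die = [1 / 6] * 6
print(f"H(fair die) = {entropy(fair_die):.3f} bits")      # about 2.585 bits

loaded_die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
print(f"H(loaded die) = {entropy(loaded_die):.3f} bits")   # lower: less uncertainty
```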

The thesis points toward a 'lossy' system and suggests a simple communication model, Figure 2.
Figure 2: Simple Information Theory Model
From the thesis, formula, and model, more complex related theories and models spawn, coupling information theory to biology, quantum physics, electronic communications, crowds, and many other subject areas. All fall back on entropy, or the smallest message before it loses its meaning. The big question is: so what? We will explore the 'so what' in the next section.
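Before that, the 'lossy' character of the model in Figure 2 can be illustrated with a short simulation: a binary symmetric channel flips each transmitted bit with some probability, so the receiver can only reconstruct the message probabilistically. The flip probability and the message below are arbitrary choices for illustration.

```python
# A minimal sketch of a lossy channel: a binary symmetric channel
# that flips each transmitted bit with probability p.
import random

def transmit(bits, flip_probability=0.1, seed=0):
    rng = random.Random(seed)
    return [b ^ (rng.random() < flip_probability) for b in bits]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = transmit(message, flip_probability=0.2)
errors = sum(m != r for m, r in zip(message, received))
print("sent:    ", message)
print("received:", received)
print("bit errors:", errors)
```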

Information Theory Around Us

Most people fail to realize that information theory impacts us on an everyday basis. Aspects of the theory appear in movies, underpin all biological sensory capabilities, and appear in information networks in many ways. Many people philosophize that human and natural existence is purely information based. Let us explore information theory as it is exposed to many people, most of whom have some familiarity with the sciences, movies, and religion at some level. Let us begin with a survey of the sciences.

Atom-smashing experiments during the 1970s led to the discovery that the universe has boundary limits. Physicist Richard Feynman, the father of quantum computing, concluded that matter ceases to exist at 10^-32 meters. When matter ceases to exist, so does space-time: matter has dimensions, and time's arrow spans dimensionality, so when matter no longer exists, neither does dimensionality, and time is mutually inclusive. What remain are non-local waveforms, or electromagnetic waves, which are illustrated as strings that vibrate. The region where this occurs is the Planckian realm, where matter is quantized, or discrete, having the qualities of a bit of information. Matter and energy are interchangeable based on the theory of relativity, Figure 3, and the wave-particle theory of light. Those vibrating waveforms in the Planckian realm slam together in a process of compactness that is not fully understood, forming a particle having discrete size and weight and possessing a positive (+), neutral (0), or negative (-) charge. These particles then assemble in a computational, algorithmic manner based on charge and tri-state logic into more complex particles, from the sub-atomic into the physical realm. In the physical realm, complex molecules such as DNA form, from which biological life emerges.
Figure 3: Theory of Relativity Formula
E = mc² (Energy = Mass × (Speed of Light)²)
DNA is somewhat unique, according to Dr. Matt Ridley. Not only did a computational information process arrive at the DNA molecule, but the molecule carries a four-letter alphabet (G, C, A, and T), each base encoding up to two bits of information, which is used by nanites to endow biological life. Nanites are intelligent molecular machines, made of amino acids and proteins, that perform work. These molecular machines have claws, impellers, and other instruments. They communicate, travel, and perform work based on DNA informational instructions. The information process continues as even more information is applied to the DNA strand, such as variations in the timing, sequencing, and duration under which a gene fires. By varying the timing, sequencing, and duration of a firing gene, specific features are managed on the life form under gestation. Dr. Ridley quips that the genome is not a blueprint for life but instead a pattern maker's template having some sort of Genome Operating Device, a G.O.D. (Ridley, 2003). The point here is that some sort of intelligent communication is ongoing during the endowment of life and the development of the natural universe, all of which are the outcome of computational processes and information.
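As a back-of-the-envelope illustration of this information view of DNA: with a four-letter alphabet, each base can carry at most log₂(4) = 2 bits, so the maximum information content of a sequence is easy to compute. The sequence below is arbitrary and used only for the arithmetic.

```python
# Maximum information content of a DNA sequence under a four-letter alphabet.
# Each base carries at most log2(4) = 2 bits; the sequence is arbitrary.
import math

ALPHABET = "GCAT"
BITS_PER_BASE = math.log2(len(ALPHABET))   # 2.0

sequence = "GATTACAGCTAGCT"
print(f"{len(sequence)} bases x {BITS_PER_BASE:.0f} bits = "
      f"{len(sequence) * BITS_PER_BASE:.0f} bits maximum")
```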

During the 1960s, extensive research was conducted into the operation of human biological sensory processes in relation to information theory. The conclusion of this research was that the senses of sight, sound, touch, smell, and taste undergo an electrochemical process in which information is encoded into electrical waveforms using Fourier transforms. The base Fourier equations are somewhat ominous, Figure 4.
Figure 4: Fourier Transform Equations
The equations are shown simply so the reader can see what they look like; extensive mathematical background and a practical understanding of how they perform are necessary to appreciate them. In lay terms, Fourier transform equations encode and extract information embedded in a waveform. These waveforms are constructed from the biological sensory devices: eyes, ears, nose, tongue, and skin. Once the information is encoded into a waveform, the human brain stores it holographically. Consider the operation of the eyes, which are attached as part of the brain. The reason for two eyes is that they act symbiotically: one eye is a data source while the other acts as a reference source. When the waveforms from these two sources collide, information is encoded in the constructive and destructive interference patterns that result. These patterns are then imprinted into the brain material to be recalled on demand, as humans think in terms of images and not attributes. The human brain is capable of storing up to 6 terabytes of information. The eye has a curious tie to the quantum realm, detecting a photon of light within the smallest instance of time, Planck time, which is of the order of 10^-43 seconds. This leads to the concept of quantum reality, or the idea that human perception is limited to the boundaries of the natural universe.
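A small numerical sketch of the 'encode and extract' idea: a waveform is built from two tones, and a discrete Fourier transform recovers the embedded frequencies from the spectrum. The frequencies and sample rate are arbitrary, and NumPy's FFT stands in here for the continuous transforms pictured in Figure 4.

```python
# Encoding information as frequencies in a waveform, then extracting it
# with a discrete Fourier transform. Frequencies and sample rate are arbitrary.
import numpy as np

sample_rate = 1000                       # samples per second
t = np.arange(0, 1, 1 / sample_rate)     # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)

# Recover the embedded frequencies from the two strongest spectral peaks.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print("Embedded frequencies (Hz):", sorted(peaks))   # approximately [50.0, 120.0]
```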

The human experience is said to be a digital simulation, and the universe computationally organized. This lends credence to the creative license of writers and authors who imagine storylines such as The Matrix, Timeline, The Lawnmower Man, and countless others.

References:

Knuth, D. (2001). Things a computer scientist rarely talks about. Stanford: CSLI Publications.

Moser, S., & Chen, P. (2012). A student's guide to coding and information theory. Cambridge: Cambridge University Press.

Reza, F. (1994). An introduction to information theory. New York: Dover Publications.

Ridley, M. (2003). Nature via nurture. New York: HarperCollins Publishers.