Wednesday, July 20, 2016

Using Artificial Intelligence In Business

Comment:  This brief was originally written in October 2009 and was updated and posted again on December 21, 2013. Many movies and books have dramatized the use of Artificial Intelligence (AI), delivering social messages of various sorts. However, the use of AI methods and techniques in business is often overlooked. This post will discuss AI basics. I may expand this post into a series on AI since the world has entered a time of increasing use of AI as well as biological and quantum computing, among other emerging technologies. My estimation is that many technologies will be converging rapidly in the near term. Ray Kurzweil, famed author of the book "The Age of Spiritual Machines", speaks of a technological singularity that he believes will occur in the next 10 years. The technological singularity he discusses is the convergence of principally three technologies that results in machines and/or humanoids having greater-than-human intelligence. That technology field is called trans-humanism, Image 1.


Image 1: Trans-humanism Symbol
Artificial Intelligence In Business
by 
JT Bogden, PMP

When most people think of AI they think of an all-knowing computer system. However, realistic designs must be formally constrained or have boundary limits set due to processing power, memory, and temporal limitations in traditional computational systems. Also, decision-making is often multi-layered, having some sort of fail-safe or fallback. Hence, there is a simplicity to the design that is wrapped around a Theory of Mind (ToM), which humanizes the system. Ultimately, the system interacts and behaves in an intelligent, human-centric manner.

An AI engine, Figure 1, is object oriented and receives inputs from its environment or from other sources such as a database of project or operational data. Any AI engine is limited in its ability to make inferences by the information delivered to its inputs from sensory devices and other input equipage. Thus, the data requirements need to be carefully considered as part of the design constraints.
Figure 1:  AI Engine Model
In coding the elements of the engine, the concept of the finite-state machine (FSM) is used as an organizational tool to break the problem set into manageable sub-problems. An FSM has a data structure reflecting the states inherent to the machine, the input conditions, and the transition functions. Another type of machine is the fuzzy-state machine (FuSM), which handles partial truths. A FuSM does not maintain a single defined state but instead computes activation levels, and the overall state is determined by the combination of activated states. Skeletal code wraps the object classes of FSMs and FuSMs into a management system unique to the engine. FSMs and FuSMs can be message, event, data, or inertia driven. State machines lend flexibility and scalability to the intelligence system and are more complex than this brief paragraph suggests.
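As a rough illustration of the FSM concept (the states, events, and transition table below are invented for the example, not taken from any particular engine), a minimal sketch in TypeScript might look like this:

```typescript
// Minimal finite-state machine sketch. States, events, and transitions are illustrative.
type State = "Idle" | "Patrol" | "Investigate" | "Alert";
type Event = "noiseHeard" | "targetSeen" | "areaClear";

// Transition table: current state + input event -> next state.
const transitions: Record<State, Partial<Record<Event, State>>> = {
  Idle:        { noiseHeard: "Investigate", targetSeen: "Alert" },
  Patrol:      { noiseHeard: "Investigate", targetSeen: "Alert" },
  Investigate: { targetSeen: "Alert", areaClear: "Patrol" },
  Alert:       { areaClear: "Patrol" },
};

class FiniteStateMachine {
  constructor(public state: State = "Idle") {}

  // Apply an input event; stay in the current state if no transition is defined.
  handle(event: Event): State {
    this.state = transitions[this.state][event] ?? this.state;
    return this.state;
  }
}

// Usage: drive the machine with a stream of events.
const fsm = new FiniteStateMachine("Patrol");
["noiseHeard", "targetSeen", "areaClear"].forEach(e =>
  console.log(fsm.handle(e as Event)) // Investigate, Alert, Patrol
);
```

A FuSM would differ by keeping an activation level for each state instead of one current state, with the overall result formed from the combination of activated states.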

Another framework is an algorithmic model. In the computational theory of the universe, the natural universe is an irreversible algorithmic computation reflecting time's arrow. Algorithmic approaches are useful for rule-based and probabilistic processing with limited sets of outcomes, such as the roll of a die. There are only six possible outcomes using one die, and the algorithm models the probabilistic outcome of the roll, namely that one of the six faces will show.
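A minimal sketch of that kind of probabilistic, limited-outcome processing, using the die example, could look like the following (purely illustrative):

```typescript
// Probabilistic outcome model for a single six-sided die (illustrative sketch).
const faces = [1, 2, 3, 4, 5, 6];

// Each face is equally likely: P(face) = 1/6.
const probabilityOfFace = 1 / faces.length;

// Simulate one roll as a rule-based algorithm with a limited outcome set.
function rollDie(): number {
  return faces[Math.floor(Math.random() * faces.length)];
}

console.log(`P(any single face) = ${probabilityOfFace.toFixed(3)}`); // 0.167
console.log(`Rolled: ${rollDie()}`);
```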

The ultimate AI framework is the neural network (a neural net). Natural neural nets form the structure of animal and human brains. AI reflects this brain activity using input, hidden, and output nodes to process information. Input nodes gather information with some basic processing; thus, an input node can be some sort of sensory device or even a neugent or agent of some kind. Hidden, networked nodes do more advanced processing, routing the processed information and forming lines or pathways of logic. Output nodes usually format the processed information or result in some sort of action. Output nodes can be displays, motors, servos, actuators, voice, or a host of other devices. A node is said to fire when its inputs match the states and circumstances for processing. Outputs from several nodes can be the inputs to another node or set of nodes. The complex pathway through a neural net that results in an output is a line of logic. A valid line of logic is said to be a truth, whereas an invalid line of logic is considered untrue. Neural nets have their place in complex, real-time problem solving because the nodal network conducts parallel processing, evaluating multiple outcomes simultaneously.
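To make the node idea concrete, here is a minimal feedforward sketch with two input values, two hidden nodes, and one output node. The weights and biases are arbitrary illustrative numbers, not a trained network:

```typescript
// Minimal feedforward neural network sketch: input -> hidden -> output.
const sigmoid = (x: number) => 1 / (1 + Math.exp(-x));

// A node "fires" by weighting its inputs, adding a bias, and applying an activation.
function nodeOutput(inputs: number[], weights: number[], bias: number): number {
  const sum = inputs.reduce((acc, x, i) => acc + x * weights[i], bias);
  return sigmoid(sum);
}

// Two input values (e.g., sensor readings), two hidden nodes, one output node.
function forwardPass(inputs: [number, number]): number {
  const hidden = [
    nodeOutput(inputs, [0.4, -0.6], 0.1),  // hidden node 1
    nodeOutput(inputs, [0.7, 0.2], -0.3),  // hidden node 2
  ];
  // The output node consumes the hidden activations; the path of strongly
  // activated nodes forms one "line of logic" through the net.
  return nodeOutput(hidden, [1.2, -0.8], 0.05);
}

console.log(forwardPass([0.9, 0.3])); // a value between 0 and 1
```

In a real system the weights would be learned from data rather than fixed by hand.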

Using these techniques in an operation to compute outcomes and discern knowledge is exceptionally useful in marketplace competition. Using a complex adaptive architecture, the organizational structure can form a neural net processing knowledge and lines of logic if properly designed. In this case, the World Wide Web becomes a medium for processing lines of logic across geospatially dispersed nodes. Some nodes can be human supervised while other nodes are automated. In essence, the entire organization becomes a brain exhibiting life-like qualities.

AI constructs can be employed in projects to monitor for emerging outcomes and to assist by algorithmically computing outcomes, then correcting for issues before they emerge. Data feeds into the AI engine can monitor schedules and resources as well as change requests for adherence to the project portfolio. AI is particularly useful in identifying emerging issues, or partial truths, before they become real issues.

AI will not become fully useful to organizations and businesses until they have evolved and formed structures supportive of AI. Only if organizations adopt constructs that reflect the natural environment rather than human-imposed structures will they be able to adapt to change and responsibly leverage emerging conditions with greater simplicity.

Wednesday, June 22, 2016

Organizational Computational Architecture

Commentary:  This brief was originally written in Sep 2012 and was updated and reposted on Dec 21, 2013. Many people think of computational capability as something that a box full of silicon and circuits on their desk performs. Moreover, they tend to think of computations as linear steps. Rarely does anyone think about the human mind-brain as a processor. If thought of in this way, professionals and industry may be able to take advantage of the combination of mind-brain and machine processing in interesting ways.

Organizational Computational Architecture
by 
JT Bogden, PMP

Organizational computational power is the cumulative processing capability of an organization that includes both machines and, yes, humans. Machine computational power is well accepted and is expressed in terms of instructions processed per unit time, such as cycles per second (Hz, MHz, GHz) or millions of instructions per second (MIPS). But human computational power is something of a different genre, or is it?

Human computational power has been somewhat awkward to pin down because the study of the brain precedes computer and information science; thus, information science paradigms were never part of the early research. Historical attempts have centered on intelligence and emotional quotients. These are related to the learning or adaptability rates of the mind-brain based on eight characteristic factors from the groundbreaking book 'Frames of Mind' by Howard Gardner. However, looking at the brain and machine-based processing side by side, there are many interesting commonalities. The brain has up to 10 quadrillion synapses, or transistor-like switches, and processes at rates up to about 1,680 GHz. Brain storage capacity varies from 1 TB to 10 TB, or about 5 TB of information on average. The brain serves the function of the circuits. The mind has been an enigma unless it is thought of as software, or as possessing method. Arguably the character of the mind, or the essence of the mind, is the individual human being or, in some circles, the soul. Thus, the mind-brain combination parallels the software-hardware paradigm.

Diet and exercise are keys to optimizing the performance of the brain. Brain-food diets affect the function of the brain, and brain exercises affect the method of the mind. For the purpose of this discussion, assume all humans process information at an equal average rate with equal storage capacity. Therefore, human computational power in organizations can be determined much like machine computational power, using a mathematical formula based on how the human brains are related to each other, resulting in a throughput metric for the organization.

Organizational Throughput

A sage organization seeks to understand the full computational power on hand and then seeks to align and enhance that power according to its operations. Approaches such as knowledge management have been employed but often have very different meanings to different organizations. In general, knowledge management has become more or less a catalog of what is known and of where to put or store the knowledge once it becomes known. Processing power, on the other hand, is the horsepower to solve problems and determine the knowledge. Thus, an operational view of processing power can be utilized to determine the horsepower on hand.

Elemental computational power is organized in various ways: parallel, series, arrayed, and/or distributed. Parallel processing improves throughput, while processors in series improve dependent process performance. Arrayed processors are organized non-linearly in order to enhance complex decision-making performance. Distributed processing is designed to offload processing demand to localized processors, conserving bandwidth, time, and load. An organization must determine how to organize not only the computers but also the human staff in order to achieve optimal throughput, also known as the operational tempo.
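As a simplified illustration of how organization affects throughput (the figures, and the rule that parallel rates add while a series pipeline is bounded by its slowest stage, are deliberate simplifications rather than a formal metric), consider:

```typescript
// Simplified throughput model: parallel elements add their processing rates;
// a series pipeline is limited by its slowest stage.
function parallelThroughput(ratesPerHour: number[]): number {
  return ratesPerHour.reduce((sum, r) => sum + r, 0);
}

function seriesThroughput(ratesPerHour: number[]): number {
  return Math.min(...ratesPerHour);
}

// Example: three analysts working independent cases vs. a three-step review chain.
const analysts = [12, 10, 14];             // cases per hour, hypothetical figures
console.log(parallelThroughput(analysts)); // 36 cases/hour
console.log(seriesThroughput(analysts));   // 10 cases/hour, bounded by the slowest step
```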

Ideally, an organization should have a virtual backplane against which everything sits. In the ethereal space of this backplane, staff and systems can be organized to achieve optimized throughput and to adapt to emergent conditions without disrupting the physical plane. In the virtual backplane, information exchanges occur and can be redirected and rearranged without causing disruptions in the physical realm. This has the effect of creating the stability necessary for humans to become self-actualizing and empowered. Humans who dynamically perform their work without interference are empowered. Self-actualization is the achievement of one's full potential through creativity, independence, spontaneity, and a grasp of the real world. In short, humans, while acting like processors, are granted the freedom to utilize their talents in solving business problems.

Final Thoughts

Organizations have generally paid little attention to the human mind-brain in the past, with the exception of human resource testing. This testing is generally not comprehensive and is temporally bound to the hiring process. Human resources have generally sought people with specific character traits. For example, the US Postal exam is actually an IQ test through which the service seeks people who can perform repetitive tasks. Organizations can benefit by better managing the human computational element.

Organizations should encourage better diets and offer brain exercises, with incentives for heightened brain function. Employers should baseline staff brain function and then seek to improve it. Through tracked exercises and regular assessments, organizations can observe the computational posture over time. This should feed the organization's ability to cope with and manage emergent conditions or change, which results in organizational experience.

The formation of experience starts with data. Data becomes information with context, and complex information relationships become knowledge. Wisdom is an outgrowth of experience and results in quantum leaps of judgment when there is missing information and/or knowledge. Managing the transformation of data into knowledge is knowledge management, which in turn seeks to convey the organizational experience to individuals without retraining. The computational architecture in which data is ultimately transformed into knowledge is based on the organization of computational elements, both human and machine. Dynamically organized computational elements are ideal when loosely coupled in the system and easily re-organized with emergent conditions. This construct facilitates experiences which can be tracked and recorded.

As the organization conducts routine business, the experiential history can act as a pattern match and alert humans when experiential conditions are emerging again. Humans can then assess the circumstances and avoid making errant decisions again, or attempt an alternate solution to avoid a recursive trap that costs time and money. Two episodes of Star Trek lend themselves well to this discussion: Cause and Effect and Time Squared.

In Cause and Effect, the Enterprise is trapped in a recursive time loop. With each iteration of the loop, the crew recalls more about the experience until they finally recall enough to break the loop. It takes them 17 days to break the pattern. In Time Squared, the Enterprise is destroyed 6 hours into the future, and Captain Picard is cast back in time and out of phase. The Enterprise recursively experiences the same event until Captain Picard breaks the loop by preventing his temporally displaced self from completing it. He injects an alternative path in the form of surprise. In both episodes, the common theme of repeating the same experience occurred. However, in one episode a little something was retained each time until enough was learned to break the recursive experience. In the other episode little was retained, but through evidence and observation a decision was made to choose an alternate path and prevent a repeat.

What if, through a combination of machine and human processing, recurrent events would not repeat in an organization? Classically, governmental organizations cycle people every 3 years, and the same issues repeat every 3 years with different people. History repeats itself unless the historical experience is known and somehow recalled. Perhaps machine and human computational systems combined can provide the experience and decision making if designed to recognize and act on experience. If not, they can create the circumstances in which surprise is possible, breaking recurring trends.

References:

Gardner, H. (1993). Frames of mind: The theory of multiple intelligences (10th ed.). New York: Basic Books.

Information Theory Overview

Comment: Originally published August 2014. Some time ago, I became interested in information theory, partly due to my career and mostly because I began seeing elements of the theory popping up everywhere in movies, theological commentaries, warfighting, etc. I studied the theory off and on, purchasing books, watching movies, reading essays, and in general following wherever I caught a wisp of the theory. The interesting thing about truth is that it is self-evident and reveals itself in nature, so I did not have to look far. Although, a curious thing about information is noise, or that which is distracting, like a red herring, and there is plenty of noise out there. Anyhow, the point of this post is an information theory overview. I would like to share basic information theory and relate it to the world around us. I will be coming back to this post, updating and refining it with more citations.

Information Theory
by
JT Bogden, PMP

Information theory is relatively new and is part of probability theory. Like the core disciplines of mathematics and the sciences, information theory has a physical origin with broad-spectrum applications. The theory has captured the attention of researchers, spawning hundreds of research papers since its inception during the late 1940s. It has generated interest in deeper research involving biological systems, the genome, warfare, and many other topical arenas. Claude E. Shannon, Ph.D., the father of generalized information theory, developed it during 1948 and theorized:

If the receiver is influenced by information from the transmitter then reproducing the influencing information at the receiver is limited to a probabilistic outcome based on entropy. 
Figure 1: Mathematical Formulation of Entropy (H) in a system
There are several terms in the thesis statement that may be difficult to grasp, and the mathematical formulation, Figure 1, may be overwhelming for some people, who wonder how entropy and information are linked. Entropy is the operative concept behind diminishing returns, or the rate at which a system dies, decays, or falls apart. Entropy operates under the order formulated in Figure 1; thus, the phenomenon is not random. Within the context of information theory, entropy is the minimum size of a message before its meaning or value is lost. The notion of probabilistic outcomes involves multiple possible results in which each result has a degree of uncertainty, or a possibility that the result may or may not occur. For example, rolling a die is limited to only six possible outcomes or results. The probability of any one outcome occurring is 1 in 6. The uncertainty in rolling the die is high, being 5 in 6 that any specific outcome will not occur. As for the mathematical formulation, I will just leave that for general knowledge of what it looks like.
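For readers who do want to see the formulation in action: the standard Shannon entropy is H = -Σ pᵢ log₂(pᵢ), measured in bits, and a small sketch computing it for the die example looks like this (the loaded-die probabilities are made up for contrast):

```typescript
// Shannon entropy H = -Σ p_i * log2(p_i), in bits.
function entropy(probabilities: number[]): number {
  return -probabilities
    .filter(p => p > 0)                      // zero-probability terms contribute 0; log2(0) is undefined
    .reduce((h, p) => h + p * Math.log2(p), 0);
}

// A fair six-sided die: six equally likely outcomes.
const fairDie = Array(6).fill(1 / 6);
console.log(entropy(fairDie).toFixed(3));    // ≈ 2.585 bits per roll

// A loaded die is more predictable, so each roll carries less information.
const loadedDie = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1];
console.log(entropy(loadedDie).toFixed(3));  // ≈ 2.161 bits per roll
```

The loaded die is more predictable, so each roll carries less information, which is the sense in which entropy bounds how small a message can be made before meaning is lost.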

The thesis points towards a 'lossy' system and promotes a simple communication model, Figure 2.
Figure 2: Simple Information Theory Model
From the thesis, formula, and model, more complex related theories and models spawn, coupling information theory to biology, quantum physics, electronic communications, crowds, and many other subject areas. All fall back on entropy, or the smallest message before it loses its meaning. The big question is: so what? We will explore the 'so what' in the next section.

Information Theory Around Us

Most people fail to realize that information theory impacts us on an everyday basis. Aspects of the theory appear in movies, underpin all biological sensory capabilities, and appear in information networks in many ways. Many people philosophize that human and natural existence is purely information based. Let us explore information theory as it is exposed to many people. Most people have some familiarity with the sciences at some level, with movies, and with religion. Let us begin with a survey of the sciences.

Atom-smashing experiments during the 1970s led to the discovery that the universe has boundary limits. Physicist Richard Feynman, the father of quantum computing, concluded that matter ceases to exist at 10^-32 meters. When matter ceases to exist, so does space-time. Matter has dimensions, and time's arrow spans dimensionality; when matter no longer exists, neither does dimensionality, and time is mutually inclusive. What remains are non-local waveforms, or electromagnetic waves, which are illustrated as strings that vibrate. The region where this occurs is the Planckian realm, where matter is quantized or discrete, having the qualities of a bit of information. Matter and energy are interchangeable based on the Theory of Relativity, Figure 3, and the wave-particle theory of light. Those vibrating waveforms in the Planckian realm slam together in a process of compactness that is not fully understood, forming a particle having discrete size and weight and possessing a positive (+), neutral (0), or negative (-) charge. These particles then begin to assemble in a computational, algorithmic manner based on charge and tri-state logic into more complex particles, from the sub-atomic into the physical realm. In the physical realm, complex molecules form, such as DNA, from which biological life emerges.
Figure 3:  Theory of Relativity Formula
Energy = Mass × (Speed of Light)²
DNA is somewhat unique according to microbiologist Dr. Matt Ridley. This is because not only did a computational information process arrive at the DNA molecule, but injected into the DNA molecule are four symbols of information (G, C, A, and T) which are used by nanites to endow biological life. Nanites are intelligent molecular machines that perform work and are made of amino acids and proteins. These molecular machines have claws, impellers, and other instruments. They communicate, travel, and perform work based on DNA informational instructions. The information process continues as even more information is applied to the DNA strand, such as variation in the timing, sequencing, and duration under which a gene fires. By varying the timing, sequencing, and duration of a firing gene, specific features are managed on the life form under gestation. Dr. Ridley quips that the genome is not a blueprint for life but instead a pattern maker's template having some sort of Genome Operating Device, a G.O.D. (Ridley, 2003). The point here is that there is some sort of intelligent communication ongoing during the endowment of life and the development of the natural universe, all of which are the outcome of computational processes and information.

During the 1960s, extensive research was conducted into the operation of human biological sensory processes in relation to information theory. This research concluded that the senses of sight, sound, touch, smell, and taste undergo an electrochemical process in which information is encoded into electrical waveforms using Fourier transforms. The base Fourier equations are somewhat ominous, Figure 4.
Figure 4: Fourier Transform Equations
The equations are shown simply so the reader can see what they look like; extensive mathematical philosophy and practical understanding are necessary to fully appreciate them. In lay jargon, Fourier transform equations encode and extract information embedded in a waveform. These waveforms are constructed from the biological sensory devices: eyes, ears, nose, tongue, and skin. Once the information is encoded into a waveform, the human brain stores the information holographically. Consider the operation of the eyes, attached as part of the brain. The reason for two eyes is that they act symbiotically: one eye is a data source while the other eye acts as a reference source. When the waveforms from these two sources collide, information is encoded in the constructive and destructive patterns that result. These patterns are then imprinted into the brain material to be recalled on demand, as humans think in terms of images and not attributes. The human brain is capable of storing up to 6 terabytes of information. The eye has a curious tie to the quantum realm, detecting a photon of light coincidental with the smallest instance of time, Planck's time, which is on the order of 10^-43 seconds. This leads to the concept of quantum reality, or the idea that human perception is limited to the boundaries of the natural universe.
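To show in a small way what "extracting information embedded in a waveform" means, here is a naive discrete Fourier transform sketch. It is a deliberately simple O(N²) version, not the biological mechanism and not an optimized FFT, and the two-tone test signal is invented for the example:

```typescript
// Naive discrete Fourier transform: decompose N samples into frequency components.
// X[k] = Σ_{n=0}^{N-1} x[n] * e^(-i 2π k n / N), returned here as magnitudes.
function dftMagnitudes(samples: number[]): number[] {
  const N = samples.length;
  const magnitudes: number[] = [];
  for (let k = 0; k < N; k++) {
    let re = 0, im = 0;
    for (let n = 0; n < N; n++) {
      const angle = (-2 * Math.PI * k * n) / N;
      re += samples[n] * Math.cos(angle);
      im += samples[n] * Math.sin(angle);
    }
    magnitudes.push(Math.sqrt(re * re + im * im));
  }
  return magnitudes;
}

// A waveform "encoding" two pieces of information: a strong 3-cycle tone and a weaker 7-cycle tone.
const N = 64;
const signal = Array.from({ length: N }, (_, n) =>
  Math.sin((2 * Math.PI * 3 * n) / N) + 0.5 * Math.sin((2 * Math.PI * 7 * n) / N)
);

const spectrum = dftMagnitudes(signal);
console.log(spectrum[3].toFixed(1), spectrum[7].toFixed(1)); // peaks at bins 3 and 7
```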

The human experience is said to be a digital simulation and the universe is computationally organized.  This lends credence to the creative license of writers and authors who imagine storylines such as The Matrix, Timeline, The Lawn Mower Man and countless others.

References:

Knuth, D. (2001). Things a computer scientist rarely talks about. Stanford: CSLI Publications.

Moser, S. and Chen, P. (2012). A student's guide to coding and information theory. Cambridge: Cambridge University Press.

Reza, F. (1994). Introduction to information theory. New York: Dover Books.

Ridley, M. (2003). Nature via nurture. New York: Harper Collins Publishers.

Wednesday, May 18, 2016

Spiritual Machines Create Challenges for Project Managers

Comment: This post was originally written in August 2014. I have made a few updates and posted it again in May 2016. What I am going to talk about originated as a discussion from my Masters in Information Technology program. This may seem far-fetched to many people but is an upcoming debate in the not-so-distant future. Holographic technologies have the potential to cause moral dilemmas for project managers who must implement these systems when they arrive. The early technology will be inanimate and mechanical in nature. As time passes, this technology will combine with neural nets and biological computing to create life-like machines that could potentially develop self-awareness. It is never too early to debate the questions and challenges these systems pose.

Spiritual Machines Create Challenges for Project Managers
by
JT Bogden, PMP

Holography was commercially exploited as early as the 1960s with the GAF viewfinder. As a young boy, I recall placing reels of images into a stereographic viewfinder, looking at the comic book world of Snoopy and other stories of dinosaurs. Later, I explored holography more deeply in technical books, learning about how data is encoded in the collision patterns between reference and data beams. Science philosophy books explored the holographic universe and how the human eye-brain organ is a holographic system that interprets our world.

Scientists have struggled with the eye-brain to mind dilemma in humans. The brain is the mechanical operation while the mind is spiritual in character. Holographic systems store information in terms of ghostly images, unlike conventional storage systems that store information in terms of attributes. According to Michael Talbot's book "The Holographic Universe", holography's ethereal images reflect the way the human mind processes reality. The human brain can suffer trauma, losing large areas of tissue, but somehow retains unfettered memories and even character. Likewise, a curious quality of holography is that all the information is stored ubiquitously throughout the storage medium, defeating divisibility short of catastrophic loss; any divisible piece contains the complete information set (Talbot, 1991). Thus, holography has the appearance of retaining the character or essence of the information stored despite failures and imperfections in where the data is embodied.

Current robotic research is developing systems that mimic human sensory and motor capabilities. Software and processing hardware emulate or mimic human neural circuitry to cause human-like actions, including emotional ones, or to make human-like decisions. Both actions are mechanical in character, operating based on local action, for example, tracking and catching a baseball in flight, or performing specific emotional responses if the baseball hits the robot instead. The elements of surprise and creativity are more or less spiritual in character and have not yet been mastered by science, since they are not the local actions that science deals with. For example, reflecting on the flight of the baseball and describing it as screaming through the air is creative and not a local action. In fact, self-awareness may be a requirement to achieve surprise and creativity.

Holography creates theological concerns since its resilient retention of information is not mechanical. Instead, holographic data storage is based on waveforms, or electromagnetic energy patterns, also known as light waves. These are often equated to spirituality. There are theological implications; for example, the Judeo-Christian Bible draws parallels between light, the absence of light, and spiritual existence. In the Bible, Genesis 1:4 reads, "God saw that the light was good, and he separated the light from the darkness." Holographic ghostly images in storage and computational processing could depart silicon wafers and mechanical storage systems for the amino acids and proteins found in biological processing. Human tinkering could result in challenges posed by truly spiritual machines. If we are not careful, these biological machines could develop a conscience and become annoyed with natural biological computers, also known as humans. In the end, mankind's technological conduct could potentially manufacture a nemesis. If for all the good in the world there is evil, then the human responsibility is to dispense the good and forsake the evil. Holographic storage is the beginning of a computational era that has the potential to elevate or degrade mankind.

"The development of every new technology raises questions that we, as individuals and as a society, need to address. Will this technology help me become the person that I want to be? Or that I should be? How will it affect the principle of treating everyone in our society fairly? Does the technology help our community advance our shared values?" (Shanks, 2005).

The possibility of computational systems based not on silicon but on amino acids and proteins, the building blocks of life, is clearly on the horizon and presents some puzzling questions. As these systems advance, project managers implementing them could be faced with significant ethical and moral decisions. Literally, actions such as killing the 'power' on a living machine raise questions about life and the right to exist. Will man-made biological computers, perhaps through genetic engineering, develop self-awareness, spirituality, and a moral code of their own? How far will this go? What other moral and ethical issues could arise from the advent of this technology?

Please feel free to comment. I would enjoy hearing from you.

References:

Englander, I. (2003). The architecture of computer hardware and systems software: An information technology approach (3rd ed.). New York: John Wiley & Sons.

Kurzweil, R. (1999). The age of spiritual machines: When computers exceed human intelligence. Penguin Books. ISBN: 97801402822023.

Lewis, C. S. (2002). The four loves. Houghton Mifflin Harcourt. ISBN: 9780156329309.

Shanks, T. (2005). Machines and man: Ethics and robotics in the 21st century. The Tech Museum of Innovation website. Retrieved 21FEB09 from http://www.thetech.org/exhibits/online/robotics/ethics/index.html

Talbot, M. (1991). The holographic universe. Harper Collins Publishers. ISBN: 9780060922580.

Wednesday, April 20, 2016

Human Centric Computing

Commentary: This post was originally written in Dec 2010 and was updated and reposted in Dec 2013. Chief Executive Officers fear commoditization of their products and services; it is a major indicator of attenuating profit margins and a mature market. Efforts such as "new and improved" are short-term attempts to extend a product life cycle. A better solution is to resource disruptive technologies that cause obsolescence of standing products and services, shift the market, and create opportunities for profit. Human centric computing does just that, and project managers may find they are involved in implementing these projects at various levels of complexity. Thus, project managers should have a grasp of this technology and even seek solutions in their current projects.

Human Centric Computing
by 
JT Bogden, PMP

Human centric computing has been around for a long time. Movies have for decades fantasized and romanticized about sentient computers and machines that interface with human beings naturally. More recent movies have taken this to the ultimate end with characters such as Star Trek's Data, Artificial Intelligence: A.I.'s character David, or the movie iRobot's character Sonny. In all these cases the machines developed self-awareness, or the essence of what is considered to be uniquely human, but remained machines. The movie Bicentennial Man went the opposite direction, with a self-aware machine that became human. That is fantasy, but there is a practical side to this.

Michael Dertouzos, in his book 'The Unfinished Revolution', discusses early attempts at developing the technologies behind these machines. The current computational technologies are being challenged as the unfinished revolution plays out. I am not in full agreement with the common understanding of the Graphical User Interface (GUI) as "a mouse-driven, icon-based interface that replaced the command line interface". This is a technology-specific definition that is somewhat limiting and arcane in its thinking. A GUI is more akin to a visceral, human centric interface of which one form utilizes a mouse and icons. Other forms use touch screens, voice recognition, and holography. Ultimately, in the end state the machine interfaces with humans as another human would.

Human Centric Computing

Humans possess sensory capabilities that are fully information based. Under the auspices of information theory, human sensory processing was shown during the 1960s to be consistent with Fourier transforms. These are mathematical formulas in which information is represented by signals in terms of time-domain frequencies. In lay terms, your senses pick up natural events and biologically convert each event into electrical signals in your nervous system. Those signals have information encoded in them and arrive at the brain, where they are processed holographically. The current computational experience touches three of the five senses. The visceral capability currently provides the greatest information to the user because the primary interface is visual and actually part of the brain. The palpable and auditory are the lesser utilized, with touch screens, tones, and command recognition. The only reason smell or taste comes into play is if the machine is fried in a surge, which leaves a bad taste in one's mouth. However, all the senses can be utilized since their biological processing is identical. The only need is for the correct collection devices or sensors.

Technological Innovations Emerging

If innovations such as the device examples below are fully developed and combined with the visceral and palpable capabilities cited earlier, truly human centric machines will have emerged and the 'face' of the GUI will have changed forever.

Microsoft's new Surface is literally a desktop that changes the fundamental way humans interact with digital systems by discarding the mouse and keyboard altogether. Bill Gates remarks that the old adage was to place a computer on every desktop; now, he says, Microsoft is replacing the desktop completely with a computational device. This product increases the utilization of the palpable combined with the visceral in order to sort and organize content, then transfer the content between systems with the relative ease of using one's fingertips (Microsoft, 2008). For example, a digital camera set on the surface has its stored images downloaded, which then appear as arrayed images on the surface for sorting with your fingertips. The TED Talk on the multi-touch interface highlights the technology.

Another visceral and palpable product is the Helio Display. This device eliminates the keyboard and mouse as well. Images appear in three dimensions on a planar field cast into the air using a proprietary technology. Some models permit the use of one's hands and fingers to 'grab' holographic objects in mid-air and move them around (IO2, 2007). Another example of this concept is the TED Talk video Grab a Pixel.

On touch screens of various forms, virtual keyboards can be brought up if needed. However, speech software allows for not only speech-to-text translation but also control and instructions. Speech engines can provide high-quality spoken instructions, replacing error tones and help text. Loquendo's telephony products, for example, are capable of interacting with callers, and their software comes in 25 languages (Loquendo, 2008).

There are innumerable human centric projects ongoing. In time, these products will increasingly make it to the market in various forms, where they will be further refined and combined with other emerging technologies. One such emerging trend and field is the blending of virtual reality and the natural world. The TED Talks video 'Six Sense' illustrates some of the ongoing projects and efforts to change how we interconnect with systems.

Combining sensory and collection technology with neural agents may increase the ability to evaluate information, bringing computer systems closer to self-awareness and true artificial intelligence. Imagine a machine capable of taking in an experience and then sharing that experience in a human manner.

Commentary:  Project managers seeking to improve objectives where the selection and collection of information can be gathered quickly, without typing or swiping a mouse across the screen, should consider the use of these types of products whenever possible. Although costly now, the cost of these technologies will drop as the new economy sets in.

References:

Dertouzos, M. L. (2001). The unfinished revolution: Human-centered computers and what they can do for us (1st ed.). HarperBusiness.

Englander, I. (2003). The architecture of computer hardware and systems software: An information technology approach (3rd ed.). New York: John Wiley & Sons.

IO2 Staff Writer (2007). HelioDisplay. Retrieved 25FEB09 from http://www.io2technology.com/

Microsoft Staff Writer (2008). Microsoft Surface. Retrieved 25FEB09 from http://www.microsoft.com/surface/product.html#section=The%20Product

Loquendo Staff Writers (2008). Loquendo corporate site. Retrieved 25FEB09 from http://www.loquendo.com/en/index.htm

Sunday, March 13, 2016

Hip Pocket Tools To Collect Quick Data


Hip Pocket Tools To Collect Quick Data
By
JT Bogden, PMP

For project managers working in IT, having a grasp of simple activities and practices goes a long way toward understanding the complexity behind many projects and how things are interrelated. Writing code to quickly gather browser or network information is one of those simple little things that can have a major impact on a project, as it provides a wide breadth of information about a web application's users and environment. This information can be useful across a breadth of activities within an organization as well.

Let us look at writing mobile code which senses the environmental conditions. The information collected should be used to properly route the web application to code that accounts for those conditions. Sometimes that means simply adapting dynamically to qualities like screen width. At other times, code has to account for browser differences; for example, some browsers and versions support features like Geolocation and others do not.

I have scripted some mobile code to detect the current environmental conditions, as far as Blogger would permit, Table 1. The Blogger web application does not allow full-featured client-side JavaScript and removes or blocks some code statements. In the code, logic detects the environmental conditions in Internet Explorer, Safari, Chrome, Opera, and Firefox, routing the results to the tabular output. Geolocation is tricky: it does not execute in all browser versions and may not post results, or may post error results, in some browsers and versions. Please try reviewing this post in multiple browsers and on multiple platforms (iPad, iPod, PC, MacBooks).
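The original script is embedded in this page, so it is not reproduced here, but a sketch of this kind of detection logic, using the standard navigator, screen, and geolocation browser APIs with field names mirroring Table 1, might look like the following:

```typescript
// Sketch of client-side environment detection (not the original Blogger script;
// field names mirror Table 1 below).
interface SnifferResults {
  device: string;
  screenResolution: string;
  clientResolution: string;
  javaEnabled: boolean;
  cookiesEnabled: boolean;
  colors: string;
  fullUserAgent: string;
}

function collectEnvironment(): SnifferResults {
  return {
    device: navigator.platform,
    screenResolution: `${screen.width} x ${screen.height}`,
    clientResolution: `${window.innerWidth} x ${window.innerHeight}`,
    javaEnabled: navigator.javaEnabled ? navigator.javaEnabled() : false,
    cookiesEnabled: navigator.cookieEnabled,
    colors: `${screen.colorDepth}-bit`,
    fullUserAgent: navigator.userAgent,
  };
}

// Geolocation only runs where the browser supports it and the user approves.
function collectGeolocation(onResult: (msg: string) => void): void {
  if (!("geolocation" in navigator)) {
    onResult("Geolocation not supported");
    return;
  }
  navigator.geolocation.getCurrentPosition(
    pos => onResult(`${pos.coords.latitude.toFixed(4)}, ${pos.coords.longitude.toFixed(4)}`),
    err => onResult(`Geolocation error: ${err.message}`)
  );
}

console.table(collectEnvironment());
collectGeolocation(msg => console.log("GEOLOCATION:", msg));
```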

MOBILE CODE RESULTS
Device
Screen Resolution
Client Resolution
Java Enabled
Cookies Enabled
Colors
Full User Agent
GEOLOCATION
Table 1: Sniffer Results
(To collect geolocation, please approve in browser)

Embedding code into web applications and storing the results in a database can yield valuable histories. Project managers planning and coordinating projects, whether writing a web application or conducting some other IT-related project, must have an understanding of the environmental situation. There are almost always anomalies. The use of mobile code can provide valuable information back to the project manager before issues arise.

Mobile code is one of the hip pocket tools that can provide important information. Installing it and tracking the data over time can show progress and effects, and flush out anomalies before they become problems.

Neural Agents

Comment: Several years ago, I was the leader of an operationalized telecommunications cell. The purpose of the cell was to monitor the effectiveness and readiness of the telecommunications supporting the ongoing operations. The staff regularly turned over due to the operational tempo, and I had to train new staff quickly. I did so by preparing a series of technical briefs on topics the cell dealt with. This brief dealt with neural agents; I have updated it and provided additional postings that paint a picture of potential advanced systems.

Neural Agents
by
JT Bogden, PMP

Figure 1: Agent Smith
The Matrix movie franchise
Neural agents have a spooky air about them, as though they are sentient and have a clandestine purpose. The movie franchise 'The Matrix', Figure 1, made use of Agent Smith as an artificial intelligence designed to eliminate zombie processes in the simulation and human simulations that became rogue, such as Neo and Morpheus. In the end, Agent Smith is given freedom that results in him becoming rogue and rebellious, attempting to acquire increasing power over the simulation.

The notion of artificial intelligence has been around forever. Hollywood began capturing this idea in epic battles between man and machine in the early days of sci-fi. More recently, the movie "AI" highlighted a future where intelligent machines survive humans. Meanwhile, the Star Trek franchise advances intelligent ships using biological processing and has a race of humanoid machines called the Borg. Given all the variations of neural technologies, the neural agent remains a promising technology emerging in the area of event monitoring, though acting not quite as provocatively as Agent Smith. The latest development is the use of neural agents in support of artificial intelligence. Neural agents, or neugents (no relation to Ted Nugent), are becoming popular in some enterprise networks.

Companies can optimize their business and improve their analytical support capabilities, as this technology enables a new generation of business applications that can not only analyze conditions in business markets but also predict future conditions and suggest courses of action to take.

Inside the Neugent

Neural agents are small, networked units, or agents, containing hardware and software. Each agent has processors and contains a small amount of local memory. Communications channels (connections) between the units carry data that is usually encoded on independent, low-bandwidth telemetry. These units operate solely on their local data and on input received over the connections to other agents. They transmit their processed information over telemetry to central monitoring software or to other agents.
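As a purely illustrative sketch (the message fields, metric name, and anomaly heuristic are assumptions, not a vendor format), an agent's local processing and telemetry step might look like this:

```typescript
// Illustrative telemetry message an agent might emit over its low-bandwidth channel.
interface TelemetryMessage {
  agentId: string;
  timestamp: number;     // epoch milliseconds
  metric: string;        // e.g. "cpuUtilization"
  value: number;
  anomalyScore: number;  // 0..1, computed locally by the agent
}

// Each agent processes only its local observations and forwards a summary.
function summarize(agentId: string, metric: string, samples: number[]): TelemetryMessage {
  const mean = samples.reduce((s, v) => s + v, 0) / samples.length;
  const latest = samples[samples.length - 1];
  // Crude local anomaly score: how far the latest sample sits from the local mean.
  const anomalyScore = Math.min(1, Math.abs(latest - mean) / (mean || 1));
  return { agentId, timestamp: Date.now(), metric, value: latest, anomalyScore };
}

console.log(summarize("agent-07", "cpuUtilization", [0.41, 0.39, 0.44, 0.86]));
```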

The idea for neugents came from the desire to produce artificial systems capable of “intelligent” computations similar to those of the human brain. Like the human brain, neugents “learn” by example or observation. For example, a child learns to recognize colors from examples of colors. Neugents work in a similar way: they learn by observation. By going through this self-learning process, neugents can acquire more knowledge than any expert in a field is capable of achieving.

Neugents improve the effectiveness of managing large environments by detecting complex or unseen patterns in data. They analyze the system for availability and performance. By doing this, neugents can accurately “predict” the likelihood of a problem and, over time, even develop confidence that it will happen. Once a neugent has “learned” the system’s history, it can make predictions based on the analysis and generate an alert, such as: “There is a 90% chance the system will experience a paging file error in the next 30 minutes.”
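A heavily simplified sketch of that kind of prediction step, learning a baseline from history and turning the deviation of the latest sample into a rough alert probability (the metric, the figures, and the z-score scaling are assumptions for illustration only, not how any particular neugent product computes its confidence), could look like this:

```typescript
// Illustrative prediction step: compare recent samples to a learned baseline profile
// and express the deviation as a rough probability of an impending problem.
interface BaselineProfile {
  mean: number;
  stdDev: number;
}

function learnBaseline(history: number[]): BaselineProfile {
  const mean = history.reduce((s, v) => s + v, 0) / history.length;
  const variance = history.reduce((s, v) => s + (v - mean) ** 2, 0) / history.length;
  return { mean, stdDev: Math.sqrt(variance) };
}

// Map the z-score of the latest observation onto a 0..1 "chance of trouble".
function problemProbability(baseline: BaselineProfile, latest: number): number {
  const z = Math.abs(latest - baseline.mean) / (baseline.stdDev || 1);
  return Math.min(1, z / 4); // heuristic scaling, an assumption for illustration
}

const pagingRateHistory = [12, 14, 11, 13, 12, 15, 13]; // pages/sec, hypothetical
const baseline = learnBaseline(pagingRateHistory);
const p = problemProbability(baseline, 17);
console.log(`There is a ${(p * 100).toFixed(0)}% chance of a paging problem soon.`); // ~83%
```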

How Neugents Differ From Older Agents

Conventional or older agent technology requires someone to work out a step-by-step solution to a problem and then code the solution. Neugents, on the other hand, are designed to understand and see patterns, to train. The logic behind the neugent is not discrete but instead symbolic. Neugents assume responsibility for learning, then adapt or program themselves to the situation and even self-organize. This process of adaptive learning increases the neugent's knowledge, enabling it to more accurately predict future system problems and even suggest changes. While these claims sound far-reaching, progress has been made in many areas, improving adaptive systems.


Neugents get more powerful as you use them. The more data they collect, the more they learn; the more they learn, the more accurate their predictions become. This capability comes from two complementary technologies: the ability to perform multi-dimensional pattern recognition based on performance data and the power to monitor the IT environment from an end-to-end business perspective.

Systems Use of Neugents and Benefits

Genuine enterprise management is built on a foundation of sophisticated monitoring, and neugents apply to all of its areas. They can automatically generate lists for new services and products, determine unusual risks and fraudulent practices, and predict future demand for products, which enables businesses to produce the right amount of inventory at the right time. Neugents help reduce the complexity of the Information Technology (IT) infrastructure and applications by providing predictive capabilities and capacities.

Neugents have already made an impact on the operations of many Windows Server users who have tested the technology. They can take two weeks of data and, in a few minutes, train the neural network. Neugents can detect if something is wrong. They have become a ground-breaking solution that will empower IT to deliver the service that today's digital enterprises require.

With business applications becoming more complex and mission-critical, the use of neugents becomes more necessary to predict and then address performance and availability problems before downtime occurs. By providing true problem prevention, neugents offer the ability to avoid the significant costs associated with downtime and poor performance. Neugents encapsulate performance data and compare it to previously observed profiles. Using parallel pattern matching and data modeling algorithms, the profiles are compared to identify deviations and calculate the probability of a system problem.

Conclusion

Early prediction and detection of critical system states provide administrators with an invaluable tool to manage even the most complex systems. By predicting system failures before they happen, organizations can ensure optimal availability. Early predictions can help increase revenue-generating activities as well as minimize the associated costs of system downtime. Neugents alleviate the need to manually write policies to monitor these devices.

Neugents provide the best price/performance for managing large and complex systems. Organizations have discovered that defining an endless variety of event types can be exhausting, expensive, and difficult to fix. By providing predictive management, neugents help achieve application service levels by anticipating problems and avoiding unmanageable alarm traffic as well as onerous policy administration.