Cybernetics is the interdisciplinary study of the structure of complex systems, especially communication processes, control mechanisms, and feedback principles. This field is closely related to control theory and systems theory. At the same time, both in its origins and in its development in the second half of the twentieth century, cybernetics is equally applicable to social (that is, language-based) systems.
Cybernetics is control theory as applied to complex systems. Cybernetics is associated with models in which a monitor compares what is happening in a system with a standard of what should be happening, and a controller adjusts the system’s behaviour accordingly. The French physicist André-Marie Ampère, in his classification of the sciences, suggested that the still nonexistent science of the control of governments be called cybernetics. The term was soon forgotten, however, and it was not used again until the American mathematician Norbert Wiener published his book Cybernetics in 1948. In that book Wiener made reference to an 1868 article by the British physicist James Clerk Maxwell on governors and pointed out that the term governor is derived, via Latin, from the same Greek word that gives rise to cybernetics. The date of Wiener’s publication is generally accepted as marking the birth of cybernetics as an independent science.
The Greek philosopher Plato used the term in his discussions of the analogy between navigating a ship and governing a country or group of people. In 1840 the term was rediscovered by Ampère in his classification of the sciences.
Early in history, principles from cybernetics were frequently applied without cybernetics being explicitly mentioned. We find the first applications in Arabic and Greek manuscripts from around 200 BC, where control systems are mentioned. Other known instances (in chronological order) are: Archimedes’ automatic water-level regulator for water clocks; the on-off control in the escapement of the mechanical clock, designed in China in the Middle Ages; the regulation of the grain supply for a flour mill by the Frenchman Ramelli in the 16th century; and Cornelius Drebbel’s thermostat (1593-1633).
The most important technical and social application of a regulating mechanism is ‘Watt’s Governor’, developed by James Watt in 1788 to regulate the speed of a steam engine independently of its load. For the first time it became possible to apply the steam engine on a wide scale.
The principle of Watt’s Governor was worked out mathematically by Maxwell in 1868. Its operation is based on negative feedback. This principle was applied in 1873 by Farcot in a steering mechanism for ships. The great technical breakthrough came in 1931, when Nyquist and Black designed the electronic amplifier based on negative feedback.
A feedback system is said to have a goal, such as maintaining the level of a variable (e.g., water volume, temperature, direction, speed, or blood glucose concentration). Feedback reports the difference between the current state and the goal, and the system acts to correct differences. This process helps ensure stability when disturbances threaten dynamic systems, such as machines, software, organisms, and organizations.
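The loop just described can be sketched in a few lines of code. The following is a minimal illustration, not any particular implementation: a hypothetical thermostat-like system repeatedly measures the difference between its current state and its goal, and applies a correction proportional to that difference. The function names and the gain value are illustrative assumptions.

```python
# Minimal sketch of a negative-feedback loop: compare current state with a
# goal (set point) and act to reduce the difference. Names are illustrative.

def feedback_step(current, goal, gain=0.5):
    """Return a corrective action proportional to the discrepancy."""
    error = goal - current          # feedback: difference between state and goal
    return gain * error             # act to correct part of the difference

def run(current, goal, steps=20):
    """Apply the feedback loop repeatedly; the state converges on the goal."""
    for _ in range(steps):
        current += feedback_step(current, goal)
    return current

print(round(run(current=15.0, goal=22.0), 2))  # converges to 22.0
```

Because each step removes a fixed fraction of the remaining error, the state approaches the goal and stays there even if a disturbance later pushes it away; that is the stabilising property the paragraph above describes.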
Psychologists have borrowed important insights from cybernetic theory to explain self-regulatory processes (e.g., Powers 1973). The influential work by Carver and Scheier (1981, 1982, 1998) analyzed self-awareness and self-regulation processes in terms of feedback loops that control behavior. In any system (including mechanical ones such as thermostats for heating/cooling systems), control processes depend on monitoring current status, comparing it with goals or standards, and initiating changes where necessary. The basic form of a feedback loop is summarized in the acronym TOTE, which stands for test, operate, test, exit. The test phase involves assessing the current status and comparing it with the goals or ideals. If the comparison yields a discrepancy, some operation is initiated that is designed to remedy the deficit and bring the status into line with what is desired. Repeated tests are performed at intervals during the operation so as to gauge the progress toward the goals. When a test reveals that the desired state has been reached, the final (exit) step is enacted, and the regulatory process is ended.
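The TOTE sequence above can be written out directly as a loop. This is a schematic sketch only; the status values, the tolerance, and the operation passed in are hypothetical placeholders, not part of Carver and Scheier's formulation.

```python
# Sketch of the TOTE (test, operate, test, exit) feedback unit.

def tote(status, goal, operate, tolerance=0):
    """Test status against the goal; operate on any discrepancy; exit when met."""
    while abs(goal - status) > tolerance:   # test: compare status with the goal
        status = operate(status, goal)      # operate: act to remedy the deficit
    return status                           # exit: desired state reached

# Usage: close a gap of 10 units one step at a time.
result = tote(status=0, goal=10, operate=lambda s, g: s + (1 if g > s else -1))
print(result)  # 10
```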
Emotion plays a central role in the operation of the feedback loop (Carver and Scheier 1998). Naturally, reaching a goal typically brings positive emotions, but such pleasant feelings may also arise simply from making suitable progress toward the goal. Thus, emotion may arise from the rate of change of discrepancy between the current state and the goal or standard. Meanwhile, negative emotions may arise not only when things get worse but also simply when one stands still and thereby fails to make progress. Carver and Scheier (1998) phrase this as a ‘cruise control’ theory of affect and self-regulation, invoking the analogy to an automobile’s cruise control mechanism. Like that mechanism, emotion kicks in to regulate the process whenever the speed deviates from the prescribed rate of progress toward the goal.
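The ‘cruise control’ idea — affect tracking the rate at which a discrepancy shrinks rather than the discrepancy itself — can be made concrete with a small sketch. The numbers and the linear form of the signal are illustrative assumptions, not Carver and Scheier's model.

```python
# Toy model of affect as deviation from a prescribed rate of progress:
# positive when the goal-discrepancy shrinks faster than required,
# negative when progress lags or stalls. All values are illustrative.

def affect(prev_discrepancy, curr_discrepancy, desired_rate):
    """Affect signal from the rate of change of discrepancy."""
    progress = prev_discrepancy - curr_discrepancy   # how much the gap closed
    return progress - desired_rate                   # deviation from the set rate

print(affect(10, 7, desired_rate=2))   # 1: faster than required -> positive affect
print(affect(10, 10, desired_rate=2))  # -2: standing still -> negative affect
```

Note that the second call yields negative affect even though the discrepancy has not grown: merely failing to make progress is enough, exactly as the paragraph above describes.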
The Feedback Loop
The great insight from the field of cybernetics in the twentieth century was its representation of the structure of control. The premise is that any process must be managed by some sort of controller. The controller sends commands to the process and receives information back about its performance. The returning information is evaluated against specified set points, reference values that define what the process should be doing.
Even if cybernetic development is regarded as essentially a branch of engineering rather than philosophy, the appearance of common principles in practical subjects as far apart as astronautics and epilepsy suggests that at least the artificial, academic boundaries between the faculties of physical science, biology, engineering and mathematics can be transcended with advantage and without risk of major error.
The relative modesty of cybernetic achievement (the early claims and promises were certainly over-dramatised) has produced various splinter-groups, some tending toward a more philosophical or at least theoretical position, others concerned with strictly practical application. Among the latter, one of the intriguing titles is “Bionics”, a group in which the precedence and possible superiority of living systems is accepted, with the aim of using ideas gained from the study of real living processes to construct artificial systems with equivalent but superior performance. The mechanism by which a man can easily learn to recognise the appropriate patterns, even when they are partly obscured, must be quite complex and carefully adjusted. If we knew more about how we learn to recognise and complete patterns we could make pattern-recognising machines more easily, and these could operate in situations (such as cosmic exploration) where men would be uncomfortable or more concerned with other problems.
The cybernetic model of control underlies modern IT methodologies such as Agile, DevOps, cloud, and Lean UX. When combined with one another, these methodologies transform IT into a conversational medium that allows businesses to steer in response to continually evolving customer needs and market demands. Through self-steering, organizations can continually change and adapt while maintaining their brand’s essential identity.

Service design applies product design techniques to everything from services to organizational processes. Its fundamentally user-centered approach redefines quality in terms of desired outcomes rather than features. Its emphasis on holistic interactions over time and across touchpoints frames service delivery as a co-creative journey. Its grounding in design as a creative practice enables exploration and discovery. Service design thus expands the scope of design beyond user interaction alone: it helps create customer-focused alignment throughout service organizations, and it raises design’s horizon to address the service delivery process itself as a first-class design problem. By doing so, it lets IT organizations continually improve quality by adapting their technologies and practices in response to organizational and market dynamics.
Wiener defined cybernetics as “the science of control and communications in the animal and machine.” This definition relates cybernetics closely with the theory of automatic control and also with physiology, particularly the physiology of the nervous system. For instance, a “controller” might be the human brain, which might receive signals from a “monitor” (the eyes) regarding the distance between a reaching hand and an object to be picked up. The information sent by the monitor to the controller is called feedback, and on the basis of this feedback the controller might issue instructions to bring the observed behaviour (the reach of the hand) closer to the desired behaviour (the picking up of the object). Indeed, some of the earliest work done in cybernetics was the study of control rules by which human action takes place, with the goal of constructing artificial limbs that could be tied in with the brain.
The origin of Wiener’s interest in this development was the invention of electronic aids to computation toward the end of the war, combined with his personal contact with neurophysiologists who were investigating the mechanisms of nervous conduction and the control of muscular action. Wiener was at once impressed by the similarities of the problems posed by military devices for automatic missile control and those encountered in the reflex activity of the body. As a mathematician and scientist of international repute and wide culture Wiener was as powerfully repelled by the military applications of his skill as he was attracted by its beneficent uses in human biology. In his second book “The Human Use of Human Beings” he develops his humanist, liberal ideas in application to social as well as physiological problems, in the hope that it may not be too late for the human species to find in machines the willing slaves essential for prosperous and cultivated leisure. Writing at a time when the ignominious annihilation of a hundred million innocent bystanders is a calculated risk, as Wiener admits, this is a very faint hope indeed.
A brief analysis of one cybernetic approach to problems of learning, recognition and decision has several interesting corollaries. One is that, even in the metal, such a system provides ample scope for diversity of temperament, disposition, character and personality. In material practice even very simple machines of this type differ very much from one another, even if they are designed to a close specification, and furthermore these differences tend to be cumulatively amplified by experience. In mass-produced passive machines, such as automobiles, individual differences are treated as faults, and are usually minimised by statistical quality control. Even at this level, however, individual characters do appear and, particularly when they involve a reflexive sub-system, tend to increase with wear, which is the equivalent of experience in a passive machine.
In the models already referred to, learning is considered as a statistical rather than a logical process. Logical reasoning, the ability to solve formal problems by deduction, is considered as a special case in which the level of confidence in the data and rules is extremely high. The ability to perform deductive reasoning is thus merely the net result of many interacting statistical processes which cannot be identified individually without some prior knowledge about the mechanism itself. In the case of an assembly of systems such as CORA, acquaintance with the basic principles of exploration, selection, storage and comparison would suggest experiments to measure the characters of performance at each stage. Considering CORA as a “crystallised hypothesis” of living learning, the same procedure could be applied to the study of learning in human beings in the hope of recognising the basic and essential features rather than their statistical sum.
Studies of this nature are now in progress in several centres of research. One of the important inferences from the simple models of learning is that in the far more complex living systems information from the various receptors (eyes, ears, skin and so forth) must be diffusely projected to wide regions of the brain as a part of the preliminary selective procedure. The extent of diffuse projection in the human brain is really astonishing; nearly all parts of the frontal lobes are involved in nearly all sensory integration, and with very short delays. The non-specific responses in these mysterious and typically human brain regions are often larger and always more widespread than those in the specific receiving areas for the particular sense organs. They also have another very interesting and important property which the specific responses do not show at all, and this is perhaps one of the most fundamental attributes of intelligent machinery, whether in the flesh or in the metal: habituation.
If a stimulus is applied monotonously and without variation in background, the diffuse responses in non-specific brain areas diminish progressively in size until after perhaps fifty repetitions they are invisible against the background of spontaneous intrinsic activity, even with methods of analysis that permit detection of signals much smaller than the background “noise”. This process of habituation is highly contingent however; a small change in the character or rhythm of the stimulus or in its relation to the background activity will immediately restore the response. Interestingly enough the change needed to re-establish significance may be a diminution in intensity; a series of loud auditory stimuli may result in complete habituation after a few minutes but if the same stimulus is given at a very low intensity the response may reappear at a high level. The same effect is seen with any novelty in the rhythm or tempo and the conclusion is that, as predicted from the cybernetic model, the brain response to a single event is a measure of its novelty or innovation rather than of its physical intensity or amplitude.
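The behaviour described here — response diminishing with monotonous repetition, yet restored in full by any change, even a reduction in intensity — can be caricatured in a few lines. This is a toy model under stated assumptions (the decay factor, the exact-match test for "sameness"), not a model of the actual neural mechanism.

```python
# Toy model of habituation: the response to a stimulus reflects its novelty,
# not its intensity. Repeated identical stimuli produce shrinking responses;
# any change (even a quieter stimulus) restores the response in full.

class Habituator:
    def __init__(self, decay=0.8):
        self.expected = None   # running expectation of the stimulus
        self.gain = 1.0        # current response strength
        self.decay = decay     # illustrative decay factor per repetition

    def respond(self, stimulus):
        if self.expected is None or stimulus != self.expected:
            self.gain = 1.0            # novelty: response restored in full
        response = self.gain
        self.gain *= self.decay        # repetition: response diminishes
        self.expected = stimulus
        return response

h = Habituator()
loud = [h.respond(100) for _ in range(10)]   # repeated loud stimulus habituates
quiet = h.respond(10)                        # a *quieter* stimulus restores it
print(round(loud[-1], 3), quiet)
```

The point of the sketch is the last line: after habituating to a loud stimulus, a weaker but novel one evokes the full response, matching the paradoxical observation reported above.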
This observation probably accounts for the apparently (and literally) paradoxical effect described as “sub-liminal perception”. This phenomenon has attracted great interest as a means of “thought control” in advertising or other propaganda; it involves the presentation of a selected stimulus (such as an exhortation to buy a particular product or vote for a certain candidate) at a level of intensity, or for a brief period, below the threshold of “conscious recognition”. Stimuli at “sub-threshold” levels have in fact been found to influence the statistical behaviour of normal human beings without their being aware of the nature or moment of the stimulus. These effects are so subtle and could be so sinister that attempts at sub-liminal influence have been banned in many countries by advertising associations. The paradox of influence by sub-threshold stimuli is resolved by consideration of threshold in terms not of intensity or duration but of unexpectedness or innovation. The mechanisms responsible for distributing signals to the non-specific brain regions constantly compute the information-content of the signals and suppress those that are redundant while novel or surprising signals, however small, are transmitted with amplified intensity.
The effects of information selection are even more involved when the signals are part of a complex pattern of association. When the response to a given signal has vanished with habituation it may be restored, not only by a change in the original signal itself but also by association of this with another subsequent signal. The response to the paired signals may also habituate, but if the second signal is an “unconditional” stimulus for action (that is, to gratify an appetite, gain a reward or avoid a penalty) habituation does not occur and in fact the first, conditional response shows progressive “contingent amplification”. At the same time the response to the second, “unconditional” stimulus, even if this be more intense than the conditional one, shows contingent occlusion.
The representation of this situation in real life is quite familiar. In driving an automobile one learns first to avoid obstacles, and this is based on the unconditional withdrawal reflex which prevents us colliding with obstacles in any situation. The next stage is to learn to avoid symbolic obstacles — to stop at the red traffic lights for example. The red light is not harmful in itself; it implies the probability of collision, reinforced by police action: it is a conditional stimulus. The action of stopping at an intersection is determined not by the traffic, but by the light. When the light changes to green, however, the primary defensive action is restored and the real obstacles must be avoided. The same effect is seen in the brain; when a conditional warning stimulus which has shown contingent amplification is withdrawn, the unconditional stimulus which has been occluded reappears at full size at once. The brain retains the capacity for unconditional training.
A particularly interesting aspect of these observations is the evidence for a dynamic short-term memory system, and here again the resemblance of living processes to those predicted theoretically from cybernetic models is quite startling. In CORA, the third-grade memory, which stores information about significant associations, consists of an electronic oscillatory resonant circuit in which an oscillation is initiated only when the significance of associated events surpasses the arbitrary threshold of significance. This oscillation decays slowly if the association is not repeated or reinforced. Quite recently it was discovered that in records of brain responses to visual stimuli an oscillation appears following the primary response, but only when the visual stimulus has acquired significance, either by irregularity or, more often, by association with unconditional stimuli to which the subject responds with an operant action. These after-rhythms could well be the electric sign of a brain storage system linking the associated stimuli with action. The frequency and phase relations of the after-rhythms are so precise and constant that they may also be operating as a brain-clock, regulating the time-sequence of events in an orderly and effective pattern.
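The storage mechanism attributed to CORA — an oscillation triggered only when significance crosses a threshold, then decaying slowly unless reinforced — can be sketched as a leaky memory trace. The threshold, half-life, and class structure here are illustrative assumptions, not a description of the original electronic circuit.

```python
# Sketch of a dynamic short-term memory trace: set ringing only when the
# significance of an association crosses a threshold, then decaying slowly
# unless the association is reinforced. Parameters are illustrative.

import math

class MemoryTrace:
    def __init__(self, threshold=1.0, half_life=10.0):
        self.threshold = threshold
        self.decay = math.log(2) / half_life   # decay constant per time unit
        self.amplitude = 0.0                   # strength of the stored trace

    def associate(self, significance):
        if significance >= self.threshold:
            self.amplitude = 1.0               # oscillation initiated/reinforced

    def tick(self, dt=1.0):
        self.amplitude *= math.exp(-self.decay * dt)   # slow exponential decay

trace = MemoryTrace()
trace.associate(significance=1.5)   # crosses the threshold: trace starts
for _ in range(10):                 # one half-life elapses unreinforced
    trace.tick()
print(round(trace.amplitude, 2))    # ~0.5 after one half-life
```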
The relation of the conditional responses in the brain and their after-rhythms to the intrinsic brain rhythms, particularly the alpha rhythms, is still a challenging problem from which much may be learned not only about the living brain but also about the design of intelligent machines. Wiener, in his book on Non-Linear Problems and in the second edition of Cybernetics has approached this question from the theoretical standpoint but the facts are even more confusing than he indicates.
In the first place many normal people show no sign of alpha rhythms at all, so whatever function these rhythms mediate must be associated with their suppression rather than with their presence. This is not as unreasonable as it sounds for the alpha rhythms do in fact disappear in states of functional alertness and attention, and the brains of people without alpha rhythms seem to be involved perpetually in the manipulation of visual images. Secondly, the alpha rhythms are usually complex; three or four linked but independent rhythms can often be identified in different brain regions. Third, the alpha waves are not stationary — they sweep over or through the brain. In normal people the direction of sweep is usually from front to back during rest with the eyes shut, but the pattern is broken up and complicated by mental or visual activity. In patients with mental disturbances of the neurotic type the direction of sweep is often reversed to back-to-front, and this effect has been seen for a period of a few months in normal people under severe mental stress. Apart from major disturbances of this sort, the frequency and phase relations of the alpha process are so constant, even in variations of age and temperature, that one is tempted to consider them as ultra-stabilised and to search for a purpose or primary function for them.
Today’s products are often smart (controlled by microprocessors), aware (full of sensors), and connected (to each other and to cloud-based services). These products and services, and our interactions with them, generate increasing volumes of data—just when computer processing is becoming an on-demand utility and pattern-finding software (AI) is advancing.
Today’s designers must consider how information flows through these systems, how data can make operations more efficient and user experiences more meaningful, and how feedback creates opportunities for learning. Knowledge of cybernetics can inform these processes.