In this conflict of goals between the servo-mechanism and the house occupants, an interesting observation can be made. Man-made instruments, whether they are simple thermostats or highly complicated and responsive computer systems, are constructed to standards considered relevant by design engineers. Design engineers assign a high value to the mechanisms they create; thus a higher value is placed on the heater than on the people who have to try to survive the winter in the house. A servo-mechanism of this kind is shown in Figure 1:2A. Human systems, on the other hand, are more accustomed to taking risks of varying kinds. A heater will be pushed beyond its means to satisfy a condition of cold weather. An automobile will be run at a speed greater than it was designed for. An army will be ordered to march farther than it is believed armies can march. A human mechanism is illustrated in Figure 1:2B.
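The servo-mechanism described above can be sketched as a simple on/off control loop. This is only an illustrative model: the set point, dead band, and heat-gain and heat-loss rates are assumed numbers, not figures from the text.

```python
# A minimal sketch of a thermostat servo-mechanism: the heater is
# switched on or off to hold the house near a set point. All constants
# (set point, dead band, heat rates) are illustrative assumptions.

def thermostat_step(temperature, set_point, heater_on):
    """Decide the heater state for the next hour (simple on/off control)."""
    if temperature < set_point - 0.5:   # too cold: switch heater on
        return True
    if temperature > set_point + 0.5:   # warm enough: switch heater off
        return False
    return heater_on                    # inside the dead band: no change

def simulate(hours, set_point=20.0, outside=5.0):
    """Run the control loop for a number of hours; return final temperature."""
    temperature, heater_on = 15.0, False
    for _ in range(hours):
        heater_on = thermostat_step(temperature, set_point, heater_on)
        # the heater adds heat; the house always loses heat to the outside
        temperature += (2.0 if heater_on else 0.0) - 0.1 * (temperature - outside)
    return temperature
```

The mechanism pursues only the goal its designers built in: holding the set point. It has no way to weigh that goal against any other, which is precisely the contrast with human systems drawn above.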
Risks can be considered either as foolish efforts to go beyond the limits of any one system, with only occasional successes, or as decisions by complex systems that other conditions will help offset the risk anticipated. For example, it may be believed that an army can march only ten hours at a time. This belief is based on information about the marching time of armies – that is, "x" number of armies have been able to march only ten hours at a time. But a General may dispute this information if he believes that his soldiers are willing to try to exceed the ten-hour maximum, if he sees advantages in the terrain he must traverse, or if he believes the data are based on armies not as well equipped or managed as his own. Consequently, he takes a risk in the hope that the information he has about his own force, however incomplete, is more relevant than the data he has received about other armies.
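The General's reasoning can be sketched as weighing local advantages against the reported limit. The factor names and the numbers are hypothetical, introduced only to make the structure of the decision visible.

```python
# A hypothetical sketch of the risk decision described above: the
# reported limit ("armies can march only ten hours") is set against
# local factors the General believes will offset the risk.

def decide_to_exceed(local_factors, threshold=1.0):
    """Exceed the reported limit only if local advantages outweigh it.

    local_factors maps each believed advantage to an estimated gain in
    extra marching hours; names and values are illustrative assumptions.
    """
    expected_gain = sum(local_factors.values())
    return expected_gain > threshold

# Example: the General credits morale, terrain, and equipment.
# decide_to_exceed({"morale": 1.0, "terrain": 0.5, "equipment": 0.5})
```

The point of the sketch is that the risk is not a blind gamble: it is a decision that other conditions, however roughly estimated, offset the anticipated cost.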
The General in the case above may be said to be "aware" of the capacities of his soldiers and the task they face together. "Awareness" is based on a superior amount of information. As the army begins its march, the General can watch very carefully the progress the army makes and attempt to weigh that progress against his original awareness. In cybernetic terms, the General "scans" the former information – his awareness and the risk he has taken – against the incoming flow of results. At some stage he should become "conscious" of the real possibilities – whether to continue, or to modify the goal of the march in some way. In effect, by becoming "conscious" of alternatives he can make sensible decisions about future strategy.
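The scanning described above can be sketched as a loop that revises a prior estimate with each incoming observation. The update rule (a simple exponential blend) and all the figures are assumptions chosen for illustration, not anything the text prescribes.

```python
# A sketch of the "scanning" loop: a prior estimate of marching
# capacity is repeatedly compared against observed progress and
# revised. The weight and all figures are illustrative assumptions.

def scan(prior_estimate, observations, weight=0.3):
    """Blend a prior estimate with incoming results (exponential update)."""
    estimate = prior_estimate
    history = []
    for observed in observations:
        # each observation pulls the estimate toward what is actually
        # happening on the march, without discarding the prior entirely
        estimate = (1 - weight) * estimate + weight * observed
        history.append(estimate)
    return estimate, history

# Example: a prior of ten hours, revised by four days of observed marches.
# scan(10.0, [11, 12, 12, 13])
```

If the observed marches exceed the prior, the estimate drifts upward: the scanner becomes "conscious" that the system can do more than the original information suggested, which is the knowledge the next paragraph credits to the risk taken.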
Risk, in this case, has promoted knowledge and opportunities for action that did not exist previously. Moreover, risk has made the General "conscious" of the capacities of the military system to move across terrain. He would not have known this had he used only the information available at the start (that is, that armies could march only ten hours per day).
Consciousness of the potentials of a system at any one time involves certain "costs" to the system. In the case of the General, the decision to try to exceed the known capacities of the system involved committing the system to a policy that could have been dangerous. The General had to demonstrate not only the courage to make the decision but also a willingness to continuously sample the new information he was gathering in order to adjust his decision if necessary. As the discussion of cybernetic analysis unfolds, the importance of consciousness in systems will be expanded.