The degree of success in planning human/machine systems, and in particular monitoring and control systems, depends entirely on being able to predict the consequences of different design solutions. The ability to predict the consequences of a specific solution depends in turn on the designers' insight, knowledge, and experience of the various components that influence the final function. In a human/machine system (that is, in every system where people work) it is essential to consider people as a significant cognitive and emotional part of the system.

In certain cases, solutions can be found through common sense, but when planning more complex human/machine systems, common sense alone is seldom adequate; indeed, it may produce completely the opposite effect. When planning new, advanced technical solutions, the interaction with people is not always easy to predict. This is in stark contrast to day-to-day situations, where our daily experience provides us with much more information.

In something of a cookbook style, we have attempted to provide the most important basic recommendations concerning people as components in complex human/machine systems. Using the knowledge presented in this book will improve the ability to predict the final functionality of the system, and this should lead to a better working system. In this context, 'better working' means a system whose final result, the final product, is what was originally intended. In many instances, great demands are placed on factors such as production safety, product quality, and general safety. Applying the recommendations presented in this book will, we hope, increase the possibility of achieving these goals.

There are still people who, when planning, do not believe it necessary to take systematic and serious account of the people who will work with and use the machine and its technology. There are those who believe that a human being can be taught to fulfil, and to adapt to, any type of behavioural criterion. This is incorrect. It is important to understand that the human is a highly reliable system component who functions according to predetermined 'laws'; this reliability is considerably superior to that of the technical components. Let us illustrate this point.

A tram quite suddenly makes an unscheduled stop at a very busy crossroads. Why? The least likely explanation is that the operator, the driver, left the tram to pick flowers; it is far more likely that a technical problem has arisen. The probability of total breakdown of the human component is considerably lower than that of the corresponding technical parts. On the other hand, humans display many variations and smaller deviations. This is characteristic of the human factor, and it is also one of the reasons why errors are attributed, often unfairly, to people rather than to the technology that is frequently the real cause. Because technology is often poorly adapted to the individual, the human (with his or her prerequisites and qualifications) has difficulty in meeting certain given performance demands. One often hears about 'human error', but such errors in fact originate in poor technological solutions. These poor solutions can in turn be attributed to the person who designed the system and to that person's lack of knowledge of system design.