Why Systems Fail and Problems Sprout Anew: Commentary on the principles of 'Systemantics'

Review of the book: Systemantics: how systems work... and especially how they fail, by John Gall. Pocket Books, 1978.

Introduction

At last those concerned with social change have a basic textbook to explain why 'things generally are indeed not working very well' despite our many efforts. As is remarked on the cover: 'Have you ever wondered why the unsinkable Titanic sank... or the poor in India eat better bread than the rich in America... or hospital patients are blamed for not getting well... or why, in general, things that don't work badly don't work at all?' Similar questions are of deep concern to those working in international organizations.
The author, John Gall, explains his point of departure in the following words:
'The religious person may blame it on original sin. The historian may cite the force of trends such as population growth and industrialization. The sociologist offers reasons rooted in the peculiarities of human associations. Reformers blame it all on 'the system', and propose new systems that would, they assert, guarantee a brave new world of justice, peace, and abundance. Everyone, it seems, has his own idea of what the problem is and how it can be corrected. But all agree on one point - that their own system would work very well if only it were universally adopted.
The point of view espoused in this essay is more radical and at the same time more pessimistic. Stated as succinctly as possible: the fundamental problem does not lie in any particular system but rather in systems as such. Salvation, if it is attainable at all, even partially, is to be sought in a deeper understanding of the ways of systems, not simply in a criticism of the errors of a particular system'. (page 16)
Gall's book takes the reader step by step through a series of explanations necessary to an appropriate understanding of 'how systems work... and especially how they fail' (the subtitle of the book). For as he says, 'men do not yet understand the basic laws governing the behavior of complex organizations'. Some of the axioms that he has so cleverly grouped together have been known to us or have formed the subject of secret suspicions we have shared in confidence with close friends. But here we find these matters brought into the open at last in 'a first approach' to a systematic exposition of the fundamental principles - the first attempt 'to deal with the cussedness of systems in a fundamental, logical way, by getting at the basic rules of their behavior'.
He cites with humble gratitude the giants who paved the way for his efforts:
  • Murphy: 'If anything can go wrong, it will'.
  • Korzybski, author of General Semantics, who contributed 'a vaulting effort at a comprehensive explanation of Why Things Don't Work'; and not forgetting
  • Potter, author of One-upmanship; nor Parkinson (awarded the Nobel Prize in 1977 by the Association for the Promotion of Humour in Intentional Affairs), author of Parkinson's Law and other studies in administration, whose central premise was that 'Work expands to fill the time available'; nor
  • Peter, author of The Peter Principle: that 'People are promoted up to the level at which they function incompetently'.
'Systemantics' is such an essential work for those working in (and especially with) international organizations that it is important that they should not be discouraged by any belief that it is primarily concerned with matters outside their frame of reference. For this reason we list below the 'Basic Systems Axioms, etc' from the book with indications as to how (in the reviewer's opinion) they relate to the domain of international organizations in particular (rather than to the full range of systems created by humans, for such is the wide applicability of the author's insights). It is however essential to read the text to gain a full understanding of the application of these principles and all the consequences resulting from them.

Gall's Basic Systems Axioms

1. Systems in general work poorly or not at all
This is almost self-evident to those with any experience of the international system, its sub-systems, or of efforts to set up world-wide systems to solve key world problems. Practitioners would undoubtedly feel more at home with one of his alternative formulations: Nothing complicated works.
2. New systems generate new problems
This principle, known to many of us, has never been admitted by international organizations. It is always assumed (or desperately hoped) that a new system will eliminate more problems than it generates - and that the latter, if present, will be the responsibility of some other organization or department. Gall is able to demonstrate that the new situation is in fact much worse than the old because people come to rely on the system's supposed ability to eliminate problems.
3. Systems operate by redistributing energy into different forms and into accumulations of different sizes
With a brilliant stroke of genius the author was able to deduce from the previous principle that the total problem complex facing the human community is unchanged by organized intervention - the problems merely change their form, their distribution and their relative importance - namely that: The total amount of anergy in the universe is fixed. The new term 'anergy' is defined as 'any state or condition of the universe, or any portion of it, that requires the expenditure of human effort or ingenuity to bring it into line with human desires, needs, or pleasures'... namely a problem. In his own explorations of these fundamental questions this reviewer has noted that:
'Frequently a social problem can be eliminated to the satisfaction of all concerned (from the electorate to the policy-maker) by eliminating the particular set of symptoms by which it was recognized and which gave rise to the call for remedial action. Action of this kind merely ensures that a new set of symptoms emerges in some other social domain. The new set may well be considered more acceptable or may be less easy to focus on as the basis for an effective campaign for remedial action. Some time will also be required before the new set of symptoms can be effectively recognized.
It may in fact be very difficult for an organization to see that its programs merely displace a problem into the jurisdiction of some other body - whose own actions will eventually result in the problem being displaced back again or into the jurisdiction of a third body. (Institutions may deliberately move problems through a network of jurisdictions as a way of legitimating their own continued existence.) Such displacement may be difficult to detect because one set of symptoms may be apparent in legislation (e.g. legal discrimination), but when eliminated may then take on an economic character (e.g. economic discrimination), which if eliminated may then take on a social character (e.g. social discrimination), and then a cultural character, etc. Such displacement chains may loop back on themselves and develop side chains which are difficult to detect, since each organization is insensitive to the problem symptoms in its own domain and considers symptoms of the same problem in other domains to be acceptable or of secondary importance' (1).
To the extent that this is correct, it is certainly difficult to establish that the underlying problem matrix has been reduced by 'success' with a particular problem.
4. Systems tend to grow, and as they grow, they encroach
Here again those familiar with international agencies have been exposed to a multitude of cases of encroachment by one agency (or more) on another. As Hasan Ozbekhan put it with regard to subsystems, during an OECD Symposium on Long-range Forecasting and Planning:
'In every instance we might name, the same dynamics appear to be at work: a reflexive attempt on the part of each major institution to expand its planning over the space of the whole system... This almost subconsciously motivated attempt, that of a sector to expand over the whole space of the system in its own particular terms and in accordance with its own particular outlooks and traditions, compounds the problem by further fragmenting the wholeness of the system' (2).
Gall suggests that the above principle should be extended to: Systems tend to expand to fill the known universe. Known to them, might be an appropriate qualifier. And indeed one may suspect that many international organizations consider that they have a right to preoccupy themselves with any problem known to them in whatever domain, irrespective of any other organization's actions. This has been remarked upon with respect to practitioners of disciplines: 'It would be rare indeed if a representative of any one of these disciplines did not feel that his approach to a particular organizational problem would be very fruitful, if not the most fruitful' (3).
5. Complex systems exhibit unpredictable behaviour
Many strange tales circulate within the international community concerning peculiar happenings which are treated as normal, and inconsistencies which are accepted without a qualm. At the time this is being written, for example, there is a proposal for a full UN General Assembly debate on UFOs, following an extensive debate in 1977 by the UN Special Political Committee. If it is accepted, more time will have been given to the matter than has ever been given to international NGOs. It would indeed have been difficult to predict such behaviour in 1976.
Is one to assume that UFOs are more visible, or less obscure entities, within UN circles - namely that UFOs have greater political impact? Or that the UN finds it safer to debate extra-terrestrial rather than terrestrial matters - especially since there seems little danger of pressure group action from the group in question? Or are Member States dismayed at the UFOs' fulsome demonstration of the transnational spirit - in their apparent disregard for the sacred boundaries of sovereign States? Or perhaps it is the 'proliferation' of UFOs which is troubling the UN - as in its dealings with NGOs?
6. Complex systems tend to oppose their own proper function
Otherwise known as Le Chatelier's Principle, this has been described by Stafford Beer as follows:
'Reformers, critics of institutions, consultants in innovation, people in short who 'want to get something done', often fail to see this point. They cannot understand why their strictures, advice or demands do not result in effective change. They expect either to achieve a measure of success in their own terms or to be flung off the premises. But an ultrastable system (like a social institution)... has no need to react in either of these ways. It specialises in equilibrial readjustment, which is to the observer a secret form of change requiring no actual alteration in the macro-systemic characteristics that he is trying to do something about' (4).
Gall himself considers it to be a manifestation of a widespread phenomenon known as 'administrative encirclement', whereby, for example, the administrators 'whose original purpose was to keep track of writing supplies for the professors, now have the upper hand and sit in judgment on their former masters'.
7. People in systems do not do what the system says they are doing
It has long been evident to those concerned with the international system that the people in the agencies are not engaged in action to remedy world problems - as the systems would claim - but rather in administrative preoccupations whose relationship to such problems may be remarkably tenuous. As Gall says, 'the larger and more complex the system, the less the resemblance between the true function and the name it bears'.
8. A function performed by a larger system is not operationally identical to the function of the same name performed by a smaller system.
Gall explains this with the problem of obtaining a fresh apple. The larger and more complex the delivery system, the less likely it is that the apple will be as fresh as if picked from the garden by oneself. From which he deduces a point of the utmost importance for international action, and for the new world order, namely that most of the things we human beings desire are non-systems things - but the system has other goals and other people in mind.
9. The real world is whatever is reported to the system
This is a point which has been explored in depth by Kenneth Boulding in his famous book 'The Image' (5). Reality becomes the image of reality, however poorly it is represented. There are many examples of this within the international system which has a remarkable capacity for 'discovering' some new principle or truth long after it has been current in the wider society. As Gall remarks: 'to those within a system, the outside reality tends to pale and disappear'. This weakness is reinforced, perhaps deliberately, by the system's complex reporting procedure - which is often so cumbersome that it is always able to claim plaintively 'we were not informed', in cases when it did not want to be informed. Gall describes a significant breakthrough by which the 'amount of reality' reaching an administrative officer can be indicated with precision.
10. Systems attract systems people
Not only, as argued above, do the international systems isolate those who work within them by (a) feeding them a distorted and partial version of the external world, and (b) giving them the illusion of power and effectiveness, they also attract people with attributes for success within the system (irrespective of the problems with which it is supposedly concerned), or who are able to thrive parasitically at the expense of the system. Gall goes to the heart of the matter when he points out that only the ancient Egyptians had a solution to this problem: each job was represented by two people - the honorary officeholder, and the actual executive.
11. The bigger the system, the narrower and more specialised the interface with individuals
The irony of the opening words of the UN Charter has often been pointed out in this context ('We the peoples...'). Gall argues that in 'very large' systems, the relationship is not with people but with social security and sundry other numbers. But in really large systems, there is no relationship at all. What hope would there be with a 'world government', or a new world order, when the 'people organizations' are those most neglected by such large systems?
12. A complex system cannot be 'made' to work; it either works or it doesn't
There is still a widespread belief that a complex international system can be made to work by appropriately tinkering with its components and their linkages. New factions are constantly putting forward claims that they know how to make it work. A lot of hope is put into the possibility that one of them may be lucky - a lot of time is also wasted in anticipation of such an improbable event.
13. A simple system may or may not work
Those simple systems that work within the international community are 'rare and precious additions to the armamentarium of human technology. They should be treasured'. Unfortunately, Gall notes, they are often characterised by instability requiring special skill in their operation. Replacing 'the crazy genius in a smoke-filled attic' by a computer program to handle some complex scheduling job may lead to a very expensive disaster.
14. If a system is working, leave it alone
Gall notes that 'Although many of the world's frustrations are rooted in the malfunctions of complex systems, it is important to remember that some complex systems actually function'. When this occurs, 'humble thanks' should be offered.
15. A complex system that works is invariably found to have evolved from a simple system that works
See under point 16.
16. A complex system designed from scratch never works and cannot be patched up to make it work; you have to start over, beginning with a working simple system.
The author claims to have searched diligently for exceptions to these two axioms, but without success. The League of Nations? No. The United Nations? Hardly. Nevertheless, the conviction persists among some that a working complex system will be found somewhere to have been established 'de novo, from scratch'. There is still hope for the New International Economic Order.
17. In complex systems, malfunction and even total nonfunction may not be detectable for long periods, if ever.
Again those familiar with international agencies will not be surprised by this. Major international programmes have operated for decades before being proved a complete failure. On a much smaller scale there is the delightful story of the office tucked away in a major agency which for many years prepared periodic issues of a 'current bibliography' with regular budgetary approval. No provision had ever been made, however, for the publication and distribution of the successive issues prepared and no one was aware of the work done, or made any use of it.
18. Large complex systems are beyond human capacity to evaluate
In support of this Gall cites C W Churchman:
'In general, we can say that the larger the system becomes, the more the parts interact, the more difficult it is to understand environmental constraints, the more obscure becomes the problem of what resources should be made available, and deepest of all, the more difficult becomes the problem of the legitimate values of the system' (6).
19. A system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.
The inertia of large bureaucracies is a well-recognized phenomenon. This does not prevent their advocates from believing that such agencies are well able to adjust rapidly to changing circumstances - to a crisis of multiple crises, for example. Donald Schon has drawn attention to the fact that many organizations are memorials to old problems.
20. Systems develop goals of their own the instant they come into being.
And such goals can be only indirectly related to those for which the system was established. This is a reason to be concerned with plans to create a world government to solve problems we have not been able to handle nationally. Bigger systems do not necessarily lead to better solutions.
21. Intra-system goals come first
Gall notes:
'The reader who masters this powerful axiom can readily comprehend why the United Nations recently suspended, for an entire day, its efforts at dealing with drought, detente, and desert oil, in order to debate whether UN employees should continue to ride first class on airplanes'.
There are other, and more biting, examples of this point.
22. Complex systems usually operate in failure mode
Clearly the more complex the system, the more probable it is that some parts will be under repair, 'unavailable', or on holiday. The appropriate question is then not how an international agency ought to function, but how it actually functions in the normal absence of some parts (especially during the holiday months June to September, for example, or before the end of the post-prandial coffee break). This corresponds to the reviewer's insight, following a recent visit to a developing country, that we should primarily be concerned with inter-system conditions, namely those not covered by working systems for whatever reason. Organized chaos can be most instructive, particularly as a model for the post-petroleum epoch.
23. A complex system can fail in an infinite number of ways
Those who recognize the possibility of failure cannot hope to design effectively against it, as has been shown time and again. It might almost be said that such systems generate new methods of failure and educate people into increasing acceptance of them. In fact the international system may be characterised by the contrast between the extraordinarily high expectations of those who do not know its limitations and the extraordinarily low expectations of those who do.
24. The mode of failure of a complex system cannot ordinarily be predicted.
Donald Schon pointed out that the institutional complex that is supposed to contain the problem complex is in fact always out of phase with it. The implication is that a completely new approach is required, relying heavily on a network of bodies so constituted that it can rapidly restructure itself in response to any new problem configuration. The current institutional heavy artillery is just not sufficiently manoeuverable in a moving battle in difficult terrain.
25. The crucial variables are discovered by accident
Gall points out that the moment an institution is established to research into a new problem we are immediately faced with all the systems characteristics noted above. It is seemingly impossible for the system to achieve its goal - unless there is a 'happy accident' of which there are many well-known examples (e.g. the discovery of nylon). In fact the crucial variables tend to be discovered by those with the 'wrong' education, the 'wrong' institutional framework and usually without intending to do so. Perhaps this is a good reason for encouraging a proliferation of organizations with strange preoccupations.
26. The larger the system, the greater the possibility of unexpected failure
Those concerned with a new world order, or the possibility of world government must face up to this.
27. 'Success' or 'function' in any system may be failure in the larger or smaller systems to which it is connected
This is a most important point for those who rely on the indicators designed, and provided, for the system they work in. However successful it may appear, or however much progress is regularly reported, the system may in fact merely be functioning as a problem reprocessing machine. Such machines take in problems of one type and transform them into problems of another type (by 'solving' them). The new problems are not perceived as such, however, because they are carefully designed to be undetectable to the indicators of significance to the system. Alternatively they may be so well packaged and labelled that they are even claimed as positive contributions to society.
28. When a fail-safe system fails, it fails by failing to fail safe.
This is of course a point which has been well-recognized by those involved in the international campaign against nuclear energy and weaponry. But it can also apply to bureaucratic procedures with special escape clauses to safeguard against failure to deal with (urgent) humanitarian cases.
29. Complex systems tend to produce complex responses (not solutions) to problems.
World problems have given rise to very complex legal and instrumental responses, but it is certainly not clear that remedial action is achieving its aims - at least if one looks beyond the literature put out by public relations departments or the documents governed by the bureaucratic 'positive/optimistic' standard of reporting (with appropriate suppression of inconvenient facts).
30. Great advances are not produced by systems designed to produce great advances.
This follows from point 25. Gall points out: 'Systems can do many things, but one thing they emphatically cannot do is to solve problems. This is because problem-solving is not a systems-function and there is no satisfactory systems-approximation to the solution of a problem. A system represents someone's solution to a problem. The system does not solve the problem. Yet, whenever a particular problem is large enough and puzzling enough to be considered a capital 'P' Problem, men rush in to solve it by means of a System'. The international problem-solving institutions, existing or proposed, cannot be taken seriously until the implications of this point are examined. Gall notes that the solutions usually come from bodies whose qualifications would never satisfy a selection committee. If this is the case, and many examples are available, what sort of international network of bodies is required?
31. Systems aligned with human motivational vectors will sometimes work; systems opposing such vectors work poorly or not at all.
There are already a number of examples of powerful international agency information systems that have failed because they ran up against the real priorities and interests of those they were designed to serve.
32. Loose systems last longer and work better
Gall points out that efficient systems are dangerous to themselves and to others whether they survive, attempt to survive, or fail. The notion of a 'loose system' of course approximates the current tentative understanding of a network. How to facilitate network action and network building is something that is regularly explored in these columns. A breakthrough is needed.

In conclusion

The book is fun but also challenging to the reader, who is constantly faced with the question 'just how true is this in fact?' - given the examples cited by the author or known to the reader. That there is an underlying profundity is difficult to deny.
Having been engaged in the production of a Yearbook of World Problems and Human Potential (7), the reviewer was consequently provoked to reflect on the difficulties of designing an adequate response to such problems. This resulted in the production of a document on The Limits to Human Potential (1), which also attempted to grapple with some of the issues so successfully itemized by Gall. Further work on the constraints to action by the international community is required so that less reliance is placed upon out-dated structures, and more adequate ones can be designed.

To complement the 'Systemantics' perspective of Gall, it is appropriate to note the existence of a charming publication by a professor of international economics, Carlo M Cipolla. The editors are indebted to the network of the Association for the Promotion of Humour in Intentional Affairs for informing us of its existence. It has been privately printed under the following title:

The Basic Laws of Human Stupidity

'Human affairs are admittedly in a deplorable state. This, however, is no novelty. As far back as we can see, human affairs have always been in a deplorable state... After Darwin we know that we share our origin with the lower members of the animal kingdom, and worms as well as elephants have to bear their daily share of trials, predicaments, and ordeals. Human beings, however, are privileged in so far as they have to bear an extra load - an extra dose of tribulations originated daily by a group of people within the human race itself. This... is an unorganised unchartered group which has no chief, no president, no bylaws and yet manages to operate in perfect unison, as if guided by an invisible hand, in such a way that the activity of each member powerfully contributes to strengthen and amplify the effectiveness of the activity of all other members. The nature, character and behaviour of the members of this group are the subject of the following pages' (page 5)

Cipolla's Five Basic Laws are:

1. Always and inevitably everyone underestimates the number of stupid individuals in circulation.
2. The probability that a certain person be stupid is independent of any other characteristic of that person (2).
3. A stupid person is a person who causes losses to another person or to a group of persons while himself deriving no gain and even possibly incurring losses.
4. Non-stupid people always underestimate the damaging power of stupid individuals. In particular non-stupid people constantly forget that at all times and places and under any circumstances to deal and/or associate with stupid people infallibly turns out to be a costly mistake.
5. A stupid person is the most dangerous type of person.
The author demonstrates that stupidity is an indiscriminate privilege of all human groups, irrespective of race, class, creed or level of education (including Nobel laureates). It is uniformly distributed according to a constant proportion. He notes: '...The underdeveloped of the Third World will probably take solace at the Second Basic Law as they can find in it the proof that after all the developed are not so developed'.
Unfortunately, Cipolla fails to consider how the world would function without 'stupid people'. For without the problems they create, there would be nothing for the 'non-stupid' people to do. Every action requires an equal and opposite reaction!
Carlo M. Cipolla. The Basic Laws of Human Stupidity. Bologna, The Mad Millers (Imola, Italy: Grafiche Galeati), 1976, 30 p.

Hazards of System Building

by Mathew Melko, System Builder
(Offered to participants at the Foundation for Integrative Education Conference, Oswego, New York, August 1969; 
reproduced in Main Currents in Modern Thought, 26, 2)
1. You identify with your system. It cost you blood to build it, and if it is attacked, it is your blood that is being shed.
2. You cannot tolerate tentativeness, suspension of judgment, or anything that does not fit the system.
3. You cannot apprehend anyone else's system unless it supports yours.
4. You believe that other systems are based on selected data.
5. Commitment to systems other than your own is fanaticism.
6. You come to believe that your system entitles you to proprietorship of the entities within it.
7. Since humor involves incongruity, and your system explains all seeming incongruities, you lose your sense of humor.
8. You lose your humility.
9. You accept all those points - insofar as they apply to builders of other systems.
10. So do I. (P.S. I hope I believe in the cult of fallibility)

References

1. Anthony Judge. Limits to Human Potential. Brussels, Mankind 2000, 1976. (Partially reproduced in International Associations, 28, 1976, 10, pp. 444-6; 29, 1977, 4, pp. 147-150).
2. Hasan Ozbekhan. Toward a general theory of planning. In: Perspectives on Planning. Paris, OECD, 1969, pp. 83-84.
3. R.L. Ackoff. Systems, organisations, and interdisciplinary research. General Systems, 1960, vol. 5.
4. Stafford Beer. The Cybernetic Cytoblast: management itself. September 1969 (Chairman's Address to the International Cybernetics Congress).
5. Kenneth Boulding. The Image. University of Michigan Press, 1956.
6. C. West Churchman. The Systems Approach. Dell Publishing Co, 1966, p.77.
7. Yearbook of World Problems and Human Potential. Brussels, Union of International Associations and Mankind 2000, 1976, 1136 pages. (Now titled Encyclopedia of World Problems and Human Potential, 1994-5).
