
The primary precept of the treatise is that large, complex systems are extremely difficult to design correctly despite the best intentions, so care must be taken to design smaller, less complex systems, and to do so with incremental functionality based on close and continual contact with user needs and measures of effectiveness. The mindset behind Unix is similar. This is the Unix philosophy: write programs that do one thing and do it well.

Write programs to work together. Write programs to handle text streams, because that is a universal interface. To respond to reality, a system would need the capacity to perceive the situation as it actually is, which is often not the case.

For systems, "The real world is what is reported to the system" p. I think this is one of the most striking lessons of this book. It is astonishing how distorted the views of people working within a company, especially in higher positions, can become, and this precludes making any significant changes in the company. This is also one of the most significant parallels between human and software systems. A software system is also only as good as its fidelity to the real world. Some systems keep on running for a long time as their records of reality and the real world itself drift apart.

The author also has a really nice name for the ratio of the reality that reaches the administration to the reality impinging on the system: the coefficient of fiction. What about intervening in systems as an outsider? Can one judge their behavior from the outside, without the tinted glasses through which the system sees the world, and improve the system? According to the author, this is possible only if the system already worked at some point. One cannot build a complex system from scratch and expect it to work; this is not possible.


A working complex system can be achieved only by starting with a small system and growing it. This is another parallel to software development: large systems that are designed without one line of code being written, and are then built by separate teams, face huge problems when the time for integration comes. This insight has led to the agile movement, which aims for always-working software that is integrated as frequently as possible. What's more, software teams face a similar issue.

Gathering a large number of developers together and telling them to build a specific thing does not work either. The best approach is to start with a relatively small team, see what works and what doesn't, establish a culture, and grow around it. How systems deal with errors, or fail to do so, is one of the most relevant parts of the book for modern technological systems.


Because systems tend to grow and encroach, they have an infinite number of ways in which they can fail. These ways, and the crucial variables which control failure, can be discovered only once the system is in operation, since the functionality of a complex system cannot be deduced from its parts, but only observed during actual functioning. These points lead to the conclusion that "Any large system is going to be operating most of the time in failure mode" p.

It is therefore crucial to know, and not dismiss as merely extraordinary, what a system does when it fails. This is not so easy, however, since, as per the coefficient of fiction, it is difficult for a system to perceive that it is working in error mode, which leads to the principle that "In complex systems, malfunction and even total nonfunction may not be detectable for long periods, if ever" p. If it is so difficult to design systems that work, and to keep them working as they grow, how are we supposed to live with them? The obvious first step is to avoid introducing new ones; that is, "Do it without a system if you can" p.
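The point about undetectable malfunction maps directly onto modern operational practice: a service can keep returning plausible output long after it has stopped doing useful work, which is why operators add explicit liveness signals instead of trusting the mere absence of errors. A minimal sketch of the idea, in which all names and the timeout value are hypothetical rather than taken from the book:

```python
import time

# A minimal heartbeat check: the worker records when it last made real
# progress, and a monitor flags it as silently failed once that timestamp
# goes stale. Absence of crashes is not evidence of health.

HEARTBEAT_TIMEOUT = 30.0  # seconds without progress before we assume failure


class Worker:
    def __init__(self):
        self.last_progress = time.monotonic()

    def do_unit_of_work(self):
        # ... real work would happen here ...
        self.last_progress = time.monotonic()  # updated only on real progress


def is_silently_failing(worker, now=None):
    """True if the worker has made no progress within the timeout."""
    now = time.monotonic() if now is None else now
    return (now - worker.last_progress) > HEARTBEAT_TIMEOUT


worker = Worker()
worker.do_unit_of_work()
print(is_silently_failing(worker))                             # fresh: False
print(is_silently_failing(worker, now=time.monotonic() + 60))  # stale: True
```

The design choice worth noting is that the heartbeat is tied to *progress*, not to the process being alive; a hung loop that never calls `do_unit_of_work` is exactly the "nonfunction" Gall warns may otherwise go unnoticed.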

If you definitely have to use a system, though, the trick is to design the system with human tendencies rather than against them. In technological systems, this is understood as usability. Another principle is to design systems that are not too tightly coupled in the name of efficiency or correctness.
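The loose-coupling principle has a direct software analogue: components that communicate through a buffer rather than by calling each other directly can fall behind, restart, or be replaced without the other side changing. A small illustrative sketch (the function and variable names are mine, not the book's):

```python
from queue import Queue

# Loose coupling, sketched: the producer knows only about a queue, not the
# consumer. Either side can be swapped out or run at its own pace; the
# buffer absorbs the slack instead of the coupling transmitting every hiccup.


def produce(events, buffer: Queue):
    for event in events:
        buffer.put(event)  # no direct call into the consumer


def consume(buffer: Queue, handle):
    while not buffer.empty():
        handle(buffer.get())  # the consumer drains at its own pace


buffer = Queue()
produce(["a", "b", "c"], buffer)
seen = []
consume(buffer, seen.append)
print(seen)  # -> ['a', 'b', 'c']
```

A tightly coupled version, where the producer calls the consumer synchronously, is marginally more efficient but fails whole whenever either side fails, which is precisely the trade-off the next principle names.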

This is stated as "Loose systems last longer and function better" p. After reading the book, and writing this review, I have only one question in my mind: why does this book exist? How can it exist? How can it be that so many mind-blowing insights about technological systems were derived by an MD, and recorded in an obscure 80-page book sometime in the seventies? And which other books exist out there that are as good as this one, and are not yet discovered?

Dec 26, Ushan rated it really liked it. Large technological and social systems lose track of their original purpose and turn self-serving; they do not function as designed because their creators forgot Le Chatelier's principle and were unaware of various feedback loops.

The process of observing the systems changes them. Passive safety is better than active safety; when used mindlessly, safety devices and procedures themselves become safety hazards. The examples of systems gone bad are great. An enormous hangar designed to house space rockets and protect them from the elements generates its own weather, so it can rain inside it upon the rockets. When the Fermi I experimental breeder reactor suffered a partial meltdown, radioactive sodium was drained and a complicated periscope and pincers were lowered into it; it was found that a foreign object had blocked the flow of sodium, and the object was later identified as a safety device installed at the very last moment and never documented. (Perrow also tells this story; Gall is mistaken in calling it an anti-meltdown device.) A Peruvian railroad replaced its steam locomotives with diesel ones; they discovered after the fact that diesel locomotives lose most of their power at Andean altitudes, unlike steam ones, but instead of going back to steam, the Peruvians simply used two diesel locomotives where one steam locomotive had sufficed before.

The Nile used to flood annually and fertilize the Egyptian fields; Nasser built the Aswan dam, which stopped the flooding; the dam produces electricity, which is used to make artificial fertilizer. (J. McNeill also tells this story in his environmental history of the twentieth century.) The examples of ignored feedback are also nice. The Green Revolution caused third-worlders to go hungry as before - but at much higher population densities.

Widespread application of antibiotics caused antibiotic-resistant germs to emerge. On the other hand, there is the Washington, D.C. example: "It kept pilots alert". Every engineer could cite many examples of systems gone bad. So could everybody interested in politics. I wonder if politically Gall is a Reaganite; certainly his book made me think of Reagan's famous remark: "My friends, some years ago, the Federal Government declared war on poverty, and poverty won."


The United States Federal Government, on the one hand, subsidizes farmers and tries to keep food prices and demand for food high; on the other hand, it issues food stamps to poor people because food prices are too high; and on the third hand, it combats obesity through the National Institutes of Health.

The EU countries' governments give out large amounts of aid to poor countries, yet impose high tariffs on agricultural imports from them. Like Stanislaw Lem's King Murdas, they are examples of systems so large that their various parts have minds of their own, sometimes contradicting the minds of other parts. I read this book on a Saturday afternoon. Small book, amusing writing, easy to follow.

This book was published in the seventies, so don't be surprised if some of the examples, and some of the language, are somewhat dated. The author attempts to be both amusing and academic in his approach. I find most academic writing to be dry and overly intellectual. While the intellectual aspects of this book annoyed me to some degree (otherwise it would have 5 stars), the humor does shine through.

What are the common characteristics of systems? What things can we say about ALL of them? So much goes wrong, in fact, that we consider it normal. Systems tend to spend more time in "failure mode" than in proper working order.


And attempts to remedy the situation, usually by making the system more sophisticated, only increase the probability that it will be in failure mode at any given time. As someone who creates systems for a living (I'm a programmer), I tend to think that just a few more tweaks to my shiny new system are all that is needed to get it working properly and reliably.

Instead, I need to be designing systems that are as easy as possible to clean up after when they do fail, and the consequences of said failure need to cause as little pain as possible. Because the only thing you can count on is that the system WILL fail, at least part of the time. I needed a book to tell me this? It's full of things which should be "duh!" obvious. But going in, you will probably find yourself with a lot of "AHA!" moments instead. Spend an hour or two with this one.
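Designing for cleanup rather than for perfection can be made concrete with a small pattern: wrap any step that acquires or creates something so that its undo action runs even when a later step blows up. This is only one possible sketch of the reviewer's point; the helper name and the logged strings are illustrative, not from the book:

```python
from contextlib import contextmanager

# Design for failure: pair every "do" with an "undo", and guarantee the
# undo runs when the work inside the block fails. The failure still
# propagates, but it leaves as little mess behind as possible.


@contextmanager
def undoable(do, undo):
    do()
    try:
        yield
    except Exception:
        undo()  # failure path: put things back before re-raising
        raise


log = []
try:
    with undoable(lambda: log.append("created"), lambda: log.append("removed")):
        raise RuntimeError("a later step failed")
except RuntimeError:
    pass
print(log)  # -> ['created', 'removed']
```

Nesting several `undoable` blocks unwinds them in reverse order, which is the everyday version of "plan for the failure mode first".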


You won't regret it. You'll probably have a couple of laughs. And sometimes, we just need to have those things we've been feeling, down in our bones, publicly stated. The hot air balloon was invented by a couple of paper makers (the Montgolfier brothers). And the airplane was invented by a couple of bicycle makers. Not by some organization.

Nov 01, Nathan Glenn rated it it was amazing. Not what I expected, but still very relevant. I expected something very academic and mathematical.

The author claimed many times that his principles were "axioms", and that they were pristinely mathematical in nature and all self-evident. This was a rather annoying claim, since the book was not mathematical at all, nor were the axioms necessarily self-evident (though good supporting examples were provided). Despite this, it all still rings perfectly true.

A system can be a blessing or a curse, but it is guaranteed to have unexpected behavior. When it does something bad, you'd better hope that your system is flexible, changeable, and somehow objectively monitorable, and that it doesn't completely dominate everything and allow only positive feedback. The style is kind of a mix of Taoist philosophy, design, a tiny bit of math (really, barely any), common sense, self-improvement, endearing Latin textbook, and more.

Lately I'd been thinking about all of the generalizable things I'd learned from programming, and especially my strengthened dislike of large bureaucracies. All of that is abstracted out of software design and into the real world in this book, which is really quite phenomenal.

Software happens to be just one type of system.

Feb 10, Raziel rated it liked it. "At best a nuisance, at worst a menace, on certain rare occasions a godsend." A person who knows all the facts except how the System fits into the larger scheme of things.


One who never makes small mistakes while moving toward the grand fallacy (McLuhan). For purposes of recognition, a System-person is someone who wants you to really believe, or worse, really participate in their System. A set of parts coordinated to accomplish a set of goals. Now we need only to define "parts", "coordinated", and "goals", not to mention "set".

This book was first published in the seventies and has gone through several printings. It is a serious book that sometimes masquerades its points with humor.


The general theory supported in the book, and two representative corollaries of it, are stated up front. Human failure, while a part of many systems, is not claimed as the underlying cause; instead it is suggested that the observed difficulties are intrinsic to the system's operation. This theory is hard to validate given our current focus on human error in design and operation. Difficulties notwithstanding, it appears others have attempted to carry this work forward. This is a weird book. I found out about it by accident, and read it on a whim.



The book mainly covers how most systems don't work, or work mainly for their own ends rather than the ends set out at the system's inception. It does this through a series of maxims which define general systems behavior. Oftentimes the book is irreverent (a lame joke about mental retardation is contained within the first chapter), and the approach isn't exactly scholarly, but it's hard to ignore the basic common sense of what Gall writes.

It has piqued my interest in systemology, regardless, and presents a point of view that I hadn't encountered before. In case you were wondering, there doesn't seem to be a definite political bias (aside from an apolitical malaise about human-made systems), although the book could certainly be read from any number of perspectives.

Sep 10, Lou Cordero rated it really liked it. The copy I read is subtitled "How systems work and especially how they fail". A wonderful, easy read that sheds light and humor on the development of complex systems, and on the impossibility of solving the problem correctly and completely.

I recommend this book to anyone involved in the design of complex systems.

May 26, John rated it it was amazing. I would like this book to be required reading for all high school or college students. It would help dispel the now-widespread, unhealthy blind faith in "systems." A large system (Congress, for example) never does what it says it does. Large systems have their own goals. It is very witty and full of usable wisdom.

Jun 05, Mark Sanchez rated it did not like it. Funny at times, but I'm not sure there was actually much I could take away from it. I did like the use of very short chapters.

Presented in a very humorous and entertaining way, this book is packed with ideas that make you stop and think. Why don't things work the way you expect them to? Well, this book will tell you. It might seem discouraging to learn that a "Complex System cannot be 'made' to work. It either works or it doesn't," but when you think about it, it is easier to align your system with human motivational vectors (principle 31) than to keep banging your head against fundamental systems laws.

And never forget that "systems will display antics." This book might make you a little cynical, but you'll probably get further ahead in the world if you understand why you are feeling frustrated by the systems you are in.

Sometimes a dose of realism is good for us, right?

Dec 10, Deane Barker rated it it was ok. This book infuriated me. I wanted a serious discussion of systems theory, but what I got was a Dilbert-esque attempt at comedy. The book is incredibly hard to follow. I started off diligently trying to highlight stuff and make sense of it, but the writing is scattered and going for laughs most of the time. I gave up trying to treat it as a cogent discussion of anything. If you want to read this book, seriously, just read that appendix and skip all the text that comes before it.

Jan 08, Mack Clair rated it liked it. The actual book as such occupies only a portion of the page count; the remainder is spent on appendices containing bibliographies, indices, and the like. In the age of the internet, humans are more intimately involved in systems than ever, and interact with systems of unprecedented complexity and size. An interesting quick read that simply observes the world and takes it very literally. Even though it was published 40 years ago, we still end up dealing with the same problems in software development and maintenance. If I had to sum it up in a few sentences: