What is meant by Essential Complexity?
I have borrowed the term Essential Complexity from "No Silver Bullet", a famous essay written by Fred Brooks in 1986.
In the essay, Brooks argues that business systems are inherently complex, not due to side effects, but because we need them to be. If an application like Microsoft Word needs to do 10,000 different things, then the program must support 10,000 different functions. Microprocessors, on the other hand (such as the Intel Pentium), while seemingly as complicated as a word processor, are in fact quite simple from a functionality perspective: the instruction set they implement is very small.
However, in attempting to implement business functionality (the Essential Complexity), architects often introduce what is known as Accidental Complexity. This is complexity that has nothing to do with the business requirements, but exists due to limitations in existing technology. It is the Accidental Complexity of systems that leads to most of the problems we experience when we are let down by computers: your computer freezes up, you can't get on the network, or a server is unavailable. The annoying virus checker you are forced to run, which slows down your computer and causes all sorts of other problems, is surely another symptom of Accidental Complexity.
What I personally believe, having been a computer user for over 25 years and in the IT industry for over 12 years, is that the proportion of issues pertaining to Accidental Complexity is shrinking, while the proportion pertaining to Essential Complexity is growing. For example, 10 years ago when a Knowledge Worker had an issue with an IT system, chances are it was a "blue screen of death"; or maybe they couldn't log on to a server because another user had inadvertently crashed it; or maybe the entire network was down because it was running on a hub (not a switch) and an ethernet cable was broken. Nowadays those issues are rarer. Workstations running Windows 2000 or XP have protected memory and rarely crash. Modern servers run on virtual machines such as the JVM (Java Virtual Machine) or Microsoft's CLR (Common Language Runtime), which have built-in automatic memory management (illegal memory operations being the most common cause of critical application failures). Server applications now come standard with failover technology, requiring minimal configuration. Networks themselves have few if any single points of failure.
So... our workstations are stable, our servers are available (and so are the applications they run), and our networks are reliable. This is a great accomplishment. However, if Economics 101 has taught us anything, it's that people are never satisfied, a la the Law of Diminishing Marginal Utility. What people are complaining about now are the semantics, or meanings, behind the data and systems. People are now, more than ever, spending time looking at reports and asking what the numbers mean. Furthermore, they are looking at their systems and asking the same kinds of questions: What mission is this system accomplishing? Who uses this system? How often do these business rules get invoked? Are these business rules conducive to customer growth or not? In this regard, most systems are entirely opaque and can only be analyzed at great cost.
These are all questions that, until recently, IT has been happy to defer to "the business prime". Unfortunately, the business prime may not have adequate training or background to answer these questions, let alone manage this level of complexity. These issues then tend to fall into a grey area that nobody feels responsible for, but which affects everybody.
What has happened is that the decline in Accidental Complexity has enabled us to architect and construct systems with even greater Essential Complexity. That Essential Complexity has now spun out of control, and we are in search of new tools and governance to manage it. We are also in search of Peter Drucker's elusive "Knowledge Worker": the business specialist who works in concert with the rest of the corporation and can seamlessly leverage the corporation's knowledge, unobstructed by the limitations of technology.
As many of the issues pertaining to Essential Complexity are on the critical path of Enterprise Architecture and Data Management initiatives, I felt it was an appropriate title for this blog.
Despite what Alvin Toffler may have believed, I am convinced we are barely at the dawn of The Information Age. As John Zachman has stated, we don't even know what The Information Age looks like, because our thinking was born out of The Industrial Age. He compares it to the Cook Island syndrome: the natives of the Cook Islands could not see James Cook's ship approaching for the first time because they had no frame of reference for a sailing ship, and so it never registered until the ship had moored close enough to shore.
I believe we are at the cusp of a new era, which entails a new way of thinking. Think about how people's mindsets were changing at the cusp of the Industrial Revolution, before Specialization became the norm. It is my intent to make this blog an exploration of The Information Age, looking at both closed corporate information issues and open internet information issues, where they intersect, and where they are headed.
Let this blog be your compass...