Wednesday, January 19, 2005
For each class of resources identified, we can create a set of documents (messages) that go into and come out of the resources to perform business functions. We then further analyze the messages in terms of their overall effect on resources. Some messages will simply inquire about the state of a resource (a "get outstanding invoices" message to a customer resource, for example). Others will update the resource (an "add new machine" message to an assembly line resource, for example). Still others will delete the resource (a "clear all" message to a shopping basket resource, for example). We then map these message classes onto an application protocol - HTTP. Essentially, this gives us standard plumbing - a standard interface - supported by all resources for the exchange of messages. This encapsulation of complexity behind a simple message exchange pattern, layered on top of a standard application protocol, makes it easy for developers to use business resources in their applications. Moreover, it allows the developers of the business resources to add extra message types as business needs change. By following some simple design rules, existing resources can be made forward compatible with this sort of change so that nothing breaks as the system evolves. Moreover, the loose coupling implied by the message exchange approach allows developers to use intermediating resources to gracefully "broker" complex modifications over time, minimizing system changes. Now doesn't that sound just great? I think so. Why is it a distortion of reality? Well, it is a distortion of reality because systems are not currently - by and large - architected this way. I suspect that will change though. Somewhere in the terminological soup of web services, REST, SOA, SOI, SOBA, ESB, MOM and Indigo, a new conceptualization of "business object" is taking shape, and it will incorporate at least some of the ideas presented here.
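As a rough sketch of the idea (my own toy illustration, not code from the post - the resource paths and message names are hypothetical), the three message classes map naturally onto HTTP's uniform interface:

```python
# Toy mapping of the three message classes onto HTTP methods.
# Inquiries read state, updates change it, deletions remove it.
MESSAGE_CLASSES = {
    "inquire": "GET",     # e.g. "get outstanding invoices" from a customer resource
    "update":  "POST",    # e.g. "add new machine" to an assembly line resource
    "delete":  "DELETE",  # e.g. "clear all" on a shopping basket resource
}

def to_http(message_class, resource_path):
    """Return the (method, path) pair for a business message."""
    return MESSAGE_CLASSES[message_class], resource_path

print(to_http("inquire", "/customers/42/invoices"))
# ('GET', '/customers/42/invoices')
```

The point of the standard plumbing is that every resource answers the same small set of methods; a new business resource only has to decide which of its messages are inquiries, updates, or deletions.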
Sunday, January 16, 2005
Unfortunately, many realistic systems and their models exhibit behaviours that lead to state spaces too large and too complex to deal with using any of the methods above. And although we often want to believe otherwise, most of the real-world examples to which we want to apply our techniques are well out of reach. A typical example is a new railway safety control philosophy developed in the Netherlands, allowing more efficient and reliable railway transport. One of the reasons that it will not be taken into service is that nobody can convincingly certify its correct operation. Until this is possible, it seems that the Dutch railway companies will stick to the proven, simple control systems to which they are accustomed, denying customers the benefits of the new technology. In those cases where automatic methods are insufficiently helpful, manual manipulation can save the day. Typically, protocols and algorithms deal with unspecified or unbounded data, involve an unbounded number of communicating partners, and are often quite complex. To prove the correctness of such systems, human ingenuity and intuition are indispensable. The ability to include the human intellect in the verification effort will be a distinguishing success factor for many decades to come.
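To see why state spaces get out of hand so quickly, here is a toy illustration of my own (not from the text): even a system of independent on/off components has exponentially many global states, and real protocols add data values and unbounded channels on top of that.

```python
# Exhaustively explore the global states of n independent on/off
# components, starting from all-off, where each transition flips one
# component. The reachable set has 2**n states - exponential in n.
def reachable_states(n_components):
    start = (0,) * n_components
    seen = {start}
    frontier = [start]
    while frontier:
        nxt = []
        for state in frontier:
            for i in range(n_components):
                succ = state[:i] + (1 - state[i],) + state[i + 1:]
                if succ not in seen:
                    seen.add(succ)
                    nxt.append(succ)
        frontier = nxt
    return len(seen)

for n in (4, 8, 16):
    print(n, reachable_states(n))  # 16, 256, 65536
```

A model checker enumerating states like this hits memory limits long before it reaches the component counts of a real railway interlocking, which is why abstraction and human insight have to carry part of the proof.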
Saturday, January 15, 2005
However, we need to keep taking care that we do not consider The Model to be The Truth. The web-based internet is a massive organic process, similar to Nature, and we can develop models to observe its phenomena. We can use these models to build our tools on, but we have to keep in mind that we cannot use a model to force the organic process to behave the way we want it to. Whether we use the REST model, or some other model yet to be developed that appears to match it more closely or from a different perspective, "the web" and other large-scale distributed systems will continue to do "their thing", whatever model we put on them. The distributed, decentralized, bottom-up, autonomous nature of the web exhibits complex organic interactions that are not driven by models or laws, just as Nature is not driven by the laws of Physics. We must learn from Physics that models are imperfect and only an approximation of something that is much larger, and more complex, than we can imagine. Models can be improved or replaced by others, and competing models can exist at the same time. But in the end they are just models. They help us understand Nature, but they are not Nature itself.
Thursday, January 13, 2005
I have made a crude estimate comparing the number of possible configurations in a given building design problem.
..The number of configurations both in the good pile and in the all pile is immense -- immense beyond imagining. There is therefore no shortage of good solutions to any given problem. But it is the ratio of the two numbers which staggers the imagination. The ratio between the two numbers is, in rough terms, about 10^12000. Furthermore, although there are huge numbers of possible good configurations, these good ones are sparsely scattered throughout configuration space.
..In general we may characterize this task, as a task of walking through configuration space, until we reach good results. The assumption is that there are (indeed, there must be) some kinds of paths through configuration space which can get a system to the good places.
..In Book 2, I have shown how living structure arises when reached by a series of movements in configuration space which are "structure-preserving" paths.
..These are real explanations, which have practical effects in real practical buildings. And what it amounts to, in informal language, is that the transformations represent a coded and precise way in which aesthetics - the impulse towards beauty - plays a decisive role in the co-adaptation of complex systems.
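The walk through configuration space can be made concrete with a toy sketch (mine, not Alexander's - the bit-string encoding and fitness function are hypothetical stand-ins for a real design space): good configurations are vanishingly rare under blind sampling, yet a path that only takes improving steps reaches one quickly.

```python
import random

N = 40  # a space of 2**40 configurations; "good" means all bits set

def fitness(config):
    """How close this configuration is to the good one."""
    return sum(config)

def guided_walk(config):
    """Greedy walk: try random single-bit flips, keep only improvements."""
    steps = 0
    while fitness(config) < N:
        i = random.randrange(N)
        flipped = config[:i] + [1 - config[i]] + config[i + 1:]
        if fitness(flipped) > fitness(config):
            config = flipped
        steps += 1
    return config, steps

random.seed(0)
start = [random.randint(0, 1) for _ in range(N)]
goal, steps = guided_walk(start)
print(f"reached the good configuration in {steps} random flips")
```

Random sampling would need on the order of 2^40 draws to stumble on the good configuration; the guided walk needs only a few hundred flips, which is the sense in which "structure-preserving" paths make sparse good configurations reachable.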
Friday, January 07, 2005
If you do incremental design in an uncontrolled way, you are most likely to end up with a design that is a mess - to make incremental design work, you need something that makes the design converge toward order. In Is Design Dead I referred to these as enabling practices. For software design, I labeled testing, continuous integration, and refactoring as the key enabling practices that get software design to converge and avoid software entropy.
Monday, January 03, 2005
From working on systems in a half dozen domains I have come to the conclusion there are three simple patterns we should lean on for making documents and messages persistent.
- Tuple Spaces (think Ruple Forums) for in-progress, state-machine-like transactions and collaboration.
- Versioned document trees (think Subversion's file system) for long-lived, shared, document editing.
- Star Schema-based storage (think Sybase IQ, a simple, low-maintenance, scalable database technology) for read-only transaction history and analysis.
I deeply believe (it is my hypothesis) that these are the only patterns you need, and that the implementations going forward can become far simpler and more adaptable than all of today's cruft. As for "messaging" (the gerund, i.e. SOAP and its ilk), WS-xxx is a bandwagon approach toward more complexity. Messaging protocols are a means to an end (i.e. getting a message into one of these persistent "end point like" locations). All collaboration can take place via these "persistence" mechanisms. They are actually coordination mechanisms, and persistence is a by-product, since collaboration is frequently different-time and/or different-place, and even when computing is solo or collaboration is same-time and same-place, you often would like a history.
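To make the first pattern concrete, here is a minimal tuple-space sketch of my own (real tuple spaces in the Linda/JavaSpaces/Ruple tradition add blocking reads, leases, and transactions on top of this). Pattern fields of `None` act as wildcards, which is what lets uncoordinated parties collaborate through the space.

```python
class TupleSpace:
    """A bare-bones tuple space: write, non-destructive read, destructive take."""

    def __init__(self):
        self.tuples = []

    def write(self, tup):
        self.tuples.append(tup)

    def _match(self, pattern, tup):
        # A pattern matches if lengths agree and every non-None field is equal.
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup))

    def read(self, pattern):
        """Return the first matching tuple without removing it (or None)."""
        return next((t for t in self.tuples if self._match(pattern, t)), None)

    def take(self, pattern):
        """Remove and return the first matching tuple (or None)."""
        t = self.read(pattern)
        if t is not None:
            self.tuples.remove(t)
        return t

space = TupleSpace()
space.write(("order", 42, "pending"))
space.write(("order", 43, "shipped"))
print(space.take(("order", None, "pending")))  # ('order', 42, 'pending')
```

Note how persistence falls out as a by-product: a worker that `take`s the pending order and later `write`s it back with a new status is coordinating with its peers, and the history of tuples in the space is the transaction record.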