
Thursday, October 30, 2003

Blogging BOF at PDC 

The PDC was probably the most blogged event of all time. I went to the Blogger BOF and picked up a few ideas that took a while to sink in. One interesting comment was that the evolution of blogging has been driven not just by geeks, but by teenage girls pouring their innermost thoughts out onto the Web. These hundreds of thousands of experiments in collaborative communication, some of them of near-suicidal intensity, are helping figure out how collaboration on the web works. The result of these experiments has had a dramatic impact on how I think about applications. Even something as mundane as my latest intranet reporting system has been completely altered by this experience. The whole user interface of listing entries in reverse chronological order with links for comments, and controlling this view through a calendar control and a category and blogroll list, is a direct result of this experience. One of the challenges that still needs to be addressed, and the main focus of the BOF, was how to get this approach to scale past the current capabilities of weblog aggregators. My only additional thought is that, to do this, we need to move beyond lists and outlines to structures that can support a kind of extended and collective memory that we project out onto the internet and can then refer back to when we need to, using something like the next generation of Google.
See also:
Thoughts from the Blogger BOF
Future Vision - Tivo For Blogging
Extended Blog Conversation
PDC - Weblogging the Future of Conversational Software


Rule based workflow 

One of the problems that I had with the BPEL workflow specification standard being led by Microsoft and IBM at OASIS is that it takes a structured view of workflow. If you look at what is being done in the Open Source community, you see that many projects are taking a very different rule-based approach, based on pre- and post-conditions, that provides for a much more loosely defined, event-based approach to workflow. Often, things in the real world don't go as planned. This can make structured, step-by-step approaches very difficult to work with once things get off track. With a rule-based system, activities are activated as their preconditions are met and released as their output is verified. The actual steps may be performed in a different sequence each time the work is performed. It was interesting for me to see that MS Biztalk 2004 has integrated support for both BPEL and rule-based approaches. This is done by having a rule-based block that can be embedded in a BPEL-style workflow. Different parts of the workflow can be structured or rule-based depending on the requirements.
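
To make the pre/post-condition idea concrete, here is a minimal sketch in Python of a rule-based engine in which activities fire whenever their preconditions are satisfied rather than in a fixed sequence. The activity names and condition flags are hypothetical, purely for illustration; this is not how Biztalk's rule engine is actually implemented.

# A tiny rule-based workflow engine: each activity declares preconditions
# (facts that must be true) and postconditions (facts it asserts when done).
# Activities run in whatever order their preconditions happen to be met.

class Activity:
    def __init__(self, name, pre, post):
        self.name = name          # label for the activity
        self.pre = set(pre)       # facts required before it can fire
        self.post = set(post)     # facts asserted after it completes
        self.done = False

def run(activities, facts):
    """Fire activities as preconditions are met, until nothing can fire."""
    fired = True
    while fired:
        fired = False
        for act in activities:
            if not act.done and act.pre <= facts:
                print(f"firing: {act.name}")
                facts |= act.post
                act.done = True
                fired = True
    return facts

# Hypothetical order-handling workflow; note there is no fixed sequence.
workflow = [
    Activity("ship goods",    pre={"payment cleared", "stock reserved"},
                              post={"order shipped"}),
    Activity("reserve stock", pre={"order received"}, post={"stock reserved"}),
    Activity("clear payment", pre={"order received"}, post={"payment cleared"}),
]

run(workflow, {"order received"})

Note that "ship goods" is listed first but fires last, only once the other two activities have asserted its preconditions; the sequence emerges from the rules instead of being scripted.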


InfoPath 

One of the most interesting products at the PDC, in terms of the future direction and strategy of MS, is InfoPath. InfoPath is a new Office product (the first in quite a while) that will only be included in the Enterprise version of Office. Briefly, it is a product designed fresh, from the ground up, to create and display XML information as well-structured forms. It does this by generating a manifest which contains information on an integrated collection of files in industry-standard formats. It is particularly well designed to work with Web Services. This makes it a good candidate for integration with Biztalk and SharePoint for human-facing, workflow-enabled applications. Even more interesting than what it does is what it says about MS's approach to computing in the future. The first issue is their approach to XML standards. XForms is the industry standard for working with XML forms data, and InfoPath has been described by some as an XForms killer. Here are some quotes from an article that compares the two, XForms and Microsoft InfoPath:
The InfoPath application is focused on providing a superb visual environment, of similar quality to the rest of the Microsoft Office System suite, for creating and filling out forms. In contrast, the XForms specification is designed to encourage implementations not to focus exclusively on visual media, but rather to define only the intent behind various data-gathering controls. The XForms specification gives implementations wide latitude in choices relating to how a particular form control can be implemented, though new CSS properties can narrow down the specific appearance of a form control. . . . InfoPath is built upon an impressive list of standard technologies, including WXS, DOM, and XSLT. For web developers modifying existing InfoPath content, such a design can be of great assistance. Other design decisions in InfoPath, however, tend to reduce the ability to use InfoPath with non-Microsoft browsers, platforms, or servers. For example, any investment in designing InfoPath solutions can be difficult to recoup in the face of changing to a different set of tools, no matter how standards-compliant they are. . . . Both InfoPath and XForms are version 1.0 efforts, and both are likely to improve substantially in future revisions. For organizations that have already licensed Office System 2003, InfoPath will provide an excellent means to automate data collection tasks. For use on systems not running Office System 2003, including Mac and Linux desktops, phones, PDAs, and even some PCs, XForms remains a better path.
In many other areas MS is doing a very good job of implementing XML standards, but then uses those standards in a way that makes life very easy for the user while tending to tie them to an ever-increasing suite of MS products. When higher-level standards do not meet their requirements, MS has no problem creating its own solutions without waiting for the W3C or other industry standards approval.


DTS and Biztalk: the two sides of machine-facing workflow 

MS started with machine-facing, message-based systems like Biztalk and heads-down automated data crunching in SQL Server DTS, and is now branching out into workflow support that can be integrated into all of its products (details not filled in yet -- see below). It is interesting to see how some of these different kinds of systems interact and overlap. The session on Biztalk and DTS integration was done as a kind of spoof competition between DTS and Biztalk and the kinds of problems they can solve. Of course, the main focus was on how they can be used together. For example, there is going to be a Biztalk icon in the graphical DTS workflow painter which can be used to configure incoming and outgoing messages. Here Biztalk would handle the messaging tasks and DTS would handle the heavy data-processing transforms. There was also a discussion of how more human-facing components could be added to the mix through email notifications and GUI application generated messages.


Wednesday, October 29, 2003

Workflow and XML move to the center of MS architecture 

Workflow is increasingly being added to both human-facing and machine-facing systems. It is particularly interesting how they plan to break out the workflow components of Biztalk so that they can be integrated as a light-weight component in other applications, especially human-facing systems like Office and SharePoint. MS gave an interesting demo of a workflow, created in Visual Studio, that could be used to trigger Web Services from the action tab of an MS Word document to add data and allow document completion through drag-and-drop into form fields. The document could then trigger email messages and use SharePoint as a "smart" document repository. One of the questions this raises, however, is whether it is possible to live in a world where every component of your Office system isn't Microsoft. I raised this question and got the predictable answer (it depends). The other issue is that much of this technology is well in the future.


Microsoft makes dramatic enhancements to its XML support 

Microsoft has made the XML standard an integral part of virtually every aspect of its vast computing infrastructure. This includes everything from the lowest-level APIs of the Longhorn operating system, to the graphics system, to Office document formats, to database access, workflow, and integration strategies. But MS has increasingly gone its own way in how XML is processed. In its new XML support it has moved well beyond the DOM and SAX models that XML started with and that the rest of the world still uses. While the rest of the world is still trying to catch up to the XML pull parsing that MS uses to do SAX-like processing, MS has now officially deprecated DOM processing and is enhancing its XPathNavigator technology to include update streams that make awkward DOM processing unnecessary.

It is also putting a lot of effort into integrating XML with traditional table views of data. Reporting Services is a good example of how, through the increased use of XML in the background, users have to do less XML processing and can instead use GUI tools where the actual XML is hidden behind the scenes. The same is true of the new InfoPath Office support. I will try and talk more about this later. XML processing in all its tools is also much faster. One of the weak areas of MS XML processing has been its XSLT support. In the new Visual Studio the speed has been increased 400%. The editor has also been significantly enhanced and debugging has been added. It was all very impressive. The XQuery integration is also very strong: XQuery can be used both within application programs against in-memory documents and as a database query language.
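
For readers unfamiliar with the pull model: instead of building a whole DOM tree, or handing control over to a SAX parser's callbacks, the application pulls parse events one at a time and keeps only what it needs. Python's standard library offers an analogous streaming style via iterparse; this is just an illustration of the pull idea, not the .NET XmlReader API.

# Pull-style XML processing: the caller drives the parse loop and can
# discard elements as it goes, so memory stays flat on large documents.
import io
import xml.etree.ElementTree as ET

doc = io.BytesIO(b"""
<orders>
  <order id="1"><total>19.99</total></order>
  <order id="2"><total>5.00</total></order>
</orders>
""")

grand_total = 0.0
for event, elem in ET.iterparse(doc, events=("end",)):
    if elem.tag == "order":
        grand_total += float(elem.findtext("total"))
        elem.clear()  # release the subtree; no full DOM is retained

print(grand_total)  # 24.99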


The seventh level of geek 

Last night I went to the late-night BOF on XQuery and then, even later, the BOF on alternative languages. The XQuery BOF had a person from the W3C committee that created XQuery as well as one of the prominent book authors on the topic. From the W3C perspective, XQuery is as important to XML as the relational model was to the birth of SQL. XQuery is not just seen as a query language for XML but as a "grand unifying model of data" that includes all the XML standards, like XPath and XML Schema, but includes relational data structures as well. They were very proud of having developed a formal data model first and then used it as a basis for a coherent query language with well-specified semantics. The use of XQuery as an important integration tool was also emphasized. At the alternative languages BOF there was a lot of talk about small-language programming and the use of dynamic scripting languages. There were a lot of people from Mono and other open source projects there. There has been a lot of talk about how appropriate strongly typed systems like the Java and .NET virtual machines are for more dynamic and functional languages like Ruby, Python, Eiffel, etc. A lot of very inefficient casting operations seem to be required to go between the world of strongly typed and dynamic languages. There also seemed to be genuine MS interest in what can be done to bridge the gap.


Tuesday, October 28, 2003

Analysis Services is something I definitely want to start using 

While the changes in Analysis Services, a.k.a. OLAP, are not as dramatic as in other areas, the integration with Reporting Services and Web Services is going to make them an increasingly important part of an efficient data delivery system. In the past, data warehousing required a lot of maintenance and extra vendor products. Now it is a fully integrated part of the MS family of data processing components. It is going to dramatically increase the efficiency of queries that aggregate across a large database of detailed information. It also presents a very easy-to-use interface to data analysts, so that they are no longer limited to static reports generated in batch. They can use drag-and-drop from within an application or Excel spreadsheet to slice and drill down through their data without having to learn SQL or understand all the details of a complex data model. I had some issues concerning the security of alternative architectures, but found someone to answer my questions, except for a couple of things concerning XMLA, which allows thin-client (browser-only) access to Analysis Services using web services and XML.
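
To illustrate what slicing and drilling down mean in practice, here is a toy in-memory cube in Python. Real Analysis Services queries go through MDX or XMLA, which this does not attempt to show; the fact table and dimensions below are made up.

# Toy illustration of OLAP-style aggregation: group detailed rows by
# chosen dimensions (slice/drill) instead of rescanning for each report.
from collections import defaultdict

# Hypothetical fact table: (region, product, month, sales)
facts = [
    ("East", "widgets", "Jan", 120.0),
    ("East", "gadgets", "Jan",  80.0),
    ("West", "widgets", "Feb", 200.0),
    ("West", "widgets", "Jan",  50.0),
]

def rollup(rows, *dims):
    """Total sales over the given dimension columns (0=region, 1=product, 2=month)."""
    totals = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dims)
        totals[key] += row[3]
    return dict(totals)

print(rollup(facts, 0))      # by region: the drill path starts coarse
print(rollup(facts, 0, 2))   # by region and month: drill down one level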


Reporting Services will result in a dramatic productivity increase 

The most useful thing that I have seen so far is the new "Reporting Services". MS has introduced a server-based reporting system that will probably wipe out all the competition, including Crystal Reports. They have produced a server-based solution that allows you to develop reports in Visual Studio and deploy them to a web server. From the server they can be accessed via a straight URL or as a Web Service in a variety of formats including HTML, Word, Excel, and PDF. The report designer looked very powerful and generates a declarative XML report definition in "Report Definition Language" (RDL). Things that take me days to do in XSLT can now be done in less than an hour.
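
As a sketch of the URL-style access mentioned above: the report server renders a report according to query parameters, with rs:Format selecting the output, following the Reporting Services URL-access convention as I understand it. The server name and report path below are made up for illustration.

# Fetch a report rendering over plain HTTP; the format is chosen with a
# query parameter, so the same report serves HTML, Excel, or PDF clients.
import urllib.request
import urllib.parse

SERVER = "http://reports.example.com/ReportServer"   # hypothetical server
REPORT = "/Sales/MonthlySummary"                     # hypothetical report path

def fetch_report(fmt="PDF"):
    # rs:Format selects the renderer; rs:Command=Render asks for output.
    params = urllib.parse.urlencode({"rs:Command": "Render", "rs:Format": fmt})
    url = f"{SERVER}?{urllib.parse.quote(REPORT)}&{params}"
    with urllib.request.urlopen(url) as resp:
        return resp.read()

pdf_bytes = fetch_report("PDF")
open("MonthlySummary.pdf", "wb").write(pdf_bytes)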


Monday, October 27, 2003

Keynote 

Microsoft seems to be integrating XML into its new operating system at a depth that is surprising even to me. The entire user interface, and the replacement for the OS's Win32 API, is being defined in a declarative XML language. XML is also being used to add metadata to all the objects in the file system to allow some really interesting capabilities. This is being described as extending the base schema of the WinFS file system, but it isn't clear to me yet whether this refers to an actual W3C schema file or is a general reference to XML metadata. The way that a user can interact with the file system has also dramatically changed. The view of the system can be "self-organized" based on queried/navigated metadata and automatically captured relationship information to create dynamically generated views. This provides a very rich search capability that can span many different media types in an integrated "graph view". People are also given identities within the system and act as proxies for the collection of metadata and relationship data. Much of this information can be generated and automatically captured through drag-and-drop of icons between dynamically generated views. This includes buddy lists, peer-to-peer information sharing, faxing, email, and the broadcasting of live screens captured from another user. For the end user, this means that they can very easily search and display keyword-like information that is either automatically or manually attached to files. For the developer, the file system can now be treated as a very rich metadata repository, where files can be manipulated with much of the power that was only possible in a database.
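
Here is a toy sketch of that last idea, treating the file system as a queryable metadata store from which views are generated dynamically. Nothing here reflects the actual WinFS API; the paths and attributes are invented.

# Toy metadata store over files: attach key/value metadata to paths and
# build dynamic "views" by querying it, instead of walking fixed folders.
from collections import defaultdict

metadata = defaultdict(dict)   # path -> {attribute: value}

def tag(path, **attrs):
    metadata[path].update(attrs)

def view(**criteria):
    """Return paths whose metadata matches all given attribute values."""
    return [p for p, attrs in metadata.items()
            if all(attrs.get(k) == v for k, v in criteria.items())]

tag("trip.jpg",   kind="photo", person="Alice", project="PDC")
tag("notes.doc",  kind="doc",   person="Alice", project="PDC")
tag("budget.xls", kind="doc",   project="Q4")

print(view(person="Alice"))            # a dynamically generated "Alice" view
print(view(kind="doc", project="PDC")) # the same files, sliced another way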


XML as money 

Another interesting analogy that has come out of the "Mediator Evolution" paper is the role of money in economic systems.
This problem was solved by the evolution of a single reference resource that functions as a mediator between the different demands: money. The functional equivalent of money has evolved independently in different historical civilizations. ... Thus the "invisible hand" of the market is actually a distributed control system or manager emerging from the mediator that is money and the medium that is the market-place. This manager coordinates, regulates and initiates interactions between agents of the collective so as to -- theoretically -- maximize their synergy.
In many ways XML may play a very similar role, as the medium through which data processing interactions can be coordinated. As XML standards form, they result in an implicit control system that constrains how data processing tools and organizations evolve. The ultimate expression of this kind of control is something like Biztalk, which is an XML-based workflow control system for coordinating how an organization manages its internal processing of data as well as how it exchanges data with other organizations.


Sunday, October 26, 2003

MS XML Research 

Something Cool From Microsoft You Might Not See At The PDC


XML as a controlling medium 

One criticism of MS Word is that proprietary format changes are arbitrarily introduced to require customers to purchase expensive upgrades with little added value. Others complained of feature bloat, where new features were added that were of little value to the vast majority of users. The ultimate expression of this policy was an attempt to move to a subscription model where you were required to pay even if MS added no value to the product. The counter argument is that while each user uses only a small set of features, each user uses a different set of features, resulting in the value of a product that can be used by everyone to do anything. In the new release of Word, there don't seem to be any new user features, but there seems to be a sea change in the possibilities now open to developers using MS Word as a new kind of application client. Previously, this kind of integration was not possible, because of the need to keep free riders from copying MS technology and stealing the market. Now that it has established a stable monopoly, it can provide open formats to allow a new level of integration without worrying about competition. Here XML provides the medium of interaction and integration. This relates to the idea of aspect systems, which, if they use different resources, can co-evolve without competition. Aspects represent a functional invariance across subsystems. An aspect system that represents the possible interactions between the agents is a medium. Aspects that represent the aggregate properties of a system are referred to as the collective. The original COM-based architecture of MS constrains the collective of developers to be locked into MS products. The medium of XML becomes a conflict mediator between different types of developers, including Java, Sun, IBM, Oracle, etc. MS has tried to compensate for this by introducing its own developer tools for working with MS Office.


First Day At PDC 

Last night I read an interesting article by Francis Heylighen, "Mediator Evolution: a general scenario for the origin of dynamic hierarchies", and I thought it might be kind of interesting to apply some of his insights to what is going on at the PDC. Dynamic hierarchies are made out of components that "fit" together. This gets related to attractor dynamics. In general, finding these stable structures is a very difficult search problem, based on random interactions.
However, given the exponentially decreasing probability of encounter, even very small assemblies (more than three or so components) would seem to be highly unlikely. This is confirmed by the observation that practically all known chemical or physical reactions between particles, when analyzed at the most elementary level, involve the encounter of no more than two components.
Closure is related to the degree of freedom for properties. The main discussion centers around resource competition as it relates to asymmetries in conflict and cooperation. While the competition for globally accessible resources creates a zero-sum game of conflict, providing access to new or previously unavailable resources can create an opportunity for cooperation. Therefore, you want someone with very similar goals and values but who consumes very different resources. This creates a selective pressure for differentiation. Asymmetric synergy creates cultivator/crop relationships. One important problem is the introduction of "free riders", who take but give nothing back.
The best person to collaborate with is one that supports you in achieving your objectives by bringing in resources such as talent, experience and connections that you don't have access to.


Tuesday, October 21, 2003

A new kind of software development 

Systems Software Research is Irrelevant - or is it?
If we take a page from what the biologists are doing, we may see a way for research in the software arena to go forward. The non-profit/charity/foundation model could be used to separate research from market cap. The only foundation that is doing this today is OSAF -- depending on how you look at Chandler, it is as much a research project as a PIM application. Surely there must be some people who still have .com money left and want to do something good for the world.
Progress, not profit: Nonprofit biotech research groups grow in size, influence


Friday, October 17, 2003

Library of Congress Luminary Lectures talk on Topic Maps 

Michel Biezunski and Steven R. Newcomb (Coolheads Consulting) gave a fascinating talk for the @your library Luminary Lectures series: Topic Maps: The Inventor's Perspective on Subject-based Access. Topic Maps are designed for the automated semantic merging of information (metadata) from multiple sources into a single coherent navigational map, while maintaining the integrity of the source material. There are two forms of merging: name-based and anchor (URN/URL) based. The example shown was the merging of IRS tax document indexes to provide an integrated and cross-referenced call center response system. There also seemed to be a lot of interest in applying these principles to a broader merging of government information for the E-Gov initiative. Other applications included cross-disciplinary science, homeland security, and credit card identity theft (prevention). The discussion ranged from practical considerations, like using document headers and client email vocabulary to identify subject topics, to deep theoretical principles of identity and structure and how they relate to meaning. One of the interesting things about topic maps is that instead of first establishing a static identity and then adding characteristics, topic maps see identity as an emergent property determined by the accumulation of topic characteristics. These topics are merged based on a system of subject identity proxies. To support this capability, a graph model of linked topics is used where, ideally, every subject has only one proxy. This graph provides a "map of the territory" that is carefully kept distinct from the "territory to be mapped" (the data sources). Another interesting hook was that "to control information you need more information (metadata)". For me the highlight was at the lunch afterwards, where Steve talked about the problems of reification in RDF, as well as a debate about whether the basis of semantic meaning is in behavior or in structure and identity. Even more productive was probably the walk back with Bryan, discussing how a Service Based Architecture could implement Topic Maps using logic programming. One unoriginal possibility that occurred to me later was a link between Topic Maps and the structure-preserving maps in the category theory of mathematics. Something I know very little about.
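
A rough sketch of the merging rule described above: topic records collapse into one topic when they share a subject identifier (URN/URL-based merging) or, more loosely, a name (name-based merging). This illustrates the principle only, not the topic map standard's full merge semantics, and the tax-index records are invented.

# Merge topic records from multiple sources into one map: records that
# share a subject identifier (or, failing that, a name) become one topic.
def merge_topics(sources):
    topics = []                 # each topic: {"ids": set, "names": set}
    by_id, by_name = {}, {}

    for record in sources:
        ids, names = set(record["ids"]), set(record["names"])
        # find an existing subject identity proxy for the same subject
        found = next((by_id[i] for i in ids if i in by_id), None)
        if found is None:
            found = next((by_name[n] for n in names if n in by_name), None)
        if found is None:
            found = {"ids": set(), "names": set()}
            topics.append(found)
        found["ids"] |= ids
        found["names"] |= names
        for i in found["ids"]:
            by_id[i] = found
        for n in found["names"]:
            by_name[n] = found
    return topics

# Hypothetical records from an IRS-style index merge
irs_pub  = {"ids": {"urn:tax:form1040"}, "names": {"Form 1040"}}
call_ctr = {"ids": {"urn:tax:form1040"}, "names": {"1040", "Individual Return"}}
print(merge_topics([irs_pub, call_ctr]))
# one topic with ids={'urn:tax:form1040'} and all three names merged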


Thursday, October 16, 2003

The next GUI 

[DonXML] Avalon Versus SVG-RCC: Another Microsoft vs. World horserace
So the way I see it, Avalon and SVG-RCC are going to go head to head for the next generation GUI architecture. We are still way early in the development life cycle and things can change, but no matter what it should be a very interesting adventure. If you thought HTML really changed the way we develop apps, and think web services are cool, just wait until you get a hold of this new stuff. BTW, the XUL guys have a cool technology, but it misses the mark. It can't handle custom content, and it doesn't render to a vector graphics layer so you can't abstract out the different platform graphics APIs.


Transforming History 

A book by William Irwin Thompson. Here is a quote from page 1:
Like an image before us in the rearview mirror of a car, the picture of where we have been keeps changing as we move forward in space and time. The narrative of the past from even so short a time ago as the beginning of the twentieth century now no longer describe us, and thus each generation must reinvent the past to make it correspond to its sense of the present. In much the same way, the twentieth-century futurism was little more than a not very imaginative managerial description of the implications of its present.
page 4
In the Kyoto School of Nishitani, the East reconceptualized the West to show how the ultimate development of materialism led to nihilism. But it takes no mirror made in Japan to make us see that about ourselves, for we need only turn the pages of a history of Western painting to see the full story.
page 7
Both see the mental habit of the West as one in which Being is posited as a being and called "God"; in which process is arrested in substance and called "material reality"; and in which Mind is made into an organism without an environment and called "the self". For both Bateson and Varela, all three of these cultural activities are part of the same process of reification that isolates God from nature, mind from matter, and organism from the environment; and each ends up giving us a system of abstraction that we mistake for reality, to the destruction of both culture and nature.
Looks to be interesting -- only 190 pages to go.


Sunday, October 12, 2003

Topic Maps and Integration 

In light of some new integration projects I may be involved in, and some talks I plan to attend at the Library of Congress [The Inventor's Perspective on Subject-based Access], I thought it might be useful to try and distill my current thoughts on Topic Maps. In a traditional integration/business intelligence project, data moves from raw data, to locally structured databases, to globally structured data warehouses, to OLAP dimensional models, to spreadsheet-like analysis. Lately, I have been wondering what role Topic Maps might play in this process. The TM sweet spot seems to be the merging of heterogeneous metadata (building a merged index from different sources). This also seems to be one of the critical problems with current systems. Merging different data sources into a data warehouse is one of the most difficult and costly parts of this process. Semantic merging has turned out to be much more difficult than the syntactic merging of relational join operations. However, semantics is usually defined as the grounding of symbols in the meaning expressed through use and behavior. For me, one of the ironies of Topic Maps is that while they are focused on semantics, they have no direct support for any kind of behavior. Instead, Topic Maps use a well-thought-out graph representation, grounded on a system of atomic identifiers. These grounding identities provide a static basis for the automated merging of metadata. In real systems, however, the merging process is much more subjective and labor intensive. The same kinds of information get encoded in many different forms in many different systems. This requires a complex set of transformations to merge these perspectives into a common view for analysis. The transformations (behavior) attached to each of these different data sources must respect the semantics of each source, and it is the development of these transformations that is the heavy lifting of integration. What seems to be needed is a dynamic system for describing and supporting the transformations that might guide manual semantic merging. Maybe a kind of XQuery for knowledge graphs that respects the constraints of topic map structure and identity. I understand that work on some kind of TMQL is well underway, and I'll be interested to hear what Michel Biezunski and Steven R. Newcomb have to say.
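
As a tiny illustration of that "heavy lifting": here is a sketch of per-source transformations that normalize the same information from two hypothetical systems into one common view. Real integrations involve many such mappings per source, each of which has to respect the local semantics of its system; the field names and encodings below are invented.

# Each source gets its own transformation that respects its local encoding;
# the common view is produced only after every source has been normalized.
def from_hr(rec):    # hypothetical HR system: codes and full names
    return {"name": rec["FullName"],
            "gender": {"M": "male", "F": "female"}[rec["Sex"]]}

def from_crm(rec):   # hypothetical CRM: different field names and spellings
    return {"name": rec["contact"], "gender": rec["gender"].lower()}

hr_rows  = [{"FullName": "Ada Lovelace", "Sex": "F"}]
crm_rows = [{"contact": "Alan Turing", "gender": "Male"}]

common_view = [from_hr(r) for r in hr_rows] + [from_crm(r) for r in crm_rows]
print(common_view)   # one consistent encoding, ready for analysis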


Thursday, October 09, 2003

XML Pipelines 

W3C XML Pipeline Definition Language
Apache Jelly: Executable XML
Sun XML Pipeline Definition Language Controller Implementation


The Role of XML Databases 

Phil Howard, Bloor Research: The demise of the XML database
Mike Champion: Reports of the "Demise of the XML Database" are dubious
Kimbro Staken: Reports of the "Demise of the XML Database" are dubious
Why use XML?


Wednesday, October 08, 2003

Interesting discussion on content negotiation 

Should you use Content Negotiation in your Web Services?
I currently use query parameters, i.e. &format=xml or &format=excel. I also try to set a consistent MIME type and file extension. If I'm downloading a file, i.e. ContentType = "text/xml" and AppendHeader("Content-Disposition", "inline;filename=MyFile.xml"), and it is being dynamically generated, the actual URL is MyFilename.aspx (.jsp). However, Windows applications seem to use the file extension if they directly open a file, so the MIME type gets ignored. For example, if I create a report in rich text format with a ".rtf" file extension, but give it a MIME type of "application/vnd.ms-excel" instead of "application/rtf", the open/save dialog will tell me the content type is Excel, but it will then open in MS Word. Guy
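
Here is a minimal sketch of the same pattern using Python's standard library rather than ASP.NET: a format query parameter picks the representation, and the Content-Type and Content-Disposition headers are set consistently with it. The endpoint, port, and filenames are made up for illustration.

# Choose the representation from a ?format= query parameter and set the
# MIME type and suggested filename to match, as described above.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

FORMATS = {
    "xml":   ("text/xml",                 "MyFile.xml"),
    "excel": ("application/vnd.ms-excel", "MyFile.xls"),
}

class ReportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        qs = parse_qs(urlparse(self.path).query)
        fmt = qs.get("format", ["xml"])[0]
        mime, filename = FORMATS.get(fmt, FORMATS["xml"])
        body = b"<report/>"   # stand-in for the real report payload
        self.send_response(200)
        self.send_header("Content-Type", mime)
        self.send_header("Content-Disposition", f"inline;filename={filename}")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ReportHandler).serve_forever()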


Tuesday, October 07, 2003

The drum beat for Office 2003 has started 

Special Report on Office 2003
XML capabilities in Word, Excel, and InfoPath help bridge the gap between desktop documents and databases, and give enterprises a reason to upgrade
Office 2003 perspectives
Another juicy corporate morsel: Office documents can now incorporate what's called XML code (extensible markup language). That statement may mean nothing to you, but it will give corporate geeks dilated pupils and sweaty palms. XML can tie together specially defined areas in ordinary Word and Excel documents with big, humming corporate databases. Fill in the blanks of the company expense report, and the company's SQL database can inhale and process it automatically.


Why XQuery? 

XQuery Implementation
Another reason for the low publicity level of XQuery is that so far there has been little evidence that XML data storage can become the ubiquitous technology that will unseat relational data storage. Although XML databases exhibit a lot of valuable and exotic features, they can be closely compared to their object oriented counterparts in terms of market penetration. Both seem to be very convenient for solving a set of specialized problems and both seem to do well with small and mid-size systems. However, as the complexity of the storage problem increases, neither XML nor object oriented stores appears to be able to scale as well as the relational solutions. Actually, doing as well as the relational solutions will probably not cut the mustard either. Only a quantum leap in technology would move the mountain of investment away from the RDBMS legacy. ... XQuery will find its spot under the sun in a land little anticipated by its authors. We are witnessing the era of inter-company software integration. ... The logical question becomes, then, what the best way to bridge two XML interfaces is? We need a language that is convenient for writing business logic and can natively manipulate XML data. This is where XQuery shines. It is a very expressive functional language with a simple, familiar syntax and an organic connection to XML data structures.


Monday, October 06, 2003

Class Object and Data 

Many people have observed a shift from the predominance of the "class" construct in object-oriented programming to a new emphasis on data that is evident in the increasing use of XML and service-oriented approaches to programming. With the development of many more XML standards, such as Schema/Relax NG/RDF, there has been a return to a more constrained type system for controlling and validating data. This swing back and forth between the abstraction of classes and the concreteness of data leaves the concept of an "object" as a good balance point on which to build a new model of programming. Instead of objects being created top-down from the template of a class, classes can be defined bottom-up, from a growing collection of objects and transformations that define the dynamic evolution of a class. Instead of a static definition of a class, there is a network of transformations between objects that defines their dynamic evolution. It is the "value" of these transformations that determines how their structure will change over time. Objects integrate syntax with semantics (value or meaning) as well as the creation of value over time (learning). Lev Goldfarb has developed a similar set of ideas in an attempt to define a new kind of mathematics that is linked to inductive learning. He refers to this model as an "Evolving Transformation System" or ETS.
The ETS model ... "is the first, and so far the only, formal general framework for modeling an evolving form of object and class representations. As a radically new formalism, in contrast to the classical formalisms -- which do not make any explicit assumptions about the structure of objects in the Universe -- the ETS model explicitly postulates the generative structure of objects and classes."

