Saturday, November 22, 2003
Severe and well acknowledged problems of Workflow Management Systems stem from their rigorous and formal nature. Implementations of workflow tend to be coercive, isolationistic, and inflexible; whereas the natural interaction of people frequently incorporates flexibility, opportunistic behavior, social awareness, and compromise . . . . The application of contemporary workflow management systems is not always able to cope with ill-defined and unstructured environments. In practice, workflow technology often lacks flexibility, because it is trapped in a control flow paradigm. Workflows should not be driven by pre-specified control-flows but should be data- or information driven. -- Ijme Schilstra, BPM'03, Eindhoven

See also: this Previous Entry
Wednesday, November 19, 2003
‘Now’ is never just a moment. The Long Now is the recognition that the precise moment you’re in grows out of the past and is a seed for the future. The longer your sense of Now, the more past and future it includes. It’s ironic that, at a time when humankind is at a peak of its technical powers, able to create huge global changes that will echo down the centuries, most of our social systems seem geared to increasingly short nows. ... Since this act of imagination concerns our relationship to time, a Millennium is a good moment to articulate it. Can we grasp this sense of ourselves as existing in time, part of the beautiful continuum of life? Can we become inspired by the prospect of contributing to the future? Can we shame ourselves into thinking that we really do owe those who follow us some sort of consideration – just as the people of the nineteenth century shamed themselves out of slavery? Can we extend our empathy to the lives beyond ours? ... And what is possible in art becomes thinkable in life. We become our new selves first in simulacrum, through style and fashion and art, our deliberate immersions in virtual worlds. Through them we sense what it would be like to be another kind of person with other kinds of values. We rehearse new feelings and sensitivities. We imagine other ways of thinking about our world and its future.
Yesterday's post on Longhorn got a fair number of reactions. Most of the folks who left comments were indignant Macintosh users. I agree that today Mac OS X is probably a superior operating system to Windows XP. That's one of the reasons that I asked for a Mac OS X machine as my official OSAF box. I wasn't talking about today. I was talking about tomorrow. It's not about Mac vs Longhorn or Linux vs Longhorn.
I have to say that I am impressed by the vision for Longhorn. It's not going to get us the Knowledge Navigator (sorry, Scoble). I'm impressed with Microsoft's willingness to make such a risky play. Rewriting a huge amount of system functionality with new APIs in managed code is fairly risky. But if they succeed, they are going to end up with an environment that will be pretty nice to program in, and there'll be some cool features in there. Once they get everything into managed code, people working in predominantly unmanaged environments are going to be hard pressed to keep up.

To me the real question isn't about Microsoft and Longhorn, it's about the alternative platforms, Linux and the Macintosh. The Macintosh is tough because Apple is basically saying "hey, just trust us to keep doing cool stuff". And they are doing cool stuff; there's a lot of nice stuff in Mac OS X. But let's be honest, most of this stuff is just NextStep dressed up a little bit nicer. We still have C/Objective-C/C++ at the core. We need more than that. Linux is even worse off. Now I love Linux, but when I compare the Longhorn story with the Linux story, I get scared. Look at things like this. Operating system kernels are commodity software. The interesting stuff is moving up the food chain.
Friday, November 14, 2003
But DRM also gives Microsoft added power in the computer and electronics industries, especially with the code portability Andy mentioned. Remember, IL ultimately makes .NET and Windows hardware independent, decreasing Microsoft's dependence on Intel and increasing its power over Intel -- the power to give and to take away. There are instances where Microsoft might want to move away from Intel. Redmond has not done a very good job of putting its software on large-scale servers, for example, largely because its hardware partner doesn't scale well. We're seeing Intel-based servers now with up to eight CPUs, but that's about it: Above eight the increased overhead means it isn't worthwhile, so we do clustering, instead. But now Microsoft is flirting with IBM precisely because IBM's Power architecture scales beautifully. If Microsoft wants to grab one of the last pools of profit it doesn't currently own -- high end corporate computing -- putting .NET on IBM's Power and PowerPC architectures is key.
It works at the bottom of the market, too, where IL's portability means Microsoft can drop old platforms and move to new ones primarily on the basis of compliant partner behavior. Imagine a game machine or a set-top box that didn't embrace Microsoft's DRM scheme. Well, it would be perfectly in keeping with Microsoft behavior to just drop those hardware platforms in favor of others that were more cooperative. Either buy Redmond's DRM technology (ultimately giving Microsoft the transaction business, remember) or Microsoft might ignore your platform into oblivion.
But what happens when Microsoft has all the computers, all the video games, all the set-top boxes, all the PDAs, all the mobile phones, when it has conquered the transaction business and holds all the money? That leaves only one more industry I can think of for Microsoft to enter that is profitable enough -- pharmaceuticals. Would you buy drugs from Bill Gates?
XUL is based on a set of Mozilla specific foundation objects. Java applets are based on a set of Java specific foundation objects. Flash programs are based on a set of Flash specific foundation objects. SVG programs are based on a set of SVG specific foundation objects. I just don't see how XAML threatens the "standards" of the web just because it is based on its own specific set of foundation objects. Posted by Invalid Assertion
My not-too-hidden agenda here is simple - dynamic applications should be dynamic on the client. The server should send data - either through web services, database access, or any other wire protocol - and the client should consume that data and generate UI. The model of a server trying to generate the correct display for a client is just broken. -- Chris Anderson (Avalon Core Developer)
XAML is a markup language that can be used on Longhorn for many things including creating desktop apps, web pages, and printable documents. -- Rob Relyea (Avalon Program Manager, XAML "Spec Owner")
What does this mean? It means that you could use XAML as the backing format for an interactive design tool that lets you design programming logic graphically. Or maybe it lets you design complex business orchestration scenarios. Or whatever. The designs could then be serialized to XAML and then compiled into CLR assemblies. X# anyone? Posted by Invalid Assertion
Yes, you can always ship your application/framework with a runtime compiler instead of a runtime interpreter and make it all the same. The question is does it make sense to do it. Downloading JScript as a source file and then interpreting it in the browser at runtime is a monstrosity. On the other hand, compiling markup that does nothing else than specify the layout and the appearance of your UI is a monstrosity too. You should use the right tool for the job. Procedural code should always be statically compiled, and it should be possible to interpret declarative markup at runtime.
I have not made myself very familiar with the new XPathDocument class yet, but I think this class will allow writing XML directly into a document instead of using the DOM methods. It should be the same with XAML. It should also be possible to navigate the Element tree using XPath. For example: Button b = dockPanel.Item("xpath_expression") This, together with the DHTML-style editing, would be the killer feature of XAML. Posted by George Mladenov
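The XPath-over-element-tree idea in the comment above can be sketched with Python's standard library, which supports a limited XPath subset over an element tree; the `DockPanel`/`Button` markup here is purely illustrative, and this is an analogy for the proposed `dockPanel.Item("xpath_expression")` call, not the real Avalon API.

```python
# Sketch: look up a widget in a markup-defined element tree with an
# XPath-style query (hypothetical markup; not the real XAML/Avalon API).
import xml.etree.ElementTree as ET

dock_panel = ET.fromstring(
    "<DockPanel>"
    "<Button Name='ok'>OK</Button>"
    "<Button Name='cancel'>Cancel</Button>"
    "</DockPanel>"
)

# ElementTree supports a restricted XPath subset, including attribute
# predicates: find the Button whose Name attribute is 'cancel'.
cancel = dock_panel.find(".//Button[@Name='cancel']")
```

The appeal in the comment is exactly this: addressing a node in the UI tree by a query expression instead of walking the tree by hand.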
Wednesday, November 12, 2003
It is nice to see Rob Relyea responding to my Avalon question. And to see Mike Deem back in the blogosphere and responding respectfully to Joe Gregorio's rant. And John Lam categorizing the Ant community's reaction to MSBuild.
What I believe we are seeing is domain experts seeking each other out. Crossing organizational and philosophical boundaries.
Tuesday, November 11, 2003
Sunday, November 09, 2003
- Don't do it if you're not prepared, as an organization, to speak the truth. Blogging promotes, and to some extent requires, a culture of candor.
- Start small.
- You need an organizational leader to set an example.
- Set up the infrastructure, buy licenses, etc. early, and make it easy for people to get started. I bought Radio licenses for everyone. You could choose Movable Type as well, but I think Radio has some definite advantages: (a) it includes hosting outside the organization and (b) it brings a sense of user ownership since it lives on the desktop.
- Set a few guidelines (to show you've thought about them) but don't set too many since they will stifle people's creativity.
- Create an aggregator that reads the RSS feeds from the various blogs and presents them for people who don't use blogs and aggregators. Point at it and reference it whenever you can to drive traffic to the information in the blogs.
- Enable comments to encourage participation through feedback and interaction from those who don't write blogs.
- Be prepared for some people to be very threatened and offended when you speak the truth. Be proactive in preparing the people they'll complain to so that they understand what you're doing and what the goals are.
- Pick out two or three people who like to write and give them special encouragement to get their blogs going. Meet with them often and form a "support group" of sorts to get things going.
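The aggregator suggested in the list above boils down to pulling item titles out of each blog's RSS feed and re-presenting them on one page. A minimal sketch of the feed-parsing half, using a made-up sample feed (a real aggregator would fetch many feeds over HTTP and merge them):

```python
# Minimal sketch of the aggregator idea: extract the <title> of every
# <item> in an RSS 2.0 document so the entries can be listed on one page.
import xml.etree.ElementTree as ET

def item_titles(rss_xml: str) -> list:
    """Return the title of every item in an RSS 2.0 feed document."""
    root = ET.fromstring(rss_xml)
    return [t.text for t in root.findall(".//item/title")]

# Hypothetical sample feed for illustration.
sample = """<rss version="2.0"><channel>
  <title>Team Blog</title>
  <item><title>First post</title></item>
  <item><title>Second post</title></item>
</channel></rss>"""
```

For example, `item_titles(sample)` yields the two post titles, ready to be rendered for readers who don't use an aggregator themselves.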
Thursday, November 06, 2003
At the PDC, I was in charge of learning what Indigo was about, so I went to all the Indigo sessions. Most of Don Box's sessions were more about philosophy (justifying MS's shift to the SOA model) than about technology (just have a look at his slides to see what I mean), but quite interesting anyway. Other sessions by Doug Purdy and Steve Swartz were more technical and more useful for understanding Indigo's architecture.
The first thing to say about Indigo is that it does not contain revolutionary technology. It just takes the best of ASMX, WSE, Remoting, Messaging and Enterprise Services to create a new framework for building enterprise applications based on a Service Oriented Architecture model.
However, the great value of Indigo is that it integrates all those diverse technologies in a coherent way, by providing a common communication stack, common security features, transaction management, and extension mechanisms.
What if the JRE shipped with XUL support built in, eh? Or if there was an XML syntax to create SWT components. The whole Java UI design methodology harkens back to X toolkit coding, programmatically creating widgets and layout controllers. The closest we have is Jelly.

You might wanna check out the Luxor XUL Toolkit @ http://luxor-xul.sourceforge.net for creating Swing or SWT widgets using XML. For more alternative XUL motors/browsers/runtimes in Java check out the XUL Alliance @ http://xul.sourceforge.net
I am also going to observe that it is a shame that Sun haven't got a good Java alternative.

Well, Sun hates XUL because evidently it competes head-on with the upcoming Java Server Follies (JSF) package. See the Viva site for more insight @ http://viva.sourceforge.net/action.html Also note that Sun has a XUL alternative in vapourware state called Java Desktop Network Components (JDNC). See the XUL News Wire story for more insight @ http://article.gmane.org/gmane.comp.lang.xul.announce/29
Wednesday, November 05, 2003
WhiteHorse: The most fascinating new feature I have seen from a deployment perspective is WhiteHorse. It is not a build program like MSBuild; more importantly, it is a modeling framework that uses policies and GUI representations of the logical and physical layers of your enterprise, and it can enforce those policies on a developer's box. Most enterprise deployments encounter severe bugs at deployment because of the developers' ignorance of their final environment. I have overcome these issues by completely understanding the deployment environment and then reproducing that environment in my development environment. That can manifest itself as many machines to gain physical separation (firewall issues and remoting realities), developing with developer tools on the actual server OS (such as Windows 2003), and enforcing authentication and authorization that represent the deployed environment. Well, I would like to have all of that on my portable laptop. Yes, my laptop would weigh 85 lbs and cost 10% of my yearly income, maybe 20% some years. I usually try to mitigate the cost of multiple machines using VMware, but that also demands many hard drives to get any kind of performance at all. You try to carry three hard drives on your back. If WhiteHorse can deliver, I will save a lot of money, not to mention get a nice pretty diagram that integrates into the IDE and helps me compel fellow developers to think about the platform they are developing for.
"Longhorn" (code name): The next major release of the Microsoft Windows operating system. This is slated for release around 2006 (possibly Q3/Q4 of 2005) if the beta testing goes well.
"Aero" (code name): Longhorn 3D-rendering user interface
"Avalon" (code name): Graphics presentation technologies in Windows "Longhorn" that provides a unified architecture for presenting user interface, documents and media in the system.
ClickOnce: Technology in Windows "Longhorn" designed to speed and simplify deployment of applications.
"Indigo" (code name): .NET communications technologies in Windows "Longhorn" designed to build and run connected systems.
"Palladium" (code name): Microsoft's Next Generating Secure Computing Base (NGSCB) secure-OS subsystem that will debut in Longhorn
SuperFetch: Technology in Windows "Longhorn" designed to help applications launch more quickly.
"Whidbey" (code name): Next generation of the Microsoft Visual Studio system of software-development tools.
"WinFS" (code name): Next-generation search and data storage system that provides a unified storage model for applications running on Windows "Longhorn."
WinFX: The programming model for Windows "Longhorn," used by software developers to build applications for the Windows platform. This is a superset of the current .NET programming model.
"Yukon" (code name): The next generation of Microsoft SQL Server database software - expected to ship second half of 2004.
So, why did I bother to write this? Good question. Sometimes I read these rants about how open source automatically obtains magical god-like quality. It's true in a twisted way: The ten open source projects that religious open source advocates will choose to point to probably have better quality than the average non-open-source product. My advice is to look at open source with an open but critical mind, the same way you should look at non-open-source products. Don't get caught up in the religion, because in the end, there's still poison in that kool-aid. Open source can benefit us all -- and I believe that it has benefited and continues to benefit us all. Let's not ruin it by attributing magical religious properties to it. Use it if it works for you. Contribute if you can and you want to. Report back bugs to help it improve. Ask for useful features. Be thankful for it being there. Just don't lie to yourself and others about attributes that open source does not intrinsically have. It still takes a strong vision, continued focus, great engineers, hard work and long-term dedication to make good software. There simply is no silver bullet.
Yes, the CLR and C# level the playing field so people who like case-sensitivity and curly-braces aren't bogged down in tedious detail, but VB is still a big deal and honestly has at least one killer feature that C# and C++ still lack. I'm reminded of the following verse which still has relevance today: My language, 'tis of thee, Sweet grammar of simplicity, Of thee I sing. Language where new things are tried, Language of Bill Gates' pride, From every mountainside, let VB ring.

More MS humor
Hey. I'm not a newbie or a script kiddie. And I don’t get paid less than other programmers, or only work on UIs, or write email viruses. And I don't know BillG, SteveB or EricR from Microsoft, although I'm certain they're really, really nice. I have a Handles clause, not a += expression. I speak VB and VBA, not C#. And I write it 'For Each,' not 'foreach.' I can proudly type names using any capitalization I like. I believe in line continuations, not curly braces; colons, not semicolons. And that ‘With’ is a truly proud and noble statement. CType is a cast, ReDim IS a statement. And it IS written ‘Mod’. Not ‘%’. ‘Mod’. BASIC is the oldest Microsoft language, the most popular programming tool in the world and the best part of Visual Studio. My name is Mort, and I AM A VB PROGRAMMER!
Tuesday, November 04, 2003
- There is a concept of Items and relations between Items
- An Item is something like a file, a folder, a contact, or a user defined type
- There are two types of relations: holding relationships and reference relationships
- With holding relationships you can model the well-known folder hierarchy, but also much more, because an item can be held in more than one relationship (although cycles in the "hold" graph are not permitted)
- An item has a reference count of its holding relationships. When it has no more references it is removed from the store.
- Security is on Item level (which could indicate that Yukon, presumably the storage engine in a slimmed down version, has row level security)
- Every store Item has a unique ID that identifies the Item in the store (although uniqueness is not guaranteed across multiple stores)
- WinFS supports: asynchrony (multiple queries and data retrievals at the same time), transactions, notifications (on Item changes, relation changes, and many more subtle things going on), blob/stream support, and cursoring and paging (a query can result in many Items!)
- A powerful API is available to interrogate the store
- Within the API powerful querying and filtering functionality is available
- Direct SQL queries can also be executed on the store for really powerful aggregation and grouping
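The holding-relationship rules in the notes above (an Item may be held by several relationships, and is removed from the store when its last holding relationship is released) can be modeled with a toy sketch. This is a minimal illustration of the described semantics only; the `Store` class and its method names are invented here, not the WinFS API, and the sketch does not check for cycles.

```python
# Toy model of the WinFS store semantics described above: Items are
# reference-counted by their holding relationships, and an Item with no
# remaining holders is removed from the store.

class Store:
    def __init__(self):
        self.items = {}        # item_id -> item data
        self.hold_count = {}   # item_id -> number of holding relationships

    def add_item(self, item_id, data):
        self.items[item_id] = data
        self.hold_count[item_id] = 0

    def hold(self, item_id):
        # An Item may be held by more than one relationship at once.
        self.hold_count[item_id] += 1

    def release(self, item_id):
        self.hold_count[item_id] -= 1
        if self.hold_count[item_id] == 0:
            # Last holding relationship gone: remove the Item.
            del self.items[item_id]
            del self.hold_count[item_id]
```

For example, an Item held by two "folders" survives the release of one holding relationship and disappears only when the second is released.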
Mercury is a new logic/functional programming language, which combines the clarity and expressiveness of declarative programming with advanced static analysis and error detection features. Its highly optimized execution algorithm delivers efficiency far in excess of existing logic programming systems, and close to conventional programming systems. Mercury addresses the problems of large-scale program development, allowing modularity, separate compilation, and numerous optimization/time trade-offs.

Mercury on .NET
Around 18 months ago, Microsoft approached the Mercury researchers at the University of Melbourne regarding an opportunity to participate in a research and development effort involving a multi-language platform, under the banner "Project 7". As well as offering access to early development tools, Microsoft offered a substantial grant to fund researchers, an opportunity to stay at Microsoft in Redmond for 3 months to learn the technology (alongside researchers representing other efforts), visits to other implementors working on other languages (e.g. Haskell), and ongoing support, technical conferences, and a stream of snapshots of ongoing work. This work was to remain secret (under a Non-Disclosure Agreement) until they were willing to announce their new platform and reveal its details. They asked for no special intellectual property rights, and indeed except for NDAs no contracts were signed. However, they requested that we provide feedback on what would make the platform better for Mercury, and what criticism (or praise) we (as language designers and implementors) could give on their work.

We decided to accept their offer, for two main reasons. First, we have always tried to make Mercury available to the largest possible number of programmers compatible with our means. Distributing implementations on Unix-family platforms reaches large numbers of people, but there are even larger numbers whom it does not reach. The overwhelming majority of these people work on Microsoft platforms and need an implementation that works natively on that platform (i.e. not via systems such as Cygwin). By building a backend that targets .NET we can reach this audience. Second, we have long planned to implement a new backend that generated higher level imperative code than the existing backend (which basically generates assembler in C syntax), and have had fairly detailed designs for this backend.
Unfortunately, we had no money for implementing this design, because this kind of work is very hard to get research funding for. The grant from Microsoft has allowed us to build a generic backend that can generate high-level C code as well as code for the .NET platform. We expect that in the future it will also generate code for the JVM. (We have received a significant number of requests for a JVM backend.) The vast majority of the work for this project was conducted by Tyson Dowd and Fergus Henderson. Several other people at the University of Melbourne had signed the NDA; however, many of the Mercury researchers did not, as we saw no point in encumbering the entire group with such legal restrictions.

As of the Microsoft Professional Developers Conference last week, this NDA has been lifted, and we can now talk about what we have been working on and the status of this project. This doesn't mean Microsoft is going to ship a product right now -- the development kits that have been distributed are pre-beta releases. But we can talk about it, and people using the pre-betas can try out our work.
Paul Vick added that VB.Net will not expose anonymous methods in the Whidbey release, primarily because Microsoft is reluctant to introduce this concept to VB programmers, many of whom are still getting accustomed to the OO concepts introduced in the current VB.Net. I found this very interesting, because it's the most significant example I can think of where VB.Net and C# make such a departure from one another on a core language feature. I'm not totally convinced this is a great idea. VB's watchword has always been productivity, whereas C++ has focused on power. C# is somewhere in between, but leaning, in my opinion, toward productivity. So, if a feature is good for C#, wouldn't it also be good for VB.Net? What does the decision to omit anonymous methods say about the future of language parity in the Microsoft-supported .Net languages?
Monday, November 03, 2003
As such, the web would turn into an intelligent, adaptive, self-organizing system of shared knowledge, structured in a much more flexible and intuitive way than the formal classification schemes conceived by Berners-Lee and others. Unlike material resources, knowledge and information do not diminish by being shared with others (economists call this property "non-rivalry"). Since the learning web would make this sharing effortless and free, it enables a positive-sum interaction in which everyone gains by making their individual knowledge and experience available to others; this provides a continuing incentive for further cognitive integration. The web plays here the role of a shared memory that collects, organizes, and makes available the collective wisdom. It achieves this without demanding anything from its users or contributors beyond what they would have had to invest if they were working on their own, thus removing any incentive for free-riding.