Tuesday, October 31, 2006
Genetic algorithms instruct sophisticated biological organization. Three qualitative kinds of sequence complexity exist: random (RSC), ordered (OSC), and functional (FSC). FSC alone provides algorithmic instruction. Random and Ordered Sequence Complexities lie at opposite ends of the same bi-directional sequence complexity vector. Randomness in sequence space is defined by a lack of Kolmogorov algorithmic compressibility. A sequence is compressible because it contains redundant order and patterns. Law-like cause-and-effect determinism produces highly compressible order. Such forced ordering precludes both information retention and the freedom of selection so critical to algorithmic programming and control.

Functional Sequence Complexity requires this added programming dimension of uncoerced selection at successive decision nodes in the string. Shannon information theory measures the relative degrees of RSC and OSC. Shannon information theory cannot measure FSC.

FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC or OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC).
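The compressibility distinction drawn above can be made concrete. The sketch below (illustrative only; the sequences and measures are my own, not the author's) estimates Shannon entropy from symbol frequencies and uses zlib compression as a rough, computable stand-in for Kolmogorov compressibility: a law-like ordered sequence compresses dramatically, while a random sequence cannot be compressed below its entropy floor.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(seq: str) -> float:
    """Estimated bits per symbol, from observed symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_ratio(seq: str) -> float:
    """Compressed size / raw size; lower means more redundant order."""
    raw = seq.encode()
    return len(zlib.compress(raw, 9)) / len(raw)

# OSC-like: law-like repetition, highly ordered, highly compressible.
ordered = "AT" * 500

# RSC-like: uniform random draws, no exploitable pattern beyond the
# small alphabet itself.
random.seed(0)
random_seq = "".join(random.choice("ACGT") for _ in range(1000))

print(shannon_entropy(ordered))     # exactly 1.0 bit/symbol (two equiprobable symbols)
print(shannon_entropy(random_seq))  # close to the 2.0 bit/symbol maximum
# The ordered string compresses far more than the random one, whose
# compressed size bottoms out near its entropy floor.
print(compression_ratio(ordered) < compression_ratio(random_seq))  # True
```

Note that zlib only approximates Kolmogorov complexity from above (true algorithmic compressibility is uncomputable), and neither measure captures the functional dimension the post reserves for FSC.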
Sunday, October 29, 2006
In my talks on Web 2.0, I always end with the point that "a platform beats an application every time." We're entering the platform phase of Web 2.0, in which first generation applications are going to turn into platforms, followed by a stage in which the leaders use that platform strength to outperform their application rivals, eventually closing them out of the market. And that platform is not enforced by control over proprietary APIs, as it was in the Windows era, but by the operational infrastructure, and perhaps even more importantly, by the massive databases (with network effects creating increasing returns for the database leaders) that are at the heart of Web 2.0 platforms. But as Bill notes, this doesn't mean the end of the application category. It just means that developers need to move up the stack, adding value on top of the new platform, rather than competing to become it.
Tuesday, October 24, 2006
Google apparently has responded by replicating everything everywhere. The system is intensively redundant; if one server fails, the other half million don't know or care. But this creates new challenges. The software must break up every problem into ever more parallel processes. In the end, each ingenious solution becomes the new problem of a specialized, even sclerotic, device. The petascale machine faces the peril of becoming a kludge. Could that happen to Google and its followers?

Google's magical ability to distribute a search query among untold numbers of processors and integrate the results for delivery to a specific user demands the utmost central control. This triumph of centralization is a strange, belated vindication of Grosch's law, the claim by IBM's Herbert Grosch in 1953 that computer power rises by the square of the price. That is, the more costly the computer, the better its price-performance ratio. Low-cost computers could not compete. In the end, a few huge machines would serve all the world's computing needs. Such thinking supposedly prompted Grosch's colleague Thomas Watson to predict a total global computing market of five mainframes.

The advent of personal computers dealt Grosch's law a decisive defeat. Suddenly, inexpensive commodity desktop PCs were thousands of times more cost-effective than mainframes. In this way, the success of the highly centralized computer-on-a-planet runs counter to the current that has swept the computer industry for decades.

The advantages of the new architecture may last only until the centripetal forces pulling intelligence to the core of the network give way, once again, to the silicon centrifuge dispelling it to the edges. Google has pioneered the miracle play of wringing supercomputer performance from commodity CPUs, and this strategy is likely to succeed as long as microchip progress remains in the doldrums. But semiconductor and optical technologies are on the verge of a new leap forward.
The next wave of innovation will compress today's parallel solutions in an evolutionary convergence of electronics and optics: 3-D and even holographic memory cells; lasers inscribed on the tops of chips, replacing copper pins with streams of photons; and all-optical networks in which thousands of colors of light travel along a single fiber. As these advances find their way into an increasing variety of devices, the petascale computer will shrink from a dinosaur to a teleputer – the successor to today's handhelds – in your ear or in your signal path. It will access a variety of searchers and servers, enabling participation in metaverses beyond the ken of even Ray Kurzweil's prophetic imagination. Moreover, it will link to trillions of sensors around the globe, giving it a constant knowledge of the physical state of the world, from traffic conditions to the workings of your own biomachine.
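The scatter/gather pattern described above can be sketched in a few lines: fan a query out across redundant shard replicas, then merge the partial results under central control. Everything here (the shard layout, replica selection, and function names) is a hypothetical toy, not Google's actual design:

```python
from concurrent.futures import ThreadPoolExecutor

# Each logical shard of a toy inverted index is replicated twice, so the
# loss of any single replica leaves the shard's data intact.
INDEX_SHARDS = {
    0: [{"apple": [1, 4]}, {"apple": [1, 4]}],  # shard 0, two replicas
    1: [{"apple": [9]}, {"apple": [9]}],        # shard 1, two replicas
}

def query_replica(replica, term):
    """Return the posting list this replica holds for the term."""
    return replica.get(term, [])

def search(term):
    """Scatter the query to one replica per shard, then gather/merge."""
    with ThreadPoolExecutor() as pool:
        # Central control picks a replica per shard; a real system would
        # fail over to a sibling replica if the first one died.
        futures = [pool.submit(query_replica, replicas[0], term)
                   for replicas in INDEX_SHARDS.values()]
        hits = []
        for f in futures:
            hits.extend(f.result())  # gather the partial results
    return sorted(hits)

print(search("apple"))  # → [1, 4, 9]
```

The same shape, scatter then compute in parallel then gather, is the heart of the massively parallel systems the essay describes; the redundancy lives in the data layout rather than in any single machine.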
Friday, October 06, 2006
Also, on the Internet, as I mentioned before, Web 2.0 refers to browser applications... in the Enterprise, it doesn't have to be browser-only if you consider my definition of Web 2.0, which is about transforming users into participants. It has to be easy so people can interact... and this includes familiar applications like Office. If it's easy to consume and publish content from Office, that makes it even more attractive. Examples in the latest 2007 Office system include:
1. Blogging: blog from Word 2007 or the web w/ SharePoint blogs.
2. PowerPoint Slide libraries: View PPT slides in the browser... or consume slides from PPT 2007
3. Excel Services: Host spreadsheets on a server and make them visible through a browser... publish the spreadsheets from Excel 2007.
4. Access and interact w/ documents, contacts and tasks via a browser... take them offline with Outlook 2007
So why care about the client?
1. While "Web 2.0" browser applications are client-like by using AJAX dev techniques, client apps are still richer... client apps are also generally more flexible, with fewer storage and security restrictions... and generally tend to be faster. Caveat: on the performance and storage front, I do see advances in services, grid computing and storage technologies leading to even greater improvements over the coming years.
2. Offline. When you're offline, you need to be able to be productive and create/use content.
3. Rich clients are the ultimate mash-ups. Part of the point of mash-ups is to surface new functionality within an application a user already knows and loves.
Don't get me wrong - web applications are equally important! But Web 2.0, when taken to the Enterprise, isn't just about web apps. Some organizations want to reduce manageability costs, so they try to move to a central model... but for the reasons outlined above, it's simply not practical.
You probably know where I'm going with this... 2007 Office system (with SharePoint technology as a foundational pillar) provides an unparalleled "Web 2.0" business productivity platform with out of the box solutions and a platform to develop and host custom apps. Examples of out-of-the-box "Web 2.0" applications with SharePoint technology include:
1. Blogs and Wikis
2. Team Workspaces (Collab)
3. PPT slide libraries and Excel Services
4. Web Content Management
Needless to say, RSS is everywhere; users can create custom apps with SharePoint Designer; self-service is an extremely important underlying theme. If you take a look at the core tenets of Web 2.0 software I mention above for TDM/BDMs to consider, SharePoint technology addresses all of them.
For IT Professionals, this platform is a unified platform for different apps -> one backup/restore, one deployment, one management story... with the right tools to control and manage different users, application and information.
So the next time your company is thinking about Web 2.0, you should really take a look at WSS v3/MOSS... and see how the 2007 Office system/SharePoint technology offers a Web 2.0 platform and business productivity solutions that can be accessed via a browser or interacted with using the Office client, transforming your users into active participants.
Wednesday, October 04, 2006
I see two ways to rationalize this. The first is via philolinguistics. It could be that the very definition of innovation in the world of computing includes the idea of succeeding in the face of some constraints. After all, this is (for the most part) a real-world engineering discipline, right? You know the drill: "time, cost, quality. Pick two." We eat and drink constraints in engineering. It is what makes engineering, engineering. Put another way, if there are no constraints, it ain't engineering. I can buy that.

The second way to rationalize it is the apparent truism that necessity is the mother of invention. Ingenuity appears to be boosted by the sort of strictures that plague our daily engineering lives. At least it is in the case of those engineers who possess that most precious of all engineering personality traits - a "can do" problem-solving mentality. So it would appear that constraints are good. This leads, of course, to the curious phenomenon whereby the very things that cause innovation to happen are the things that the innovators themselves moan about the most. I sure wish I had a supercomputer under my desk. I sure wish I had 4 gig of RAM, twice as many pixels, a faster hard disk, a quicker internet connection...

And now comes my horrible, tentative conclusion. The best way to foster innovation in an engineering crew is to give them a set of constraints. If they are not moaning about the constraints, keep adding them until they start moaning. Then you have created a wellspring for innovation to happen.