
Tuesday, June 29, 2004

Walkthrough: Smart Client Data in Visual Basic 2005 

MSDN
This walkthrough demonstrates several new features in Visual Studio 2005 to assist in developing applications that access data.

links to this post (0) comments

New Software Test Automation Book 

Effective Software Test Automation: Developing an Automated Software Testing Tool
Whatever its claims, commercially available testing software is not automatic. Configuring it to test your product is almost as time-consuming and error-prone as purely manual testing.

There is an alternative that makes both engineering and economic sense: building your own, truly automatic tool. Inside, you’ll learn a repeatable, step-by-step approach, suitable for virtually any development environment, with code-intensive examples supporting the book’s instruction.



Using Annotations with a Typed DataSet 

MSDN
Annotations enable you to modify the names of the elements in your typed DataSet without modifying the underlying schema. Modifying the names of the elements in the underlying schema would cause the typed DataSet to refer to objects that do not exist in the data source, and to lose its references to the objects that do exist there. Using annotations, you can give the objects in your typed DataSet more meaningful names, making code more readable and your typed DataSet easier for clients to use, while leaving the underlying schema intact.
See also: Navigating Multiple Related Tables in an ADO.NET Dataset
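A minimal sketch of what such an annotated schema looks like. The codegen annotations (typedName, typedPlural, nullValue) live in the urn:schemas-microsoft-com:xml-msprop namespace; the element and column names below are invented for illustration:

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:codegen="urn:schemas-microsoft-com:xml-msprop">
  <!-- The schema element stays "Customers", so it still matches the data
       source; only the generated typed-DataSet members are renamed. -->
  <xs:element name="Customers"
              codegen:typedName="Customer"
              codegen:typedPlural="Customers">
    <xs:complexType>
      <xs:sequence>
        <!-- nullValue replaces DBNull with a friendly default in the
             generated accessor instead of throwing. -->
        <xs:element name="CompanyName" type="xs:string"
                    codegen:nullValue="(no company)" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

With these annotations, client code iterates over `CustomerRow` objects rather than the schema's raw `CustomersRow` name, while queries against the data source are unaffected.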


Sunday, June 27, 2004

Categories of knowing 

Or explaining what went wrong (think testing)

via Rattlesnake

  1. Unknown unknowns: a five-year-old child does not know that he does not know calculus. For him or her, calculus is an `unknown unknown'.
  2. Known unknowns: an older child knows that he does not know calculus. For him or her, calculus is a `known unknown'.
  3. Unknown knowns: a high school student starts to study calculus. He or she does not yet understand calculus. For him or her, it is an `unknown known'. (You will note that the focus of the first `unknown' shifts here from lack of knowledge of the general to lack of knowledge of the specific.)
  4. Known knowns: after learning calculus, it becomes a `known known': it is both known to exist and understood.
  5. A part of life: after learning calculus and using it every day, a person may forget how much he or she knows.


Wednesday, June 23, 2004

Deploying and Administering Reporting Services 

MSDN


Writing software 

scottraymondnet
A well-written program precisely conforms to the shape of its problem space, the way a key fits the shape of a lock, no more and no less. And you see it, and it is good.

I suppose the dual satisfaction, from the utility and the aesthetic balance of a creation, happens in lots of engineering disciplines. But I suspect that it’s more common in writing software, because computers provide something close to a perfectly closed world: everything is malleable, every state is discrete, every object is abstract ... Come to think of it, that’s probably the same reason that mathematics is satisfying, at least for those who are able to reach a level where they can be creative with math.



Tuesday, June 22, 2004

Smart Client Architecture and Design Guide 

MSDN
Summary: This guide gives you prescriptive guidance on how to overcome architectural challenges and design issues when building smart client solutions. It also provides guidance on how to combine the benefits of traditional rich client applications with the manageability of thin client applications.


Sunday, June 20, 2004

Light weight container patterns 

Brand new lightweight containers are coming fast

The Data Loader Object design pattern: Lightweight Containers, Inversion of Control, Abstract Factory

Provider Design Pattern, Part 2

Provider Model Design Pattern and Specification, Part 1

Whidbey Provider Design Pattern pitfalls

Inversion of Control Containers and the Dependency Injection pattern

Lightweight Containers and Plugin Architectures: Dependency Injection and Dynamic Service Locators in .NET



Integrating Analysis Services with Reporting Services 

Sean Boon Microsoft Corporation

See also: Printing Reports Programmatically Using C# and SQL Server 2000 Reporting Services



Java and the next-gen software development platform 

How many languages do you use?

I love asking Java developers the following question, "How many languages do you use when developing a J2EE application?" They almost always look at me with this pitiful look. It's the, "Jeff is so old that he doesn't even know that J2EE uses Java" look. I'll usually string them along while they try to describe what J2EE is. The answers are great.

After they finish 'educating' me on Java, I follow up with the following set of questions:
1. Is HTML a language and do you ever use it on your J2EE applications?
2. Do you use a 'batch' or 'make' language on your apps?
3. Do you ever use the 'structured query language' (SQL)?
4. Do you ever use JavaScript?
5. Do you ever use JSP?

Usually I stop there - by this time they understand my point.

J2EE is a combination of languages and libraries. We use Domain Specific Languages (DSLs) all the time. In the Java world we tend to wrap them with 3-letter acronyms that start with the letter 'J'. We often glue our languages and libraries together in the form of an application by using an object-oriented 3GL (Java).

In many cases, our libraries (JMS, JDBC, JNDI, etc.) merely act as a standardized API to a server (or set of services). These services often need to be customized. We have several ways to customize services: upload metadata, pass in parameters, etc.

J2EE is our container of DSL's and libraries. It acts as the vehicle to pass information from one DSL to another DSL, from one library to another library, from a DSL to a library or from a library to a DSL. The Java language is used to type the data, transform it, perform Boolean logic and pass the data on.

The questions that I've been asking myself are:
What should a library/DSL integration language look like?
Is Java (or any OO-3GL) the best fit for a library/DSL integration language?
What common functionality of the DSL should be refactored into a generic 'DSL engine'?
How does a platform architecture provide more consistent extensibility mechanisms across the suite?
To what extent should the components of the platform be mandated to have a symbiotic relationship with the other members?

Back to the original question, "How many languages do you use?" The answer is usually 5-10, and the number is growing. Writing J2EE applications is becoming less about programming in Java and more about mastering domain specific languages, libraries and integrating them across contextually bound Use-Cases.

J2EE has grown organically over the last 7 years. It has not had the opportunity to be massively overhauled. In my humble opinion, the enterprise software community is ready for a massive refactoring of J2EE; one that isn't dragged down by backward compatibility issues. I believe that we are ready to incorporate our lessons learned to design the next-gen software development platform.



Some Thoughts on Joel Spolsky's Microsoft Losing the API War 

Dare Obasanjo

Secondly, he [Joel] argues that Microsoft is trying to force too many paradigm shifts on developers in too short a time.  First of all, developers have to make the leap from native code (Win32/COM/ASP/ADO) to managed code (ASP.NET/ADO.NET) but now Microsoft has already telegraphed that another paradigm shift is coming in the shape of Longhorn and WinFX. Even if you've made the leap to using the .NET Framework, Microsoft has already stated that technologies in the next release of the .Net Framework (Winforms, ASP.NET Web Services) are already outclassed by technologies in the pipeline (Avalon, Indigo). However to get these later benefits one not only needs to upgrade the development platform but the operating system as well. This second point bothers me a lot and I actually shot a mail to some MSFT VPs about 2 weeks ago raising a similar point with regards to certain upcoming technologies. I expected to get ignored but actually got a reasonable response from Soma with pointers on folks to have followup discussions with. So the folks above are aware of the concerns in this space. Duh!

The only problem I have with Joel's argument in this regard is that I think he connects the dots incorrectly. He agrees that Windows programming was getting too complex and that years of cruft eventually become difficult to manage. He also thinks the .NET Framework makes developers more productive. So it seems introducing the .NET Framework was the smart thing for Microsoft to do. However, he argues that not many people are using it (actually, that not many desktop developers are using it). There are two reasons for this, which I know first hand as the developer of a desktop application that runs on the .NET Framework (RSS Bandit):

  • The .NET Framework isn't ubiquitous on Windows platforms
  • The .NET Framework does not expose enough Windows functionality to build a full-fledged Windows application with only managed code.

Both of these issues are why Microsoft is working on WinFX. Again, the elephant in the living room is that Microsoft's current plans are to fix these issues for developing on Longhorn, not on all supported Windows platforms.



Constructive Destruction - State of J2EE  

Vinny Carpenter's Blog
It's been a really interesting journey in the Java Enterprise space the last couple of years. Starting with Servlets and Enterprise JavaBeans, J2EE has gone through quite an evolution and has emerged into a robust and powerful Enterprise application development platform.

. . . I have been thinking a lot about this as I read 'The Innovator's Dilemma' by Clayton M. Christensen. The Innovator's Dilemma is a great book that postulates that 'disruptive technologies' enter the marketplace and eventually evolve and displace the current reigning technologies and companies.

. . . We can draw parallels between the theme of the book and the current state of the Java/J2EE space - As the Java platform has matured, the complexity has also increased. I typically spend almost 5-10 hours a week working with people resolving class loading, packaging, war/ejb/ear descriptors or other activities that are not related to the core business functionality that the application is implementing. As the complexity of the platform has increased, new technologies have emerged to help simplify the platform.

. . . Don't get me wrong - I still think J2EE is a great and viable solution but I am starting to see how 'disruptive' technologies like JDO, Hibernate and Spring among others are making people rethink how they design and implement solutions.

The idea of 'Inversion of Control' or 'Dependency Injection' revolves around the use of lightweight containers (Spring) that help to assemble components at run-time into a cohesive application without wiring them together in the traditional sense. While IOC is not a new concept, it is changing how we approach thinking about application development.
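The idea can be sketched in a few lines of plain Java. All class names here are invented for illustration, and the wiring that a container like Spring would normally do from configuration is done by hand in main -- the point is only that the consumer never constructs its own dependency:

```java
// Minimal sketch of constructor-based dependency injection, the idea
// behind lightweight containers such as Spring.

interface MessageRenderer {
    String render(String name);
}

// One concrete strategy; the consumer below never names this class.
class PlainRenderer implements MessageRenderer {
    public String render(String name) {
        return "Hello, " + name;
    }
}

// The consumer declares what it needs through its constructor; the
// "container" (here, main) decides which implementation to inject.
class Greeter {
    private final MessageRenderer renderer;

    Greeter(MessageRenderer renderer) {
        this.renderer = renderer;
    }

    String greet(String name) {
        return renderer.render(name);
    }
}

public class IocSketch {
    public static void main(String[] args) {
        // Assembly happens in one place, outside the collaborating
        // classes; swapping PlainRenderer for another implementation
        // requires no change to Greeter.
        Greeter greeter = new Greeter(new PlainRenderer());
        System.out.println(greeter.greet("world")); // prints "Hello, world"
    }
}
```

A real container does the same wiring from external metadata (an XML file, in Spring's case) instead of hand-written code, which is what makes the components reconfigurable without recompilation.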

Hibernate and JDO are other examples of how they are changing the way we look at persistence and O/R mapping. While Sun and the J2EE licensees continue to support CMP and Entity beans, people are moving on and using things like JDO, Hibernate or rolling their own persistence layer to get the performance and flexibility that CMP promised, but never really delivered.



Wednesday, June 16, 2004

Objects vs. XML in WinFS Land 

Dare Obasanjo
. . . The first question about why WinFS doesn't build on XML, XQuery and XSD instead of items, OPath and the WinFS schema language is something that the WinFS folks will have to answer. Of course, Jon could also ask why it doesn't build on RDF, RDQL [or any of the other RDF query languages] and RDF Schema which is a related question that naturally follows from the answer to Jon's question.

The second is why one would want to program against a Person object when they have a Person element. This question has an easy answer, which unfortunately doesn't sit well with me. The fact is that developers prefer programming against objects to programming with XML APIs. No XML API in the .NET Framework (XmlReader, XPathNavigator, XmlDocument, etc.) comes close to the ease of use of programming against strongly typed objects in the general case. Addressing this failing [and it is a failing] is directly my responsibility since I'm responsible for core XML APIs in the .NET Framework.



Tuesday, June 15, 2004

Wanton coupling 

Loosely Coupled weblog
"The suckline is the decision to use the lowest common denominator capabilities across participating nodes in order to [achieve] guaranteed communications." The suckline, by this definition, is the lowest level of shared context that still permits interaction.

But of course in an ideal world, no one wants to operate at the suckline — hence its name. So Jeff posits the notion that, "Web services are about protocol negotiation." The suckline becomes merely an opening gambit, an entry point that allows participants to discover their common ground and then step up to a deeper, more tightly coupled level of interaction: "If the protocol negotiation is done correctly you should be able to use a metadata-described invocation and interface description mechanism that is linked to a protocol policy for runtime resolution," which then allows the participants to step up to a "Greatest Common Factor (GCF)" level of interaction.



Monday, June 14, 2004

The role of intelligent agents 

I have been talking to Bryan about what it would take to integrate intelligent agent technology with weblogging. It seems that weblogs are creating a tremendous volume of metadata that could be mined to produce a number of useful automated services.

However, upon further reflection, I realized that we already have a lot of automated agents mining metadata. It’s called SPAM. So if we don’t want to add to the exponentially growing mass of spam, which now includes weblog comment spam, what do we do? What is the legitimate role of an intelligent agent?

I think this plays into a general theme that has come out of the diagnosis of the AI crash in the 1990s. At least for the present, machines can’t produce an autonomous intelligence, but are an extension and projection of human intelligence. The appropriate role of an intelligent agent then, is to compensate for the limitations of human cognition to allow human intelligence to expand into areas that were not previously possible.

Spam is one such extension, although most would say an inappropriate extension. I was telling Bryan that my weblog got little attention, because I didn’t advertise it by commenting on other people’s weblogs. It was mostly an extended memory, that I used to keep track of things. As he put it, I wasn’t part of a community. So, how could agent technology be legitimately used to extend my ability to join a larger circle of communities?

The obvious answer is that an automated agent shouldn’t be commenting on other people’s web log entries, unless it is invited. That invitation must be extended by a human. Then if someone else finds the results of that agent useful, they can subscribe to the agent generated web log. A human with a stable identity can advertise for an agent, but an agent with an ambiguous identity should not be advertising for a human.



Maybe the Best Use for Web Logging Is to Teach Us More About Ourselves 

I, Cringely
Joe thinks web logging will become the way we keep track of our lives. We'll keep our pictures, our thoughts, our schedules, even our work output all set in digital form against a web timeline. Where Joe goes beyond a lot of other thinkers in this space is in his desire to use web logging for more than just keeping track of stuff. Joe hopes to pioneer what essentially comes down to personal data mining.
. . . But Joe Reger wants us to not think so much about the web log publishing model and instead use the technology -- preferably HIS technology -- as a personal freeform database with analytical tools to take the measure of our own lives. Here we've been thinking about web logs as a way of reaching out to the world when they may be as much or even more useful reaching into ourselves.


Sunday, June 13, 2004

People-oriented automation 

Loosely Coupled weblog

. . .

The trouble with traditional enterprise software is that it's rooted in an organizational model that assumes a large bureaucracy shuffling documents around according to preset procedures. Whereas 21st-century business is carried out by delegating decision-making responsibility as far down the reporting line as possible.

. . .

So why not cut out the intermediary and let the user modify the business automation directly? The traditional response is to say that users aren't software experts, and of course that's a valid argument if the only way to change the automation is by recoding the software.

. . .

Nsite describes its services as "business automation on demand." Users can set up forms-based processes such as approval procedures, task management or change control, start using them, and then analyze performance and make changes to fine-tune each process. The service is hosted online, so users don't have to touch the underlying software, and the cost starts from $40 a month for a process with up to 10 users. This hosted service approach means it's possible to get started within days and see a return on the (very small) investment within the first month. It's a very neat way of bypassing the software bottleneck for automating the myriad of small but often troublesome people-oriented processes in a business, and handing control directly to users (but at the same time being able to track the processes that have been set up and also monitor how they're performing).


Saturday, June 12, 2004

HowTo: Create a SQL Server 2000 Reporting Services Report from a Stored Procedure 

Glen Gordon
The answer lies on the Data tab. When you go to that tab after the wizard has run, you'll see your EXEC statement. You want to change the query type from a text command (as represented by the presence of the word EXEC) to a stored procedure. You do this by clicking the button to switch to the Generic Query Designer view. Once there, you'll see a drop-down on the right labeled Command Type. It will say Text; change it to StoredProcedure, and remove the word EXEC and any hard-coded parameter values following the SP name. Now, if you click the Execute (!) button, you should see a window prompting you to enter parameter values. Try a few values here to make sure it works.
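For concreteness, a sketch of the kind of parameterized stored procedure this setup expects (the procedure, table, and column names below are invented for illustration):

```sql
-- Once the dataset's command type is StoredProcedure, the command text
-- is just the procedure name -- no EXEC, no hard-coded arguments -- and
-- Reporting Services prompts for @StartDate / @EndDate at run time.
CREATE PROCEDURE dbo.GetSalesByDateRange
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    SELECT OrderID, OrderDate, TotalDue
    FROM   dbo.Orders
    WHERE  OrderDate BETWEEN @StartDate AND @EndDate;
END
```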


The meaning of OLAP? 

An analysis of what the increasingly misused OLAP term is supposed to mean

The Codd rules also turned out to be an unsuitable way of detecting ‘OLAP compliance’, so we were forced to create our own definition. It had to be simple, memorable and product-independent, and the resulting definition is the ‘FASMI’ test. The key thing that all OLAP products have in common is multidimensionality, but that is not the only requirement for an OLAP product.

The FASMI test

We wanted to define the characteristics of an OLAP application in a specific way, without dictating how it should be implemented. As our research has shown, there are many ways of implementing OLAP compliant applications, and no single piece of technology should be officially required, or even recommended. Of course, we have studied the technologies used in commercial OLAP products and this report provides many such details. We have suggested in which circumstances one approach or another might be preferred, and have also identified areas where we feel that all the products currently fall short of what we regard as a technology ideal.

Our definition is designed to be short and easy to remember — 12 rules or 18 features are far too many for most people to carry in their heads; we are pleased that we were able to summarize the OLAP definition in just five key words: Fast Analysis of Shared Multidimensional Information — or, FASMI for short.



Does search need metadata (schemas)? 

Peter Bailey

Don't get me wrong, I do think there is a role for metadata. It's great for record-keeping purposes. Say I want to find all articles authored by a certain person, or created in a particular year. In these circumstances, accurate metadata is essential.

Such search activities are measured in the information retrieval community by the recall metric (which counts how many relevant documents - of the entire set of relevant documents - have been retrieved when some number of documents overall have been retrieved).
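The definition in parentheses above amounts to a one-line ratio; a back-of-the-envelope sketch (document IDs invented for illustration):

```java
import java.util.List;
import java.util.Set;

public class RecallSketch {
    // recall = (relevant documents retrieved) / (all relevant documents)
    static double recall(List<String> retrieved, Set<String> relevant) {
        long hits = retrieved.stream().filter(relevant::contains).count();
        return relevant.isEmpty() ? 0.0 : (double) hits / relevant.size();
    }

    public static void main(String[] args) {
        Set<String> relevant = Set.of("d1", "d2", "d3", "d4");
        List<String> retrieved = List.of("d1", "d3", "d9");
        // 2 of the 4 relevant documents were retrieved -> recall 0.5;
        // the irrelevant d9 hurts precision, not recall.
        System.out.println(recall(retrieved, relevant));
    }
}
```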

In a modern context however, assuming the existence of a content management system such as Sytadel, this activity is much better left to the CMS: firstly in assigning the metadata accurately and secondly in carrying out the retrieval activity (which is typically just a straightforward database query).

But in terms of improving your users' ability to search for documents in the way people expect to search these days (that is, by issuing two or three query terms to the search engine and getting back a relevant set of results in a couple of seconds), metadata is completely irrelevant (excuse the pun).


For more reading on this issue, read Cory Doctorow's delightful article Metacrap - Putting the torch to seven straw men of the meta-utopia.

For more reading on the fundamentals of search, see Tim Bray's series On Search. (Tim takes the broad view of metadata, not the narrow schema-based view I discuss here. I also think he ascribes too much weight to Google's PageRank value as a significant component of Google's result ranking algorithm, but that's another story.)

I'm indebted to David Hawking for discussions over several years on the subject of metadata and search. I'm also looking forward to an upcoming study from him and Justin Zobel that he mentioned to me yesterday, which sets about objectively measuring the effectiveness of metadata-based search versus non-metadata based search in an enterprise setting with extensive metadata.



Wednesday, June 02, 2004

Design Eye for the Usability Guy 

Here is a link to the meme that is rocking the blogosphere. A must-read for all Web designers.

Further, in the spirit of sharing, I decided I’d gather up some knowledgeable designers and help Nielsen in return with a little bit of design advice. The Design Fab Five, right here, right now.

You know you’re going to love this.



This page is powered by Blogger. Isn't yours?