Dream of Code Reuse

As we've said before, the promise of code reuse has only been partially realized, mostly in the reuse of class libraries within small in-house projects. In the larger 'shrink-wrap' software market, the postulated 'uninhibited, world-wide' market for class libraries never materialized. It turns out that class libraries delivered without source code are very hard to use and debug, while delivering class libraries with source code still gives the seriously business-minded software vendor heartaches over the potential for unauthorized copying.

While the leading-edge fraction of the industry has been trying various 'silver bullet' object-oriented solutions, and failing, various standards organizations and industry groups (e.g. the OMG, DCE, the Open Group, and the COM camp within Microsoft) have been taking a step-by-step approach, evolving current operating systems and supporting services towards an object-oriented style of computing. The current result from these working groups is a computing model based on distributed objects, or components. Almost all of the abstract concepts preached by the object-oriented evangelists are deployed in creating reusable software components that can interoperate over heterogeneous networks of computers.

Regardless of the architectural appeal a new methodology may have, the business community will not usually commit to it wholesale unless there are compelling business reasons to do so. The overwhelming driving force towards a component model of computing is simple economics. In the ideal world of component-based application building, pieces of prefabricated and tested code can be assembled to create applications. The same pieces of code can be reused in other contexts to build alternative applications. In this world, software components only have to be written once, drastically reducing the time and effort spent 'reinventing the wheel' on each individual project. If this sounds familiar, it is the exact same promise that brought object-oriented design and development into vogue almost a decade ago. What, then, makes it different ten years later?

The key here is the slow emergence and development of a simple yet effective supporting infrastructure that can actually make it work. One term often used to describe these self-contained pieces of functionality is a Software IC, or Software Integrated Circuit. The analogy is with the silicon chip in hardware design, which encapsulates a fixed piece of functionality and works with other chips to create functional electronic devices. Take the example of a microprocessor chip. By itself, it's not very interesting, and it can do very little for the end user. However, when it's put into a design where it can communicate via a bus with memory chips and input/output chips, we end up with a highly functional computer system. It is this 'bus' which has been missing for our ten-year-old Software ICs. In the hardware example, the electronic bus interconnecting the microprocessor, memory, and input/output chips has well-defined electrical and timing specifications, and documented protocols for how the connected devices interoperate. In our software example, COM and DCOM (and also CORBA, dominant in the UNIX world) provide such a bus for object cooperation.
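
To make the analogy concrete, here is a minimal sketch (not taken from any particular product) of what the 'bus protocol' looks like on the COM side, assuming the standard Windows COM headers. A component exposes its functionality only through interfaces derived from IUnknown; the spell-checker interface and the class and interface IDs referred to in the comments are purely hypothetical, shown only for illustration.

    #include <objbase.h>   // IUnknown, HRESULT, CoCreateInstance, STDMETHODCALLTYPE

    // Hypothetical interface for a spell-checking component. The IUnknown
    // methods it inherits (QueryInterface, AddRef, Release) are the fixed
    // 'bus protocol' every COM object must honor, regardless of the language
    // it was written in or the machine it runs on.
    struct ISpellChecker : public IUnknown
    {
        // Returns S_OK if 'word' is spelled correctly, S_FALSE otherwise.
        virtual HRESULT STDMETHODCALLTYPE CheckWord(const wchar_t *word) = 0;
    };

    // A client never sees the implementation class; it asks the COM library
    // to create the object and talks to it solely through the interface
    // pointer. CLSID_SpellChecker and IID_ISpellChecker are assumed to be
    // published by the (hypothetical) component vendor.
    //
    //   ISpellChecker *pChecker = 0;
    //   HRESULT hr = CoCreateInstance(CLSID_SpellChecker, NULL, CLSCTX_ALL,
    //                                 IID_ISpellChecker, (void **)&pChecker);
    //   if (SUCCEEDED(hr))
    //   {
    //       pChecker->CheckWord(L"component");
    //       pChecker->Release();
    //   }

The point of the sketch is that, like the electrical and timing rules of a hardware bus, the interface definition and the IUnknown protocol are specified independently of any particular implementation, so any client can plug into any component that honors them.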
