Dale Smith
Microsoft Developer Network
September 1998
Summary: Explains the process of removing the business logic from an application and replacing it with calls to a Component Object Model (COM) component. (6 printed pages)
In Phase 2 of the Duwamish Books sample, we removed all the data access code from the individual applications and placed it into a single shared component. (See my article, "Breaking Out the Data Access Layer.") Phase 3 of the Duwamish Books sample breaks out the business logic layer into a single shared component. See Steve Kirk's article "Abstracting Business Transactions" for a discussion of the design of the business logic component.
This article describes the process of replacing the embedded business logic in each desktop application with calls to methods in a separate COM component—the Duwamish Books Business Logic Layer (DBBLL).
In Phase 2 we moved the data access functions from the applications into a COM component. We used a similar process in Phase 3 to move the business logic functions to a new COM component that is shared by all of the applications. Initially we determined a set of procedures that would encapsulate the general business rules to be used in common. These generic methods, like GetEmployers and UpdateAuthors, formed the baseline for the COM component. As we worked through various issues in the clients, we refined the procedures in the COM component to more accurately reflect the particular needs of each of the clients.
In previous phases, the rules governing what was done with data were implemented as procedures in the client code. In Phase 2.5 the database changed from a Microsoft® Access database to a Microsoft SQL Server™ database. This change gave us the opportunity to move most of the business logic to the server side, as methods in the new component and as stored procedures in the database. In Phase 2, many client procedures bundled several distinct processing steps. In Phase 3 the same procedures call a single method in the business logic component that encapsulates all of the processing previously handled on the client. An example is found in the Shipping and Receiving application. Previously, to process the items for a particular order, the application code updated the Inventory table and added a new InventoryTrack record for each item. In Phase 3 the hierarchical "order" recordset is updated and passed to the UpdateOrder method in the DBBLL. This one method updates the Inventory table and adds an InventoryTrack record for each item changed in the order recordset.
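The internals of UpdateOrder are not shown in this article, but its behavior can be sketched as follows. This is only an illustration: the helper procedures AdjustInventory and AddInventoryTrack, and the exact field names used, are assumptions, not the actual DBBLL implementation.

' Hypothetical sketch of the DBBLL UpdateOrder method. The hierarchical
' "order" recordset carries a child "Details" recordset; for each detail
' row we adjust Inventory and log an InventoryTrack record.
Public Function UpdateOrder(rsOrder As ADODB.Recordset) As Boolean
    Dim rsDetails As ADODB.Recordset

    UpdateOrder = False
    If rsOrder.EOF Then Exit Function

    ' The child recordset is exposed as a field of the parent row.
    Set rsDetails = rsOrder("Details").Value

    While Not rsDetails.EOF
        ' Adjust the on-hand quantity for this item (assumed helper).
        AdjustInventory rsDetails!ItemID, rsDetails!Qty_Received
        ' Record the movement for auditing (assumed helper).
        AddInventoryTrack rsOrder!PKId, rsDetails!ItemID, rsDetails!Qty_Received
        rsDetails.MoveNext
    Wend

    UpdateOrder = True
End Function

Because the client passes the entire hierarchical recordset in one call, the round trips that Phase 2 made from the client are collapsed into a single method invocation.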
As we move Duwamish Books onto a network, even though data is transferred reasonably quickly, it makes sense to cache some of the data on the client. To illustrate the reasoning behind caching data rather than direct access, let's look at the search function in the catalog application. This function accesses data that represents the items in the current database, which initially has only 530 records. Let's imagine that there are a few hundred clients that are trying to do a search and that data access is accomplished in real time from the server. As each client accesses the database, all the data from the resultset determined by the search criteria is sent across the wire to the client. The number of clients doing a search at the same time multiplies this. Can you imagine the time it would take for that hourglass cursor to return to the default cursor in a circumstance like this? Snooze time!
I'll describe the process we used to make data readily available on the client for quick searches instead of reading it from the database in real time. When an application launches, the user can tolerate some delay before it is fully functional; this is a far better time to introduce a delay than every time someone performs a search. Consider the same item data and search function as in the first example. When the application launches, the item data is preloaded into memory as an ADODB recordset and remains there until the user exits the application. Each search then returns its resultset from the cached data rather than from the database, which improves performance because the data is in memory on the client rather than on the server. Of course, this approach has limitations. The most obvious is memory pressure on the client, because all records are cached. Here we are dealing with a relatively small recordset, so this isn't a major problem; with larger recordsets, memory usage becomes an issue only when many applications run at once or the client computer has limited memory. Another problem is that the cached data does not "see" updates. Changes to the database become visible to the client only when the client restarts or refreshes the cache at predetermined intervals. In our applications we implemented only the former, so changes to the database are recognized when the application is started.
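The preload-and-search pattern can be sketched like this. The procedure and object names (mrsItems, moData, GetItems) are illustrative assumptions, not the actual Duwamish Books code:

' Module-level cache of the item data, loaded once at startup.
Private mrsItems As ADODB.Recordset

' Called when the application launches: pull the item records into a
' client-side recordset and disconnect it so it lives in client memory.
Public Sub PreloadItems()
    Set mrsItems = moData.GetItems()              ' assumed data access call
    ' Requires mrsItems.CursorLocation = adUseClient in the DAL.
    Set mrsItems.ActiveConnection = Nothing       ' disconnect from the server
End Sub

' Each search filters the cached recordset instead of hitting the database.
Public Function SearchItems(sTitle As String) As ADODB.Recordset
    mrsItems.Filter = "Title LIKE '" & sTitle & "*'"
    Set SearchItems = mrsItems
End Function

Disconnecting the recordset releases the server connection while the ADO client cursor engine keeps the rows available for filtering and finding locally.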
Business logic can be divided into two distinct categories. Processes that are generic and support the application's general data handling requirements make up the business logic. Processes that group functions together into specific business operations make up the workflow. An example of business logic is the GetSuppliers method in the Duwamish Books Business Logic Layer (DBBLL). This generic method is used in the workflow method GetDomainObject, with the argument eDomains = icDOMAIN_SUPPLIERS, to return a specific supplier. The following code sample, taken from the workflow class, illustrates this:
Public Function GetDomainObject(ByVal eDomains As DOMAINS, _
                                oGenericObject As Object, _
                                PKId As Long, _
                                Optional StoreId As Long = 0, _
                                Optional ItemID As Long = 0) As Boolean
    Dim iCount As Integer

    GetDomainObject = False
    Select Case eDomains
        .
        .
        .
        Case icDOMAIN_SUPPLIERS
            Set oGenericObject = New cSupplier
            If Suppliers.RecordCount Then
                Suppliers.MoveFirst
                Suppliers.Find scFIND_PKID & PKId
                If Not Suppliers.EOF Then
                    With oGenericObject
                        If Not IsNull(Suppliers!PKId) Then .PKId = Suppliers!PKId
                        If Not IsNull(Suppliers!Name) Then .Name = Suppliers!Name
                        If Not IsNull(Suppliers!Address1) Then .Address1 = Suppliers!Address1
                        If Not IsNull(Suppliers!Address2) Then .Address2 = Suppliers!Address2
                        If Not IsNull(Suppliers!City) Then .City = Suppliers!City
                        If Not IsNull(Suppliers!ContactPerson) Then .ContactPerson = Suppliers!ContactPerson
                        If Not IsNull(Suppliers!PhoneNumber) Then .PhoneNumber = Suppliers!PhoneNumber
                        If Not IsNull(Suppliers!State) Then .State = Suppliers!State
                        If Not IsNull(Suppliers!ZipCode) Then .ZipCode = Suppliers!ZipCode
                    End With
                    GetDomainObject = True
                End If
            End If
        .
        .
        .
    End Select
End Function
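A client-side call into this workflow method might look like the following. The variable names and the supplier key value are illustrative only:

Dim oSupplier As Object

' Fetch a supplier through the workflow layer rather than querying
' the database directly from the UI code. 42 is a sample key value.
If moWorkFlow.GetDomainObject(icDOMAIN_SUPPLIERS, oSupplier, 42) Then
    Debug.Print oSupplier.Name & ", " & oSupplier.City
End If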
As you look at the diagram in Robert Coleridge's article "An Introduction to the Duwamish Books Sample," you will notice that one of the goals in Phase 3 is to make the workflow separate and distinct, but still part of the client application. In the client applications we separated some of the distinct workflow-related procedures into a class module. This class will likely serve in Phase 4 as the basis for the separated workflow component.
In each Phase 3 application we added a code module, which we call the presentation module, that serves as a mediator between the workflow layer and the user interface (UI). This module contains the procedures that connect the controls displayed on forms to the workflow layer. For example, the DisplaySupplier method in the Shipping and Receiving application fills the text boxes on the form named frmMain from the object returned by GetDomainObject, in the workflow class, with the argument eDomains = icDOMAIN_SUPPLIERS.
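The DisplaySupplier procedure itself is not reproduced in this article; a simplified sketch might look like the following. The text box control names are assumptions for illustration:

' Presentation-module procedure: moves supplier data from the workflow
' layer into the text boxes on frmMain. Control names are illustrative.
Public Sub DisplaySupplier(PKId As Long)
    Dim oSupplier As Object

    If moWorkFlow.GetDomainObject(icDOMAIN_SUPPLIERS, oSupplier, PKId) Then
        frmMain.txtName.Text = oSupplier.Name
        frmMain.txtAddress1.Text = oSupplier.Address1
        frmMain.txtCity.Text = oSupplier.City
        frmMain.txtPhone.Text = oSupplier.PhoneNumber
    End If
End Sub

Keeping this translation in a presentation module means the form code never touches the workflow or data layers directly.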
In Phase 3 we use Microsoft ActiveX® Data Objects (ADO) 2.0, which wasn't previously available. A new feature of ADO is the hierarchical recordset provided by the Shape language built into the ADO client cursor engine. In the Shipping and Receiving application, a field of the "shaped" recordset rsOrders contains a child recordset that is used to display the items associated with a particular order. The Set rsDetails = rsOrders("Details").Value statement in the following code segment illustrates the use of the child recordset:
Public Sub displayDetails(PKId As Long, oForm As Form, iMode As Integer)
    Dim iItem As ListItem
    Dim bBook As Boolean
    Dim oItem As Object
    Dim rsOrders As ADODB.Recordset
    Dim rsDetails As ADODB.Recordset
    .
    .
    .
    If moWorkFlow.GetOpenOrders(rsOrders, iMode, StoreId, True, PKId) Then
        If Not rsOrders.EOF Then
            .
            .
            .
            Set rsDetails = rsOrders("Details").Value
            If rsDetails.RecordCount Then
                rsDetails.MoveFirst
                While Not rsDetails.EOF
                    If GetItem(rsDetails!ItemID, bBook, oItem) Then
                        Set iItem = oForm.lstPODetail.ListItems.Add(, , oItem.PKId)
                        If bBook Then
                            iItem.SubItems(1) = oItem.Title
                        Else
                            iItem.SubItems(1) = oItem.Description
                        End If
                        iItem.SubItems(2) = rsDetails!Qty_Ordered
                        If Not IsNull(rsDetails!Qty_Received) Then
                            iItem.SubItems(3) = rsDetails!Qty_Received
                        End If
                        If Not IsNull(rsDetails!Notes) Then
                            iItem.SubItems(5) = rsDetails!Notes
                        End If
                        iItem.SubItems(4) = 0
                        iItem.SubItems(6) = rsDetails!PKId
                    End If
                    rsDetails.MoveNext
                Wend
            End If
        End If
    End If
    .
    .
    .
    Set rsDetails = Nothing
    Set rsOrders = Nothing
End Sub
Separating the procedures from the form module and placing them in a code module is good practice. All of the procedures in a form are loaded into memory when the form is loaded at run time, whereas Microsoft Visual Basic® loads a code module into memory only when one of its procedures is first called.
As we continued down this path toward component distribution and scalability, we made the applications more modular in design. In Phase 1, all of the code was contained in each individual application. In Phase 2 the data access code was removed from the applications and placed in a COM component. In Phase 3 more code was removed and placed in a new COM component. The net result is client code that is more structured and therefore easier to maintain. External components also simplify maintenance: a minor change that affects all of the applications can be made once in the component rather than separately in each application.