On any but the most trivial Visual Basic project, some form of version control must be in place. We'll argue that the larger and more complex the Visual Basic projects you undertake, and the larger the teams you have working on the projects, the larger the benefits from implementing an automated version control system. Based on our experience, we'd say that version control is one of the most under-invested areas of Visual Basic development: it's far from the most inherently interesting, and it generally doesn't get the attention it deserves. There are now a number of excellent version control options specifically suited to the Visual Basic environment; adopting and rigorously implementing one of them could provide one of the largest time savings available. In this section, we'll focus on the unique features of the Visual Basic project environment that dictate the type of automated version control solution you should implement.
First, in order to run and test a Visual Basic application, every developer effectively needs every file contained in the .MAK. So version control systems that work with a strict file-at-a-time check-in/check-out metaphor are not well suited to Visual Basic projects. Rather, Visual Basic requires version control that keeps all files within a project synchronized both in a shared project directory and in each developer's personal work directory.
Next, in a Visual Basic project it's often impractical to assign each file in the project to a single owning developer. Given this, it's important to use version control software that provides a compare/merge capability, complete historical tracking of all project files, and the ability to roll back to any previously saved version.
Let's examine some common scenarios that illustrate the need for version control and point out some of the features you'll want to look for in deciding on what product to use.
Even if you aren't working in a team environment, you could still benefit from good version control software. For example, how many times have you worked for a number of hours on a new feature, only to find that it would take more time to fix the old features you broke in the process than it would simply to start from scratch and do the new feature right? A good version control package solves this problem easily: simply roll all changes back to where you were before you started on the new feature. Many of the packages support complete commented history of all files making up a Visual Basic project, with rollbacks available for any combination of files or the entire project.
Things get considerably more complicated in the team development world. As we mentioned above, Visual Basic development doesn't always allow a strict assignment of ownership at the file level. For example, you might need to have two developers making changes at the same time to different functions within one .BAS file. In Visual Basic, a model that works well is to have developers work locally (in their own directories) with their copy of the project, and store the project repository in some shared directory on the network. Then developers can work locally as they normally would, and the version control software tracks any differences between each developer's local version and the repository version of the project. For the shared-file approach to work, then, the software must support a compare/merge capability, so that changes made simultaneously to the same files can be compared, side by side, as developers attempt to update the repository version of the project.
To test and ultimately debug a project, it's helpful to have clearly defined versioned releases (Alpha 1.0, Alpha 1.1, Beta 1.0, and so on) to test. Good version control software should support this capability.
Even though typical Visual Basic project work won't involve making changes to non-Visual Basic files (such as .DLLs, .VBXs, and so on), it's important to be able to track whether changes have been made to files of this nature. For example, if your development group takes component software seriously enough to undertake internal .DLL/object development, changes might be made to those files unbeknownst to the solution builders (the Visual Basic developers). Flagging changes of this nature would be a nice-to-have capability of version control software, although not a requirement (yet).
In Figure 1, we show a schematic of a hypothetical network directory structure that supports these concepts.
Figure 1: A network version control structure
In particular, note how this directory structure supports the concept of a global repository, our scenario 5.
One of the best ways to gain efficiencies in large-project work is to develop a body of shared code. This code (variables, constants, procedures, objects, and so forth) should be shared in a directory made available on a read-only basis to all but those developers with responsibility for creating the shared code. Version control software can be used to facilitate this approach, if your software can keep a log of changes to these common objects. If such a repository log can be kept, change reports might even be automatically distributed to all Visual Basic developers in your workgroup. (You also need to implement a team and application model that supports this component builder/solution builder distinction; we'll discuss this aspect more later in the presentation.)
Once a decision is made on version control software, the learning curve to take advantage of it is typically quite short. In our experience, 2-3 days is sufficient for one version control champion to come up to speed on a package and convey its benefits to others in the workgroup. In all but the most trivial projects, automated version control is generally an unmitigated win, and you should make the investment — it will pay back quite quickly. Even though some version control systems seem quite expensive (say, $1,000 per developer), developer time is even more expensive. A 5-developer group that averages $100/hour would need 50 saved hours to pay back a $5,000 investment. Once up the learning curve, we've seen at least that much time saved over the course of a couple of months.
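The break-even arithmetic is easy to adapt to your own numbers. Here's a throwaway sketch (the figures are simply the ones from the example above):

```vb
' Hours of saved developer time needed to pay back the tool purchase.
Sub VersionControlPayback ()
    Dim nDevs As Integer
    Dim curPerSeat As Currency, curRate As Currency
    nDevs = 5           ' developers in the group
    curPerSeat = 1000   ' cost of the tool per developer
    curRate = 100       ' average hourly cost of a developer
    Debug.Print (nDevs * curPerSeat) / curRate   ' prints 50 (hours)
End Sub
```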
Scenario 5 raises a point important enough in Visual Basic work to deserve a book, let alone a paragraph or two! Since Visual Basic itself can't be used to create DLLs, you have to work a little harder to reap the advantages of common objects and shared code. (Here we're referring to anything that can be re-used across projects — global constants and variables, functions and subs and so on; for brevity's sake, we'll refer to these as common objects.)
Here's an approach we take that has worked well for us.
One Visual Basic project (call it GLOBALS.MAK) exists on a shared network directory. This contains all common objects created by our internal component builders. Depending on your team and network structure, you might want to use security to enforce read-only access by all others; we do not. Individual .BAS and .FRM files exist within this project to contain related procedures or declarations. Here are some examples:
Table 2: Common "objects" in GLOBALS.MAK
GLOBALS.BAS: Generic global procedures (e.g., movers, array sorts) and generic variable and constant declarations (e.g., the contents of CONSTANT.TXT)
DATAVBSQL.BAS: Generic procedures for conversing with Microsoft SQL Server™ via VBSQL
DATAODBC.BAS: Generic procedures for conversing with SQL databases via ODBC
DATADAO.BAS: Generic procedures for conversing with databases via DAOs
WINAPI.BAS: Windows API procedure and constant declarations
MDIPRNT.FRM: Template form for an MDI parent
MDICHILD.FRM: Template form for an MDI child
CONTROLS.FRM: Form containing "typical" controls (command buttons, etc.)
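To make this concrete, here is a minimal sketch of what a shared module like GLOBALS.BAS might contain. The constant and routine names are hypothetical illustrations, not the actual contents of any particular project:

```vb
' GLOBALS.BAS (hypothetical excerpt) -- shared declarations and a
' generic "mover" routine of the kind a component builder might supply.

Global Const SHARED_LIB_VERSION = "1.02"    ' illustrative constant

' Copy the contents of one string array into another.  Both arrays
' must already be dimensioned to the same bounds.
Sub ArrayCopy (sSource() As String, sDest() As String)
    Dim i As Integer
    For i = LBound(sSource) To UBound(sSource)
        sDest(i) = sSource(i)
    Next i
End Sub
```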
GLOBALS.MAK itself is maintained with version control. The version control package we use supports a view showing historical, commented changes to files within a project; our component builders use this to generate reports on substantive changes. These reports are then distributed to solution builders, who are responsible for deciding whether to include the latest changes or not into their projects.
When a solution builder begins a new project, these files from GLOBALS.MAK are usually copied into his or her project directory, rather than loaded directly from the shared directory containing GLOBALS.MAK. This introduces an element of redundancy, but is normally justified by the greater ease with which developers can work on their projects (off the network, out of the office, and so forth). Remember that each project is version controlled, so that a history of changes is available and rollback is possible.
In turn, each project gets its own .BAS file (e.g., a project for Acme Corp. might get ACME.BAS) in which globals useful for that project are stored. In this way, solution builders have the freedom to create their own common objects, and at the conclusion of the project, or at breakpoints along the way, solution and component builders discuss the possible graduation of new common objects into GLOBALS.BAS.
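For instance, a project-specific module might start out as nothing more than a handful of declarations and one or two helper routines. The names below are purely illustrative:

```vb
' ACME.BAS (hypothetical) -- globals specific to the Acme Corp. project.
' Routines that prove generally useful are candidates for "graduation"
' into GLOBALS.MAK at a project breakpoint.

Global Const ACME_TITLE = "Acme Customer Tracking"
Global gsAcmeServer As String       ' database server for this project

' Project-specific formatting rule: Acme invoice numbers are always
' six digits, left-padded with zeros.
Function AcmeInvoiceNo (lInvoice As Long) As String
    AcmeInvoiceNo = Format$(lInvoice, "000000")
End Function
```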
When it comes to testing your applications, the interesting questions aren't whether to test or who should do the testing. The answers to those questions should be non-controversial: you need project team members other than the coders to have primary responsibility for writing and performing a complete system test. Here are two considerably more interesting questions that warrant our attention:
Given an understanding of what's involved in writing a test plan, the next question is:
Here's one way to hierarchically organize an entire test for an application or project.
Test Suite: Everything (all test cases covering all aspects of a project)
    Functional Area 1: a set of related test cases that test one specific area of a product's functionality
        Case 1: a set of scenarios that review all aspects of a single feature of an application
            Scenario 1: one specific instance of a single product feature
            ...
            Scenario N
        ...
        Case N
    ...
    Functional Area N
For example, designing a test suite for the Customer application would involve defining every distinct functional area of the application, and defining what would be involved in a test of each of those functional areas. Since defining an application's functional areas clearly needs to be performed as part of its functional specification (see below), there's a strong argument here for getting the test team involved in the project's functional specification phase. (Again, we'll discuss this more later in the presentation.)
One convenient approach is to define a functional area as the set of commands available from one menu. In the Customer application, we might define one functional area as all of the commands available from the Customer form's Records menu. We show this menu in Figure 2.
Figure 2: Mapping a functional area to a set of menu commands
Next, define a test case as a group of related scenarios to test all aspects of a single feature, such as the Delete command shown above. Try to define 5-10 test scenarios that make up the test case, such as:
Tip: Extreme conditions and boundary analysis. Remember to test how the system behaves when used incorrectly as well as correctly. In our customer example, these extreme conditions might include:
Deleting a customer you should not be allowed to delete
Entering a customer with a duplicate key
Entering a customer with an invalid key
A test plan is a written version of the entire test suite for an application. Done correctly, the test plan is a complete description of all of the testing that would need to be done, and what constitutes the success of any particular test. Test plans should be written as if they were to provide directions on how to test an application to someone other than the author — the primary criterion for a successful test plan is whether someone else could pick it up halfway through the project and continue testing.
Everything we've described up till now could be performed with no automated test software. In fact, most custom applications are tested by a combination of the following methods (assuming a test plan is in place):
Automated testing software allows the automation of an entire test script, which can then be run, in whole or in part, without human intervention. Most automated testing software programs include a recorder to capture keystrokes and generate test scripts, but most also include (and encourage the use of) a full-blown language in which to write test scripts. The benefits of automated testing are obvious:
But there are significant costs to automated testing as well:
Figure 3: Automated vs. human testing
Figure 3 is based on a Microsoft white paper, and stresses the up-front test development investment that gets paid back over the course of a number of tests of a large project with a relatively stable user interface. Clearly, the larger the project and the more extensive the testing that could be performed by automated methods, the greater the payback.
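To give a flavor of the test-development investment involved, here is a hypothetical script fragment in a BASIC-style test language. The routine names (MenuSelect, WindowExists, and so on) are illustrative stand-ins, not the API of any particular testing package:

```vb
' Hypothetical automated test scenario: invoke the Delete command and
' verify that the confirmation dialog appears.  All routine names are
' invented for illustration.
Sub TestDeleteCustomer ()
    MenuSelect "Records", "Delete"              ' drive the menu
    If Not WindowExists("Confirm Delete") Then  ' check expected state
        LogFailure "Delete: no confirmation dialog shown"
    Else
        ButtonClick "OK"
        LogSuccess "Delete: confirmation dialog shown"
    End If
End Sub
```

Notice that even this trivial scenario has to encode an expected outcome and a way to log the result; multiplying that effort across an entire test suite is where the up-front cost in Figure 3 comes from.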
Recommendation: Implementing Automated Testing Software
Whereas implementing version control software is almost always a clear win, we have found the range of projects that don't benefit from automated testing to be much larger. Without claiming any sort of precision for the estimate, our guess is that, if you do not already have personnel skilled in automated testing software, projects would have to be in the thousand-hour-plus range to recoup an investment in automated testing. Even though many automated testing packages use some variant of the BASIC language, there is still a steep learning curve to master the art of programming effective testware. Clearly, if the required time is in the range of hundreds of hours, this lessens the incentive to implement automated testing on smaller projects.
With the increasing modelessness and user-customizing capabilities of state-of-the-art GUI applications, it's becoming impossible, in a literal sense, to completely test an application. That is, it's not possible to reproduce in a test every separate activity a user might perform when using the software. For example, suppose you created an MDI application with 10 (modeless) child windows and 5 (modal) report dialogs with 3 distinct reports each. Since it might matter how you get to each individual report, to fully test this scenario you'd have to test at least 54,432,000 (10!*5*3) different ways to generate a report — clearly a workout for even the best testing software. And if you tried to test for different things that could be going on in the MDI child windows when the report was generated, you'd really have a lot of testing to do! Besides just throwing up your hands at the impossibility of testing a GUI application, there are several points suggested by examples like this:
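The 54,432,000 figure is easy to verify with a throwaway calculation: 10! orderings of the child windows, times 5 dialogs, times 3 reports each.

```vb
' Back-of-the-envelope check of the report-path count quoted above.
Function Factorial (n As Integer) As Long
    Dim i As Integer, f As Long
    f = 1
    For i = 2 To n
        f = f * i
    Next i
    Factorial = f
End Function

Sub CountReportPaths ()
    Debug.Print Factorial(10) * 5 * 3   ' prints 54432000
End Sub
```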