VS 6.0 Benchmarks: New Features Don't Impact Speed

Visual Studio adds a host of new features to each of its tools while maintaining comparable performance

by Ash Rofail and Yasser Shohoud

Reprinted with permission from Visual Basic Programmer's Journal, Dec 1998, Volume 8, Issue 14, Copyright 1998, Fawcette Technical Publications, Palo Alto, CA, USA. To subscribe, call 1-800-848-5523, 650-833-7100, visit www.vbpj.com, or visit The Development Exchange.

VBPJ's benchmarks for Visual Studio 6.0 (VS6) reveal much more than the general performance of the tools that comprise it; they reveal where Microsoft is investing its resources, and whether Microsoft is translating its promises into action. For example, the benchmarks show that ActiveX server component performance increased an average of 40 percent throughout the tools in the suite—an indication that Microsoft is following through on its promises about COM.

What you need:
To view the source code, you need Visual Studio 1.0 and Visual Studio 6.0.

Notwithstanding the performance gains of ActiveX server components, the tools that comprise Visual Studio don't make significant performance gains over their Visual Studio 1.0 versions. For example, Visual Basic (VB) and Visual J++ (VJ++) both post slower scores in several categories of benchmarks, and Visual C++ 6.0 (VC6) posts scores largely comparable to those of VC5 (see Figures 1 and 2). This is not necessarily a negative because all the tools add an impressive array of significant new technologies and features. Incorporating this new functionality without incurring a significant performance hit is no small feat. Some might question the quality or inclusion of some of these technologies and features, but that's a different issue, and probably worth an article by itself.

The benchmarks in this issue serve three purposes. First, looking carefully at the scores—whether they go up or down—can give you a good indication of what has changed in the tool. A score that goes down isn't necessarily a bad thing; it might mean you're trading a small amount of speed for significantly enhanced capabilities. Second, the benchmarks can serve as a guide for choosing the right tool for a given task. The relative difference in speed versus ease-of-use or other capabilities is often an important factor in choosing a tool. Third, the benchmarks can help you gauge how well a given tool is coming along. For example, the fact that VJ++ packs such an extraordinary number of new features while maintaining its performance shows Microsoft's commitment to the tool.

Similarly, running benchmarks against an entire suite of tools provides a good indication of Microsoft's development tools direction as a whole. Visual Studio 1.0 was about performance [Ward Hitt, David Cooke, and Stephen R. Davis, "Benchmarking VB5 vs. VC++, Java, and VB3/4," VBPJ May 1997 and Phil Weber, "1997 Database Benchmarks," VBPJ May 1997]. Simply recompiling your C++ app in VC++ 5.0 resulted in a significant speed boost, and VB5 picked up a long-wished-for compiler.

Visual Studio 6.0, on the other hand, is entirely about features—especially Internet and database features. Its goal is to help developers do their jobs better and more easily. As VB's success demonstrates, enhanced productivity can prove more important than raw speed in the developer's everyday life. Visual Studio incorporates ActiveX Data Objects (ADO) and various Internet technologies into its tools, including WebClasses for VB and an improved version of Active Template Library (ATL) for VC++.

Keep this in mind as you look at the results: The numbers themselves show only how different tools compare to one another. For example, the benchmarks indicate VJ++ 6.0 is 15 percent slower than VB6 at math, but in real-world terms, this isn't enough to make a practical difference. Similarly, VC++ performed the math suite about 33 percent faster than VB, but chances are this result isn't going to affect which tool you choose to write your math classes. Rather, you will ask yourself whether you need to perform complex math operations that require high-precision factors or basic accounting calculations. The answer to this question usually influences your tool decisions far more than raw speed.

 
Figure 3 DAO Isn't Dead Yet

Set Up the Tests
The tests fall into one of two categories. The first category focuses on nondatabase operations across most of the Visual Studio tools: VB5, VB6, VC++ 5.0, VC++ 6.0, VJ++ 1.0, and VJ++ 6.0. The second category targets a variety of data access methods in VB5 and VB6 (see Figure 3). The database access benchmark tests use only VB5 and VB6 because the objective is to measure data access technologies, rather than the speed of the tools themselves. The database benchmarks are further divided into two categories: local database access and remote database access.

The test categories for the VS benchmarks include math operation; string operation and manipulation; graphics operation (two native, one API); standard form-load operation with intrinsic control types; loading ActiveX control forms and Explorer-style forms; and ActiveX Server invocation tests. The Visual Studio benchmarks include two additional tests. The first measures string operation in VC++ 5 and VC++ 6. Visual C++ now lets you perform string functions using Microsoft Foundation Classes (MFC) CString classes or C strings, and the performance difference between the two approaches might influence which one you choose. The second additional test covers a new type of form in VJ++ 6.0, the Windows Foundation Classes (WFC) form, and measures how WFC forms compare to other types of forms in the Visual Studio family.

Let's take a closer look at the tests themselves. The two versions of VC++ returned the fastest times in the math operations test, followed by both versions of VJ++, then VB5 and VB6. VB6 turned in the slowest time, about 33 percent off VC++ 6.0's time. You can get a good sense of the methods used to conduct the tests by looking at the individual test beds (see the sidebar, "How We Conducted the Tests"). The math test assigns values to variables and uses these variables to solve a quadratic equation: (2.31x^2 + 3.67x - 150.901399 = 0). The test then gets a log value and performs trigonometric functions. The operation executes 10,000 times, and the benchmark calculates the average time of 10 tests.
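The structure of the math test described above can be sketched as follows. This is an illustrative Python rendering of the loop, not the actual VB/VC++/VJ++ benchmark code; the coefficients, iteration count, and averaging scheme follow the description, but the specific log and trigonometric operations applied to the roots are assumptions for the sake of a runnable example.

```python
import math
import time

def quadratic_roots(a, b, c):
    """Solve ax^2 + bx + c = 0 via the quadratic formula (real roots assumed)."""
    d = math.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

def math_test():
    """One pass of the math suite: quadratic, a log value, trig functions."""
    for _ in range(10_000):
        x1, x2 = quadratic_roots(2.31, 3.67, -150.901399)
        y = math.log(abs(x1))                      # get a log value
        z = math.sin(x1) + math.cos(x2) + math.tan(y)  # trigonometric functions
    return x1, x2

def benchmark(fn, runs=10):
    """Average wall-clock time of `fn` over `runs` executions."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return sum(times) / runs
```

As in the article's methodology, the reported score would be `benchmark(math_test)`: the average of 10 timed executions of the 10,000-iteration loop.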

Similarly, VC++'s null-terminated strings turned in the fastest times on the string test, although its MFC string classes recorded noticeably slower times. The string test assigns values to strings, concatenates them, compares a pair of strings, and searches for a substring within a string. The string test performs this routine 10,000 times, and the result is the average of 10 tests. Visual Basic's times were considerably slower on this test; VC++'s null-terminated strings performed the test more than five times faster than VB6, which was marginally slower than VB5. The MFC string classes were twice as fast as VB6. Interestingly, VJ++ 1.1 performed 10 percent more slowly than VB6, but VJ++ 6.0 was 35 percent slower than VB6.
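The string suite's per-iteration routine can be sketched like this; again, a Python rendering for illustration only (the actual tests exercised VB strings, MFC CString, and C null-terminated strings), with the literal values chosen arbitrarily here:

```python
def string_test():
    """One pass of the string suite: assign, concatenate, compare, search."""
    for _ in range(10_000):
        a = "Visual Studio"        # assignment
        b = "6.0 benchmarks"       # assignment
        c = a + " " + b            # concatenation
        equal = (a == b)           # comparison of a pair of strings
        pos = c.find("bench")      # substring search
    return c, equal, pos
```

In the C null-terminated variant, the same four steps map to `strcpy`, `strcat`, `strcmp`, and `strstr`; the MFC variant uses `CString` assignment, `operator+`, `Compare`, and `Find`.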

Did VB6's Graphics Slow Down?
The API graphics test brought results you might expect. VC++ edged out VB, but none of the tools showed much difference in performance compared to its previous version. The native graphics tests initially produced some surprising results, however. Visual Basic's times were crunched by VC++. More importantly, VB6 returned results more than 3.5 times slower than those of VB5. This result seemed strange, so we double-checked the code, which uses the PSet method to draw pictures. We ran the tests multiple times, coming up with consistent results. Ward Hitt, who oversaw and conducted the bulk of the benchmarking tests for Visual Studio 1.0, served as technical reviewer for these tests. He also ran the native graphics test and garnered the same results. So, he went a step further and tested the Line method. This time VB6 posted results similar to those of VB5. The native graphics slowdown appears to occur only with the PSet method. Ward's results prompted VBPJ to rerun its native graphics test using the Line method; VB6's performance on this test is comparable to that of VB5.

Line and PSet are two different methods for drawing on an object. Most experienced VB developers bypass the PSet method entirely, going straight to the API to create this kind of functionality. This might be one reason the slowdown slipped through. The VB6 beta testers probably didn't use the PSet method. You might want to consider using the API to create this kind of functionality if you need it. You can also find the source for the benchmarking tests on The Development Exchange; the API graphics test uses the SetPixel API call to draw points. If you intend to use the PSet method, however, you should weigh this speed difference carefully. The reason PSet slowed down in VB6 is unclear, but interesting. VBPJ will examine this anomaly in more depth in a future article.

The form-loading test results weren't as unexpected as the graphics test results, but VJ++'s showing was less than spectacular. These tests evaluate three kinds of forms you can load: intrinsic control forms, ActiveX control forms, and VJ++'s new WFC control forms. It might not be practical to use only intrinsic controls in your UI, but the tests show that this approach gives you the fastest-loading forms. Visual C++ again rang up the fastest times; VC6 performed the tests about 88 percent faster than VB. Visual Basic 6.0 turned in slightly faster times than VB5, but VJ++ performed the tests about 20 times more slowly than VB6. Take a careful look at these numbers if you are considering developing your application UI in VJ++. Even VJ++ 6.0's new WFC forms are eight times slower than VB6's forms.

Note: Evaluate this test result carefully. Is a form loading 20 times more slowly something that should discourage you from using VJ++ to build a UI, or is its real-world loading time of 0.37 seconds simply not worth bothering about? As with most tests, the significance of this result depends on the kinds of apps you build. In many cases, 0.37 seconds the first time you load a form is inconsequential; in other cases, it might make the difference between an app that seems acceptably fast and one that seems intolerably slow. Once upon a time, all tools seemed more or less equal, and performance was the critical, deciding factor. Now, each tool has a specific vision, and you need to evaluate tools primarily on the technology and functionality they provide rather than raw performance.

Of the remaining form types, the only result of note is that VB posted the highest score for loading ActiveX control forms—another good sign that Microsoft is focusing on ActiveX and COM.

DB Tests Offer Countless Options
Database benchmarks prove to be among the hardest to perform because you can never present an ultimate test that covers all the possible combinations of settings and operations. If you work with SQL Server, you can get different performance results based on the type of operation you perform. These results can vary, depending on whether you perform an insert, update, delete, or even a select. The combination of different types of data access techniques gives you an almost unlimited number of choices and possible performance levels.

We attempted to keep things simple here, relying on all the default settings. We also followed the methodology that the tools should be measured the way they come out of the box, and that any optimization should be applied to all technologies equally. We realize in some cases these optimization options might not be available in every methodology, but that's part of what differentiates the methodologies from one another. In practice, the flexibility of the engine and methodology should drive your choice of data access, rather than the performance numbers alone. You should also invest in the engine and methodology that will scale better and stay around for a while (ADO, we hope). Finally, you should conduct your tests with a variety of different operations, such as bulk updates, inserts, and deletes.

The database tests comprise two categories: local access and remote access. We limited the tests to VB only because we wanted to measure the raw performance of the data access methods, rather than the speed of the individual tools. The test suite for VB5 and VB6 includes tests for a variety of data access methods (see Table 1).

We performed the local data access tests using an 8,181-record Microsoft Access database provided by Roger Jennings. We performed the remote tests using a Microsoft SQL Server system table with 2,960 records. We could've chosen from countless possible combinations and access methods. In this case, we accepted only the default database settings, incorporating some changes to the type of cursors used. We strongly recommend trying additional configurations related to data pagination and caching, but it would be overwhelming to cover all the possible configurations and combinations in this space.

We expected Data Access Objects (DAO) to finish first when we performed the initial local database access tests, so we were surprised when Remote Data Objects (RDO) with a server-side cursor beat DAO's times. This result prompted us to investigate further, and we discovered that the DAO recordset we were opening was a dynaset (the default), while ADO and RDO opened ForwardOnly cursors. In other words, the tests compared apples and oranges. So we went back and made the necessary adjustment, changing the DAO recordset type from dynaset to ForwardOnly. This time, DAO recorded the fastest times. DAO is probably faster because it uses the Jet engine to retrieve an MDB file directly; RDO and ADO go through the ODBC driver, which in turn uses the Jet engine. RDO and ADO must go through an extra layer to retrieve the data.

The database access tests we performed open a recordset using a simple select query and traverse the recordset (in one direction) using a field to populate a listbox. This is by no means the only way to benchmark a data access method, but it does give you a good sense of the performance of the most commonly used operations.
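The shape of that test, opening a result set with a simple select and walking it forward while collecting one field per row, can be sketched with Python's sqlite3 standing in for the data access layer. This is an analogy only; the actual tests used DAO, RDO, and ADO against Access and SQL Server, and the table and field names here are invented for illustration.

```python
import sqlite3

def traverse_recordset(conn):
    """Open a forward-only result set and collect one field per row,
    mirroring the benchmark's 'populate a listbox' step."""
    listbox = []
    cursor = conn.execute("SELECT name FROM customers")  # simple select query
    row = cursor.fetchone()
    while row is not None:        # one-directional traversal only
        listbox.append(row[0])
        row = cursor.fetchone()
    return listbox

# Build a small in-memory stand-in database for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?)",
                 [("Ann",), ("Bob",), ("Cho",)])
```

The forward-only walk is the cheapest traversal a cursor can offer, which is why equalizing the cursor types (as described above for DAO versus RDO and ADO) mattered so much to the results.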

DAO: Not Dead Yet
DAO's victory in the local database access tests is interesting, given Microsoft's push for developers to use ADO for all types of data access (see Table 2). The results of these tests also indicate ADO still has room for improvement in terms of local connectivity.

That said, exercise caution about making broad generalizations on the basis of these tests. For example, you often see developers promoting a specific type of data access methodology based on their previous experiences. These developers are often surprised to see technologies they thought were fast and reliable perform less well than expected. RDO provides a good example of this.

RDO with server-side cursors placed second in the local database access test, but posted significantly less efficient scores in the remote access benchmarks. There is a logical explanation for this: We used the MS Access ODBC Driver 3.51 for the local access with RDO, but we used the MS SQL Server ODBC driver 3.60 for the remote access. Obviously, comparing two drivers that cannot be used interchangeably is comparing apples to oranges. Note another significant difference: When you use server-side cursors, the database engine takes charge of managing the recordset cursors. Jet is the database engine responsible for managing the recordset cursors when you use server-side cursors locally; MS SQL Server 7.0 (beta) manages the cursors remotely. Again, you are comparing apples and oranges. Take care when trying to draw conclusions from this. However, it is significant that RDO with server-side cursors didn't perform as well as other technologies you can use for remote data access. The reasons for its less efficient performance might involve the drivers and server-side engines more than RDO itself, and any change in these drivers or server-side cursors will probably cause a similar change in RDO's performance.

Note that the database benchmarks utilize select statements only. Other types of operations might have produced different results. You should treat these results only as a place to start. Also, be aware that the results you see in this article might differ from your own. So many factors affect database benchmarking that you should never accept someone else's results, including ours, at face value. Instead, use them as a starting point for conducting tests particular to your own, unique situation. Sometimes the only significant fact a benchmark points out is the need to collect more benchmarks. Before you modify the tests for this article or design your own tests, look at some benchmarks performed on Visual Studio by NSTL Inc. You can find the results of these tests on Microsoft's Web site at this URL: http://msdn.microsoft.com/vbasic/prodinfo/benchmarks/.

Regardless of how much data you collect, always remember that speed isn't everything. Which technology to use should probably depend more on factors related to the types of operation you need to perform, the database schema, and indexing strategies.


Ash Rofail is a principal engineer and architect at Best Software. He is a frequent contributor to VBPJ, and specializes in VB, C++, SQL Server, and Java. Ash is the author of Building N-Tier Applications with COM and Visual Basic 6.0 from John Wiley. Reach him at Ash_Rofail@bestsoftware.com.

Yasser Shohoud is a senior software engineer at Best Software. He is responsible for building enterprise intranet solutions using Visual Basic, Visual InterDev, Visual C++, and SQL Server. Yasser also teaches Microsoft Official Curriculum courses and is a Microsoft Certified Trainer. Reach Yasser at Yasser_Shohoud@bestsoftware.com.