This article may contain URLs that were valid when originally published, but now link to sites or pages that no longer exist. To maintain the flow of the article, we've left these URLs in the text, but disabled the links.
Ken Spencer

High-performance Visual Basic Apps
This month I want to take a look at how Scott Stanfield and the folks at Vertigo Software (http://www.vertigosoftware.com) created an application that shows how DNA applications scale up. Microsoft demonstrated Vertigo's app at TechEd this past May. For the record, the National Software Testing Laboratories tested this application, which was certified at 7,500 simultaneous users. That's not 7,500 users in a day; it's 7,500 at the same time. No matter how you figure it, that's pretty impressive!

Application Configuration

Let's take a look at this from several perspectives. First, the server configuration for any software application is important. The app shown at TechEd used four multiprocessor Compaq ProLiant systems that hosted Microsoft® Internet Information Server (IIS) and the demo's business objects. The database server was SQL Server 7.0, running on another multiprocessor Compaq. The Microsoft Windows® Load Balancing Service (WLBS) balanced users across the four application servers.

The overall hardware and server configuration were planned to scale upward as the number of users increased. If the hardware couldn't meet new demands, new servers could be added while the applications were running, and these servers could pick up a share of the load. This added great flexibility to the server farm.

Significant gains in performance can be made if your database has the proper primary keys and indexes. SQL Server 7.0 improves upon this by providing auto-tuning features that tweak the database over time. During the development cycle, the developers noticed a big performance improvement after exercising the application for two days. Exercising the application provided SQL Server with data on how the application would be used, and SQL Server then tweaked the database configuration to match the load. This feature makes it important to use the database for a period of time with normal usage patterns so that it can do all the tuning it can. Once the database has settled down with its tuning, you can go to work on the application.

Now, let's look at the application. How did they make this program perform so well? The answer is relatively simple. First, they constructed the sample application, then began to test it on a database with a realistic number of records. They created a sample database with one million records and a sample application to exercise it.
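To make the indexing point concrete: a login lookup against a million-row table is only fast if the searched column is indexed. The table and column names below are illustrative guesses, not the actual FMStocks schema:

```sql
-- Illustrative only: column names are assumptions, not the FMStocks schema.
-- A unique index on the login column lets SQL Server resolve the lookup
-- with an index seek instead of scanning a million rows.
CREATE TABLE Accounts (
    AccountID int         NOT NULL PRIMARY KEY,  -- clustered by default
    UserID    varchar(20) NOT NULL,
    Password  varchar(20) NOT NULL
)

CREATE UNIQUE INDEX IX_Accounts_UserID ON Accounts (UserID)
```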
They used testing software from RSW (http://rswsoftware.com) to test the application under load with specific numbers of users. The application was built using Visual InterDev®, ASP, HTML, JScript®, ActiveX Data Objects (ADO), Visual Basic, and Microsoft Transaction Server (MTS), which is part of Windows NT Server. The user interface employs HTML and JScript to keep performance up and to allow the application to be used with nearly any browser. The server part of the application uses ASP and Visual Basic. Visual Basic was used to create the database and middle-tier components, which were hosted by MTS. The back end uses SQL Server 7.0 stored procedures.

So far, the application's configuration is pretty standard and does not really uncover the secrets to the application's performance, but it is a good start in seeing how the application is built.

Strategies

Contrary to what some developers think, creating components with Visual Basic and hosting them in MTS is one of the best ways to create high-performance applications. The compiled components execute faster than interpreted ASP code. Even if you put the ADO code in the components, your code runs faster because in Visual Basic you can explicitly type the ADO references, which early binds the ADO objects and boosts performance.

As I mentioned, the user interface is composed of HTML and JScript. It is kept very lightweight to achieve high performance; no design-time controls (DTCs) or intensive graphics are used. The application is also broken into separate pages, with each page performing a single task, so each page downloads and displays very quickly. Figure 1 shows the login page (default.htm) and demonstrates the application's clean and easy-to-use interface.
Figure 1: Login Page for Demo App
The application also uses HTML pages wherever possible. ASP files are used only where scripting logic is required to call components or process data. This cuts down on the ASP overhead needed to process user requests for files. The script mappings for IIS were also changed by deleting all entries except the one for ASP files (see Figure 2). You can access this page from the Configuration button on a virtual directory's property page in the Internet Service Manager.
Figure 2: Setting Script Mappings
There were also numerous other IIS settings that the developers changed to tweak the performance of IIS. These changes are documented in the white papers and other documents in the Microsoft Scalability Center section of MSDN Online at http://msdn.microsoft.com/vstudio/downloads/scale/default.asp.

The Nitty Gritty

So far you have lightweight HTML pages and some ASP. Getting to this point was pretty easy; any developer could figure out how to slim down parts of the application and make it faster. It turns out that there are many more performance-related lessons to be learned from this example.

The default.htm page calls Loginaction.asp, which executes the code shown in Figure 3. This code verifies the user's login with the VerifyLogin method. If the login is successful, Response.Cookies writes the login ID to the browser as a cookie, and the user is redirected to home.htm. The Accounts table contains one million account records, so the query must be quick to process the user ID. The VerifyLogin method is in the business layer (FMStocks_Bus.vbp). It calls the VerifyUser method in the database layer (FMStocks_DB.vbp), which uses the following Account_VerifyLogin stored procedure:
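The stored procedure listing itself is not reproduced in this copy of the article; based on the description that follows, Account_VerifyLogin would have looked something like this (the column names are assumptions):

```sql
CREATE PROCEDURE Account_VerifyLogin
    @UserID   varchar(20),
    @Password varchar(20)
AS
    SELECT AccountID
    FROM   Accounts
    WHERE  UserID = @UserID AND Password = @Password
    -- The "small secret": return the row count (0 = bad login, 1 = good)
    -- rather than returning the recordset itself.
    RETURN @@ROWCOUNT
```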
This stored procedure has one small secret in it. The last line returns the number of rows in the recordset. If the return value comes back 0, the login was invalid; if the row count is 1, the login is good. The recordset returned from the Select statement is not returned to ADO.
The Visual Basic code that executes this stored procedure is from the FMStocks_DB.vbp project (Account.cls), and is shown in Figure 4. The Execute method of the Command object calls the Account_VerifyLogin stored procedure. The adExecuteNoRecords option tells ADO not to return a recordset. This option was introduced with ADO 2.0 and can drastically improve the performance of a query that expects a single result, such as the number of rows returned, a customer number, or a calculation. Recordsets are flexible, but a lot of overhead is associated with returning them. If you look at ADO objects with Visual Studio® Analyzer, you will see that marshaling occurs when you return data from ADO. The less data returned, the less marshaling happens and the faster the application executes, so not returning a recordset drastically reduces the amount of data. This option is also handy when you need to execute a stored procedure or SQL statement that performs update, delete, or insert commands and does not require a recordset. Using the adExecuteNoRecords option is a simple way to really improve the performance of your applications. See http://www.microsoft.com/Data/ImpPerf.htm for more information.

The Database.bas module in FMStocks_DB.vbp contains the general-purpose database code for the application. The GetDSN function (called to set the connection in Figure 4) returns the DSN string.
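The GetDSN listing is equally simple; a minimal sketch consistent with the description (the server, catalog, and credentials here are placeholders, not the real values):

```vb
' Hardcoded connection string: no registry lookup, no System DSN.
Public Function GetDSN() As String
    GetDSN = "Provider=SQLOLEDB;Data Source=MyServer;" & _
             "Initial Catalog=FMStocks;User ID=stocks_login;Password=secret"
End Function
```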
The DSN is hardcoded to maximize performance. Since the database objects will be instantiated over and over again, this code adds a small performance gain by hardcoding the DSN instead of placing it in the registry or using a System DSN. The application does not need to look for the DSN; it's right there. This does, of course, make the application slightly less portable, as you must change the DSN and recompile the code to move it to another server.
Database.bas also contains the following general-purpose execution functions:
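The listing for these functions did not survive in this copy; a reconstruction of the shape described in the text follows. The function name RunSPReturnRS and its body are my sketch, and the real version also built up Command parameters:

```vb
' Runs a stored procedure and returns the results as a disconnected recordset.
Public Function RunSPReturnRS(ByVal SPName As String) As ADODB.Recordset
    Dim cmd As ADODB.Command
    Dim rs As ADODB.Recordset

    Set cmd = CtxCreateObject("ADODB.Command")
    cmd.ActiveConnection = GetDSN()      ' implicit connection from the DSN string
    cmd.CommandText = SPName
    cmd.CommandType = adCmdStoredProc

    Set rs = CtxCreateObject("ADODB.Recordset")
    rs.CursorLocation = adUseClient      ' disconnected recordsets need a client cursor
    rs.Open cmd, , adOpenForwardOnly, adLockBatchOptimistic

    Set cmd.ActiveConnection = Nothing   ' drop the connection; rs lives on
    Set cmd = Nothing
    Set RunSPReturnRS = rs
End Function
```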
Now, you might be wondering why the function returns a recordset and not an array or a delimited string. During the test phase, the developers tried creating delimited strings in the procedure and returning those instead. That turned out to be slower than simply returning the disconnected recordset, probably because of string-handling performance in Visual Basic, which can be slow depending on how you use it. The code in Figure 5 shows how to generate a disconnected recordset. There are only a couple of things to be aware of when using disconnected recordsets. The first is the type of cursor you use. Disconnected recordsets require client-side cursors because the disconnected recordset is handled on the client. This property is set in Figure 5 with the following line of code:
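The line sets the recordset's CursorLocation before the recordset is opened:

```vb
rs.CursorLocation = adUseClient
```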
The next few lines of code open the recordset (with the Open method), then destroy the Connection object and the recordset's connection:
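Those lines are missing from this copy; the standard ADO pattern they describe is as follows (the variable names are assumed):

```vb
rs.Open cmd, , adOpenForwardOnly, adLockBatchOptimistic
Set cn = Nothing                    ' destroy the Connection object
Set rs.ActiveConnection = Nothing   ' disconnect the recordset
```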
The last line of the method returns the recordset as the return value of the procedure. Later, I will show how to work with this recordset in ASP.
RunSP, RunSPWithVariant, and RunSPWithString execute stored procedures, but do not return recordsets; they return either parameters or a single return value. These functions work almost exactly like the VerifyUser method shown earlier.

It is also interesting to note how the ADO objects are instantiated in this application. Each is created using the CtxCreateObject function. The Context module contains this and a few other functions that perform MTS tasks. CtxCreateObject creates objects within the MTS context for an object; if the component is not running under MTS, the code uses CreateObject instead:
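The listing is absent from this copy, but from the description that follows, CtxCreateObject would be close to this sketch:

```vb
' Create an object within the caller's MTS context when one exists;
' outside MTS, fall back to plain CreateObject.
Public Function CtxCreateObject(ByVal ProgID As String) As Object
    Dim ctx As ObjectContext
    Set ctx = GetObjectContext()
    If ctx Is Nothing Then
        Set CtxCreateObject = CreateObject(ProgID)   ' not running under MTS
    Else
        Set CtxCreateObject = ctx.CreateInstance(ProgID)
    End If
End Function
```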
This function obtains a reference to the Context object, then creates the object using CreateInstance. If the Context object is Nothing, then the code is not running in MTS and CreateObject is used normally. CtxSetAbort and CtxSetComplete are convenience functions that call the SetAbort and SetComplete methods, respectively.
Figure 6: Setting MTS Package Type
The Vertigo team discovered another tip for using MTS. When you create a new MTS package, it is set to a Server Package by default, as shown in Figure 6. This causes the objects in the package to run in the package's own mtx.exe process. If an object in a Server Package is called from an ASP application, it therefore runs in a separate process from the one hosting IIS. Changing this setting to Library Package causes the package to run inside the calling application's process, which is IIS in this case. This eliminates the need for any cross-process marshaling. The result: speed! But if the component dies, IIS will most likely die also. Beware.
Now let's look at one other aspect of this application. When a method is executed that returns a recordset, the ASP code simply treats the return value like any ADO recordset. The following code is from TickerList.asp, and demonstrates several aspects of working with the objects. The first part of the GetTickerList procedure creates several variables, then creates an instance of the object:
|
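The snippet is missing from this copy; the shape described would be something like this server-side script (the ProgID and variable names are assumptions):

```vbscript
Dim oTicker, rs
Set oTicker = Server.CreateObject("FMStocks_Bus.Ticker")
```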
Next, the code executes either the ListByTicker or the ListByCompany method:
|
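Again, the original lines are not reproduced here; the branch described would look roughly like this (SearchType and SearchText are assumed names):

```vbscript
If SearchType = "ticker" Then
    Set rs = oTicker.ListByTicker(SearchText)
Else
    Set rs = oTicker.ListByCompany(SearchText)
End If
```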
The return value is stored in the rs variable. Now, rs contains a recordset just as if it had been set using the Recordset object's Open method. You can see this in the next few lines of code. The If statement is recommended because it checks for a valid object. Without this check, any code that tries to access the recordset will fail if the recordset was not returned correctly by the method:
|
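The check itself, missing from this copy, is the standard object test:

```vbscript
If Not (rs Is Nothing) Then
    ' Work with the recordset only inside this check.
End If
```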
The next three lines use standard ADO recordset features:
|
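Those three lines are not reproduced in this copy; typical client-cursor recordset operations at this point would be along these lines (a guess at the specifics):

```vbscript
nRecords = rs.RecordCount   ' accurate with a client-side cursor
rs.MoveFirst                ' position on the first row
nFields = rs.Fields.Count   ' number of columns returned
```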
The balance of the code in this procedure (see Figure 7) accesses data in the recordset and loops through it. The code also sets each object to Nothing as soon as it has finished performing its tasks. Objects are eventually recycled when they are no longer used, but it is much more efficient to destroy an object explicitly as soon as you are finished with it, so its resources are released right away.

Conclusion
What does this application tell you about performance? It says that you should pay attention to the requirements of any application you create. If you are building an intranet application, then you should understand its user requirements. If the application requires 100 or 200 simultaneous users, then you can use many of the tips discovered during the construction of this application while still taking advantage of the easy-to-use and easy-to-maintain features of Visual InterDev 6.0 and ASP.
From the August 1999 issue of Microsoft Internet Developer.