Creating a Usage Profile for Site Capacity Planning

September 1999

Microsoft Corporation

Introduction

On a Microsoft® Site Server 3.0 Commerce Edition (SSCE) site, performance analysis and capacity planning go hand in hand. However, it is important to understand the contrasting perspectives of these two activities. Performance analysis addresses the technical perspective of the site, focusing on performance metrics such as ASP throughput and ASP latency. Capacity planning addresses the business perspective, focusing on maximum shopper capacity.

Of the two, performance analysis is the more straightforward. When looking at the performance of an SSCE site, for example, it is relatively easy to measure the maximum number of checkout transactions per second and make comparisons with other SSCE sites. Analyzing shopper capacity is a little more complex. Before you can predict maximum shopper capacity, you first need to profile shopper behavior, then use the shopper profile information in conjunction with your performance metrics to calculate capacity.

Shopper behavior varies from site to site, depending on the richness of the shopping experience (page design, site design, and response time), as well as the type of products being sold and the breadth of the product line. One site might support 1,000 shoppers, while another site with identical hardware might only support 200 shoppers due to the differences in shopper behavior.

How do you profile shopper behavior on your SSCE site? By providing answers to questions that identify which shopper operations are being performed and how often they are performed over a given period of time. For example, a usage profile is designed to answer the following types of questions:

  - Which shopper operations are performed on the site?
  - How often is each operation performed?
  - How long does a typical shopper session last?
  - How many operations does a shopper perform during a session?

Creating a usage profile is the first step in determining shopper capacity for an SSCE site. The key components of a usage profile are:

  - A list of the shopper operations performed on the site
  - The frequency of each operation, as a percentage of all operations
  - The average length of a shopper session
  - The average number of operations performed during a session

Once you have completed your usage profile, you can use Transaction Cost Analysis (TCA) methodology to determine how a single shopper session impacts your computer resources. You can then project how many shopper sessions you can run on your site before your computer resources are operating at capacity. 

The first step in creating a usage profile is to construct usage profile reports from which to do your analysis. This document describes how to:

  - Create usage profile reports from IIS log files, using the Usage Import and Report Writer programs
  - Analyze the report data to create a usage profile for your site

For a more detailed explanation of how to use the TCA methodology on an SSCE site, see "Microsoft Site Server 3.0 Commerce Server Capacity and Performance Analysis" and "Using Transaction Cost Analysis for Site Capacity Planning," both of which are located in the Site Server Commerce Edition 3.0 Resource Kit at: http://www.microsoft.com/siteserver/commerce/.

Creating Usage Profile Reports

You create Usage Profile Reports from Microsoft® Internet Information Server (IIS) logs, using the Usage Import and Report Writer programs provided with Site Server 3.0 Commerce Edition.

Note   When you install Site Server, be sure to select Microsoft® SQL Server™ as your database, and not Microsoft® Access.

To create Usage Profile Reports:

  1. Create the analysis database.

  2. Import the log files.

  3. Create the analysis reports.

Creating the Analysis Database

This section describes how to:

  - Calculate the database size
  - Create the database
  - Initialize the database

Calculating Database Size

The first step in creating an analysis database is to calculate its size. Database size is based on the total number of log files you want to store. In general, you should store at least one week’s worth of log file data to provide a good data sample. If you have multiple Web servers generating logs, you should store the logs from all servers so you can reconcile sessions that span servers.

To calculate your database size

  1. Determine what types of reports you plan to run: by month, by quarter, or by some other time period.

  2. Estimate the size of the logs you plan to import during the specified period.

  3. Double that value to obtain an estimate of the number of bytes you need for database tables, logs, and indexes. The sketch following these steps illustrates the arithmetic.
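The sizing arithmetic is simple enough to capture in a few lines. The following Python sketch illustrates the estimate for a hypothetical site; all of the input values are illustrative assumptions, so substitute the log sizes you observe on your own servers.

# Rough database sizing sketch. All input values are illustrative
# assumptions, not measurements from a real site.
daily_log_mb = 150      # average size of one server's daily IIS log, in MB
servers = 3             # number of Web servers generating logs
days = 31               # reporting period (one month, in this example)

raw_log_mb = daily_log_mb * servers * days

# Double the raw log volume to cover database tables, logs, and indexes.
database_mb = raw_log_mb * 2

print(f"Raw log data for the period:      {raw_log_mb:,} MB")
print(f"Estimated analysis database size: {database_mb:,} MB")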

For a more thorough explanation of the calculations used to determine the size of your database, see "How to Estimate the Database Size for Site Server Analysis," located at http://support.microsoft.com/support/kb/articles/q216/5/59.asp.

Creating the Database

Use Microsoft SQL Server Enterprise Manager and your calculated database size to create the database and log device in which to store your analysis data. The Database Setup Wizard can do this for you. However, you will get better results by doing it manually.

Note   If you used the values in the Site Server installation manual to create your database, delete that database and recreate it using your specific size calculations.

Initializing the Database

Once you have created your database, use the Site Server Database Setup Wizard to initialize your database for use with the Usage Import and Report Writer programs. You can’t import logs until this is done.

To run the Site Server Database Setup Wizard

  1. On the Start menu, select Programs / Microsoft Site Server / Analysis / Database Setup. The Database Setup Wizard screen appears.

  2. Click Next. The Choose a Setup Mode screen appears.

  3. Click Update an existing database, and click Next. The SQL Server Login screen appears. Figure 1 illustrates the SQL Server Login screen with sample entries.

    Figure 1: SQL Server Login Screen

  4. Enter your login information, and click Connect. The Confirm Your Database Setup screen appears.

  5. If the information shown is correct, click Finish to update your database with the schema necessary to run the analysis utilities. If the information is not correct, repeat Steps 1-4 until it is correct.

Importing Log Files

When you have created your analysis database, you can start importing log files and other data. When you do this, it is a good idea to copy the log files to a dedicated server so that you don’t affect the production server’s performance or network traffic. You can copy the files manually, of course, but it is more efficient to use File Transfer Protocol (FTP) to copy the files. You can use the Scheduler utility from the Usage Import program to schedule the file transfers using FTP, then import the files and run the desired reports.
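As a sketch of the copy step, the following Python example pulls one log file from a Web server over FTP. The host name, credentials, and paths are hypothetical placeholders; in practice you would schedule the equivalent transfer with the Scheduler as described above, or run a script like this one as a scheduled task.

from ftplib import FTP

# Minimal sketch: pull one day's IIS log from a production Web server to
# the dedicated analysis server. The host, credentials, and paths below
# are hypothetical placeholders.
def fetch_log(host, user, password, remote_name, local_path):
    ftp = FTP(host)
    ftp.login(user, password)
    with open(local_path, "wb") as f:
        ftp.retrbinary("RETR " + remote_name, f.write)
    ftp.quit()

fetch_log("webserver1", "analysis", "password",
          "ex990602.log", r"c:\logs\webserver1\ex990602.log")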

To import log files

  1. On the Analysis menu, click Usage Import. When you run the Usage Import program for the first time, it prompts you for a database connection.

  2. Enter the connection information for the analysis database. (Once you start running the Usage Import program, you can point to a different database or view the current database by selecting Open Database from the File menu.)

Creating a New Log Data Source

To view a set of logs from a new source (Web site), you must first create a new Log Data Source, and set up a server and a site.

To create a new Log Data Source

  1. On the View menu, click Service Manager.

  2. On the Edit menu, click New Log Data Source. The Log Data Source Properties dialog box shown in Figure 2 appears.

    Figure 2: The Log Data Source Properties Dialog Box

  3. Click Auto Detect, even if you know the log type, and click OK. The Properties dialog box for the default server appears.

  4. Set the properties as follows, and click OK:

    Property               Setting
    Server type            World Wide Web.
    Directory index files  The names of the default pages for the site.
    IP address             No entry. Enter an IP address only if you are using one log file to store data for several virtual servers.
    IP port                80, unless the default port is another value.
    Local time zone        Casablanca. This zone keeps the report offsets at GMT because, unlike the Greenwich Mean Time zone, it does not observe daylight saving time. Use the Import Options dialog box, described in "Setting the Import Options," to enter the appropriate offset for the log files’ time zone.
    Local domain           No entry.

    The Site Properties dialog box for the default site shown in Figure 3 appears.

    Figure 3: Site Properties Dialog Box for the Default Site

  5. In the Site Properties dialog box, select the Basic tab and enter the following:

    Property                          Setting
    Home page URLs                    The Web site addresses for this site.
    URL path extension for this site  No entry, unless you want to analyze only a portion of the site; in that case, enter the path to the section you want to analyze.

  6. In the Site Properties dialog box, select the Inferences tab shown in Figure 4.

    Figure 4: The Inferences Tab

  7. On the Inferences tab, enter the following:

    Field              Setting
    Request algorithm  Select this check box. When a request has no referring URL, this algorithm assumes the user came from the last page viewed (that is, that the user clicked the Back button).
    Visit algorithm    The number of minutes of inactivity to wait before ending an idle user session.
    User algorithm     No entry. Selecting this check box causes Usage Import to ignore the Username field.

Setting the Import Options

Before importing a log file or performing look-ups, you must set the import options.

To set the import options

  1. On the Tools menu, click Options. The Import Options dialog box shown in Figure 5 appears.

    Figure 5: The Import Options Dialog Box

  2. On the Import tab, enter the following:

    Field                                Setting
    Drop database indexes before import  No entry. If you select this check box, the indexes are dropped each time an import occurs, causing poor performance.
    Adjust request timestamps to         Select this check box, and select the time zone for the location of the log files.
    Use cookies for inferences           Select this check box if you want to track cookie usage on your Web site; otherwise, no entry.
    Save query strings with URI          No entry. Selecting this check box can improve HTML title look-ups on dynamic pages, but it can also decrease system performance.
    Start day of week                    The day of the week on which to start importing logs.
    After import fields                  No entry. These look-ups take a long time to run, so in general you should run them only once, after you have finished all of your imports. Use the Tools menu to run the look-ups manually when you have finished importing data.

  3. Select the IP Servers tab shown in Figure 6.

    Figure 6: The IP Servers Tab

  4. Enter the proxy information needed for the look-ups. You might also need to have a Winsock proxy client running on your computer to access the Internet.

    Note   If you are running inside a firewall, you need to have ports 43 and 53 open.

Importing the Files

When everything has been set up, you can begin importing log files.

To import log files

  1.  On the View menu, click Import Manager. The Microsoft Site Server Analysis—Usage Import—[Import Manager] screen shown in Figure 7 appears.

    Figure 7: Import Manager Screen

  2. To select files for import, enter the following information:

    Field               Setting
    Log data source     Your log data source.
    Log access          File://
    Log path            The path to the selected log files.
    Browse button       Select the log files for import. You can select as many files as you want, but you must have enough RAM to load all of the files you select.
    Add to list button  Add the selected files to the list of files to be imported.

    If you want to import several large files but can’t do it during one session, you can use the Scheduler on the Tools menu to run the imports at another time.

  3. Click Start Import to begin the import.

    Note   The import may take several hours to complete.

  4. To view the results of the import, on the View menu, click Import History Manager.  

Improving Import Performance

You can improve the performance of your imports with the help of the SQL script files located in the Appendix:

  - Dropindices.sql, which drops the indices that map to non-sequential data
  - Addindices.sql, which recreates those indices as non-clustered indices
  - Updateindex.sql, which updates the statistics for the rebuilt indices

After initializing your analysis database, run the scripts against the database in the order listed above: drop the old indices, recreate them, and then update the statistics.

Tuning Tips for Microsoft SQL Server 6.5

If you are using Microsoft SQL Server 6.5, you can further improve performance by doing the following:

Change…                    To…
Default Index Fill Factor  20%
Max Asynch I/O             255
Max Sort Pages             255
MEM used                   3/4 of available RAM
LE Threshold count         5000
LE Threshold %             20
Hash Buckets               10000

Tuning Tips for Microsoft SQL Server 7

If you are using Microsoft SQL Server 7, you can further improve performance by doing the following:

Change…            To…
Index Fill factor  20% (from auto)

Adding Look-up Information

After importing your log files, you can run three look-ups to add more descriptive information to your reports: IP resolution, whois queries, and HTML title look-ups.

To add look-ups

  1. On the Tools menu, click the look-up you want to run.

  2. Repeat Step 1 for each remaining look-up. Because these look-ups take a long time to run, run them only after you have finished all of your imports.

Creating Analysis Reports

When all of the data has been imported, close Usage Import and run Report Writer.

To run Report Writer

  1. On the File menu, click Open Report Writer Catalog. The Report Writer screen appears. You can create many reports from this catalog. For example, the Comprehensive Site Analysis Report in the Detail folder can tell you just about everything you need to know about your site.

  2. To create the User Profile Report, select the Request detail report in the Detail folder.

  3. If you want to run the report against all of the data in the database, click Run Now.

    If you want to specify report parameters, click Next.

  4. When the report has been generated, view the chart titled "Top 10% of all requests." This chart lists the Web pages on your site that are visited most often. It should match the chart shown in the "Set Up a User Profile" section of "Using Transaction Cost Analysis for Site Capacity Planning," which is located in the Site Server 3.0 Commerce Edition Resource Kit.

Performing Weekly Maintenance

Weekly maintenance is vital to ensure the stability of your site and to keep it running smoothly. Performing weekly maintenance also ensures that you get the best possible performance during log file imports and report generation. Maintenance should be performed immediately after you set up your database, then once a week thereafter. Maintenance should consist of the following steps:

  1. Back up the database.

  2. Run the following utility to improve import and report generation performance:
    c:\mssql\binn\sqlmaint -S servername -U sa -P password -D dbname -Rpt c:\temp.txt -RebldIdx 20


Note   Every field is required and the utility is case sensitive.

Creating a Usage Profile

To illustrate the process of creating a usage profile, this section describes the procedures we used on an actual customer site. We created a usage profile by doing the following:

  - Analyzing usage data from IIS logs and usage analysis reports
  - Identifying shopper operations
  - Calculating percentages for each shopper operation
  - Calculating session length and the number of shopper operations per session
  - Calculating frequencies for each shopper operation

We then used the profile, in conjunction with the Transaction Cost Analysis methodology, to project shopper capacity for the site. Finally, we simulated shopper behavior for the site by creating an InetMonitor test script based on the usage profile. We used the InetMonitor test tool to run the script and generate shopper load on the site to confirm our capacity predictions.

Analyzing Usage Data from IIS Logs

The primary resources for identifying shopper operations are the Internet Information Server (IIS) log files, located by default in \Winnt\System32\LogFiles\W3SVC1 on the system drive. IIS log files track the hits generated by shopper requests performed on a site.

The following listing shows three entries extracted from the customer site log files. Note that each entry includes the date and time the request was performed, the source of the request (IP address), and the syntax of the request.

Listing 1: Sample IIS Log Entries

6/2/99   0:00:26   12.10.121.92 -   GET /samplesite/mainstore/department.asp sid=6RCJVP4NVKS12G4G0017QAE163R7BPL5&wdid=298 200 0 599 HTTP/1.1 Mozilla/4.0+(compatible;+MSIE+4.01;+Windows+98) SHOPPERMANAGER%2FSAMPLE=USERTYPE=R&VERSION=0&SHOPPERMANAGER%2FSAMPLE=6RCJVP4NVKS12G4G0017QAE163R7BPL5&FST%5FNM=Jon55%40hotmail%2Ecom&WEB%5FMEM%5FREGR%5FIND=%2D1;+SITESERVER=ID=7eb4bb7cc8e02b1bccf2c7e5aae145ba http://www.sample.com/samplesite/mainstore/department.asp?wdid=300&sid=6RCJVP4NVKS12G4G0017QAE163R7BPL5

6/2/99   0:00:27   12.10.121.92 -   GET /samplesite/mainstore/images/header.gif - 200 3383 551 HTTP/1.1 Mozilla/4.0+(compatible;+MSIE+4.01;+Windows+98) SHOPPERMANAGER%2FSAMPLE=USERTYPE=R&VERSION=0&SHOPPERMANAGER%2FSAMPLE=6RCJVP4NVKS12G4G0017QAE163R7BPL5&FST%5FNM=Jon55%40hotmail%2Ecom&WEB%5FMEM%5FREGR%5FIND=%2D1;+SITESERVER=ID=7eb4bb7cc8e02b1bccf2c7e5aae145ba http://www.sample.com/samplesite/mainstore/department.asp?sid=6RCJVP4NVKS12G4G0017QAE163R7BPL5&wdid=298

6/2/99   0:00:36   12.10.121.92 -   GET /samplesite/mainstore/product.asp sid=6RCJVP4NVKS12G4G0017QAE163R7BPL5&wpid=44 200 0 641 HTTP/1.1 Mozilla/4.0+(compatible;+MSIE+4.01;+Windows+98) SHOPPERMANAGER%2FSAMPLE=USERTYPE=R&VERSION=0&SHOPPERMANAGER%2FSAMPLE=6RCJVP4NVKS12G4G0017QAE163R7BPL5&FST%5FNM=Jon55%40hotmail%2Ecom&WEB%5FMEM%5FREGR%5FIND=%2D1;+SITESERVER=ID=7eb4bb7cc8e02b1bccf2c7e5aae145ba http://www.sample.com/samplesite/mainstore/department.asp?sid=6RCJVP4NVKS12G4G0017QAE163R7BPL5&wdid=298

A number of conclusions can be drawn from the data in this listing, using just these three pieces of information:

  - All three requests came from the same shopper: each entry shows the same IP address and the same session ID (sid).
  - The shopper viewed a department page and then a product page; the request for Header.gif is an inline image loaded by the department page, not a separate shopper operation.
  - The timestamps show roughly ten seconds of think time between the department page request and the product page request.

The date and time, the IP address, and the syntax of the IIS log entries can provide enough information to complete a usage profile for an SSCE site.

Analyzing Usage Data from Usage Analysis Reports

Because IIS logs can be very large and unwieldy, usage analysis reports are often a better resource, and this is the route we ultimately chose to follow. Site Server Analysis imports data from the IIS logs and generates a number of useful listings, one of the most helpful of which is a list of the most commonly requested pages on the site.

Table 1 shows an abridged listing of the most common ASP requests performed on our customer site over a period of one month. IIS actually records any kind of hit made on the IIS server, including HTML requests, inline images, and errors. On this site, however, all pages are ASP (no HTML). For simplicity, we deleted all entries showing counts for inline images and errors from the listing. Nevertheless, the counts shown in Table 1 represent 94.33% of all requests made to this site.

Table 1: Most commonly performed ASP requests for the sample site

ASP Request                              Count       % of Total
/Samplesite/Mainstore/Default.asp        8,105,385   33.06%
/Samplesite/Mainstore/Department.asp     6,543,105   26.69%
/Samplesite/Mainstore/Product.asp        5,864,317   23.92%
/Samplesite/Mainstore/Basket.asp         1,323,079   5.40%
/Samplesite/Mainstore/Add-item.asp       505,015     2.06%
/Samplesite/Mainstore/Sign-in.asp        206,585     0.84%
/Samplesite/Mainstore/Welcome.asp        156,439     0.64%
/Samplesite/Mainstore/Register.asp       145,428     0.59%
/Samplesite/Mainstore/Checkout.asp       65,309      0.27%
/Samplesite/Mainstore/Order-bill.asp     65,205      0.27%
/Samplesite/Mainstore/Order-ship.asp     59,581      0.24%
/Samplesite/Mainstore/Order-summary.asp  56,128      0.23%
/Samplesite/Mainstore/Confirm.asp        27,538      0.11%
Total                                    23,155,769  94.33%
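As an illustration of how counts like those in Table 1 can be tallied from the raw logs, the following Python sketch counts ASP page requests in the space-delimited log format shown in Listing 1. The assumption that the sixth whitespace-delimited field holds the requested URL comes from Listing 1; a real import tool such as Usage Import also handles other log formats and malformed entries.

from collections import Counter

# Minimal sketch of the tally behind Table 1, assuming the space-delimited
# format shown in Listing 1 (the sixth field holds the requested URL).
def tally_asp_requests(log_lines):
    counts = Counter()
    for line in log_lines:
        fields = line.split()
        if len(fields) < 6:
            continue                   # skip blank or malformed entries
        url = fields[5].lower()
        if url.endswith(".asp"):       # ignore inline images and errors
            counts[url] += 1
    return counts

# Example usage against a log file copied from the Web server:
# with open("ex990602.log") as f:
#     for url, count in tally_asp_requests(f).most_common(13):
#         print(f"{url}\t{count:,}")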

Identifying Shopper Operations

When you view IIS logs or usage analysis reports, it is important to recognize that one shopper operation can be composed of multiple ASP requests, which are listed as separate entries. At our sample customer site, for example, five separate shopper requests are required to complete the purchase of a product item, as shown in Figure 8.

Figure 8: Shopper requests required to complete a purchase at our customer site

It is also important to recognize that a shopper request can generate more than one ASP request. Figure 9 shows that when a shopper makes a request to add a product to the shopping basket, two ASP requests are actually generated. The first ASP processes the add request, and the second ASP displays the contents of the shopping basket to the shopper. The first ASP calls the second ASP, using a Response.Redirect command.

Figure 9: ASP requests required to complete an add-to-basket request at our customer site

If the goal is to convert a list of the most commonly performed ASP requests into a list of the most commonly performed shopper operations, some investigation is required. One way to do this investigation is to use a network analyzer (such as the Microsoft Windows NT® Network Monitor) to view packets transmitted between the Web browser and the SSCE server while the shopper browses the SSCE site. Network traces can be captured for each shopper operation and the contents of the trace analyzed to determine which ASP requests are performed during each operation.

Listing 2 shows summary information from a Network Monitor trace obtained while performing a Checkout operation on our customer site. HTTP GET and HTTP POST requests are identified in the Description column. This listing shows that two of the first ten frames (frame #1 and frame #10) are HTTP GET requests. These frames require a closer look to determine which ASP pages are being requested.

Listing 2: Sample Network Monitor trace:  Summary

Frame  Time    Src MAC Addr   Dst MAC Addr   Protocol   Description
1      1.956   00A0C9FE93DF   00902710C60E   HTTP       GET Request (from client using 
2      2.099   00902710C60E   00A0C9FE93DF   TCP        .A...., len:   0, seq:  12324351
3      2.208   00902710C60E   00A0C9FE93DF   HTTP       Response (to client using port 1
4      2.208   00902710C60E   00A0C9FE93DF   HTTP       Response (to client using port 1
5      2.208   00A0C9FE93DF   00902710C60E   TCP        .A...., len:   0, seq:     75226
6      2.208   00902710C60E   00A0C9FE93DF   HTTP       Response (to client using port 1
7      2.208   00902710C60E   00A0C9FE93DF   HTTP       Response (to client using port 1
8      2.209   00A0C9FE93DF   00902710C60E   TCP        .A...., len:   0, seq:     75226
9      2.209   00902710C60E   00A0C9FE93DF   HTTP       Response (to client using port 1
10     2.259   00902710C60E   00A0C9FE93DF   HTTP       GET Request (from client using p
.
.
.

The abbreviated listing in Listing 3 shows the detail output for frame #1, which reveals that the HTTP GET request is performed on Checkout.asp.

Listing 3: Sample Network Monitor trace:  Detail

00000030                    47 45 54 20 2F 73 61 6D 70 6C       GET./sampl
00000040  65 73 69 74 65 2F 6D 61 69 6E 73 74 6F 72 65 2F esite/mainstore/
00000050  43 68 65 63 6B 6F 75 74 2E 61 73 70 3F 73 69 64 Checkout.asp?sid
00000060  3D 46 55 57 55 4D 47 4C 45 33 58 53 31 32 47 30 =FUWUMGLE3XS12G0
00000070  38 30 30 4A 50 34 32 47 51 31 4E 53 43 35 4D 4E 800JP42GQ1NSC5MN
00000080  45 20 48 54 54 50 2F 31 2E 31 0D 0A 41 63 63 65 E.HTTP/1.1..Acce
00000090  70 74 3A 20 69 6D 61 67 65 2F 67 69 66 2C 20 69 pt:.image/gif,.i
000000A0  6D 61 67 65 2F 78 2D 78 62 69 74 6D 61 70 2C 20 mage/x-xbitmap,.
000000B0  69 6D 61 67 65 2F 6A 70 65 67 2C 20 69 6D 61 67 image/jpeg,.imag
000000C0  65 2F 70 6A 70 65 67 2C 20 2A 2F 2A 0D 0A 52 65 e/pjpeg,.*/*..Re
000000D0  66 65 72 65 72 3A 20 68 74 74 70 3A 2F 2F 73 61 ferer:.http://sa
000000E0  6D 70 6C 65 30 31 2F 73 61 6D 70 6C 65 73 69 74 mple01/samplesit
000000F0  65 2F 6D 61 69 6E 73 74 6F 72 65 2F 53 68 6F 70 e/mainstore/Shop
00000100  70 69 6E 67 5F 42 61 67 2E 61 73 70 3F 73 69 64 ping_Bag.asp?sid
.
.
.

After completing an analysis of the network trace for the Checkout operation, we observed that nine ASP requests were performed: five originated by the shopper, and four additional requests performed as a result of ASP redirects. Table 2 summarizes these findings.

Table 2: Summary of requests for the Checkout operation

ASP Request                       ASP Redirect
Get Checkout.asp                  (none)
Get Checkout.asp?checkout=online  Get Order-bill.asp
Post Order-bill.asp               Get Order-ship.asp
Post Order-ship.asp               Get Order-summary.asp
Post Order-summary.asp            Get Confirm.asp

Once we determined which ASP requests were required to perform a single Checkout operation, we looked at the usage data in Table 1 to determine how many Checkout operations were actually performed. This data is shown in Table 3, which shows that a total of 32,655 Checkout operations were initiated, and 27,538 operations were completed.

Table 3: Pages viewed during the Checkout operation

ASP Request                              Count   Requests per Operation  Total Operations
/Samplesite/Mainstore/Checkout.asp       65,309  2                       32,655
/Samplesite/Mainstore/Order-bill.asp     65,205  2                       32,603
/Samplesite/Mainstore/Order-ship.asp     59,581  2                       29,791
/Samplesite/Mainstore/Order-summary.asp  56,128  2                       28,064
/Samplesite/Mainstore/Confirm.asp        27,538  1                       27,538
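The arithmetic behind Table 3 is a straight division of each raw count by its requests-per-operation value, rounding odd counts up. A short Python sketch, using the counts from Table 1:

# Table 3 arithmetic: raw ASP request counts divided by the number of
# requests each page contributes to a single Checkout operation.
checkout_pages = {
    # page: (count from Table 1, requests per Checkout operation)
    "Checkout.asp":      (65_309, 2),
    "Order-bill.asp":    (65_205, 2),
    "Order-ship.asp":    (59_581, 2),
    "Order-summary.asp": (56_128, 2),
    "Confirm.asp":       (27_538, 1),
}

for page, (count, per_op) in checkout_pages.items():
    operations = (count + per_op - 1) // per_op   # round odd counts up
    print(f"{page}: {operations:,} operations")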

Ultimately, all ASP requests listed in Table 1 were mapped to shopper operations. The results are shown in Table 4.

Table 4: Sample site shopper operations

Shopper Operation  Description                                      Page Requests  Redirects  Total Page Views
Add item           Add item to shopping basket                      1              1          2
Checkout           Purchase items in shopping basket                5              4          9
Department         View product list for a department               1              0          1
Home               Home page (default)                              1              0          1
Product            View product information for a specific product  1              0          1
Register           Register a new shopper                           2              1          3
Shopping basket    View contents of shopping basket                 1              0          1
Sign in            Sign in a registered shopper                     2              1          3

Calculating Percentages for Each Shopper Operation

After identifying the ASP requests belonging to each shopper operation, we calculated how many times each shopper operation was performed and expressed each count as a percentage of the total, as shown in Table 5.

Table 5: Operation counts and percentages for the sample site

Shopper Operation  Operations  % of Total  Shopper Requests  ASP Redirects  Total Requests
Add Item              505,015    2.27%        505,015          505,015       1,010,030
Checkout               27,538    0.12%        155,662          117,995         273,657
Department          6,543,105   29.45%      6,543,105                0       6,543,105
Home                8,105,385   36.49%      8,105,385                0       8,105,385
Product             5,864,317   26.40%      5,864,317                0       5,864,317
Register              145,428    0.65%        145,428           64,630         210,058
Shopping basket       818,064    3.68%        818,064                0         818,064
Sign in               206,585    0.93%        206,585           91,809         298,394
Total              22,215,437  100.00%     22,343,561          779,449      23,123,010

Calculating Session Length and Number of Shopper Operations Per Session

We then went back to the usage analysis report to calculate session length and the number of shopper operations per session. Table 6 shows general usage statistics from our report for May 1999.

Table 6: General usage statistics for the customer site

Date and time this report was generated  Tuesday, June 01, 1999, 07:23:27
Timeframe                                05/01/99 00:00:00 - 05/31/99 23:59:59
Total number of hits                     28,384,441
Total number of requests                 24,513,835
Total number of visits                   1,566,904
Session length                           00:12:27

Note that the shopper session length is shown to be 12 minutes and 27 seconds. A session is considered complete when a shopper remains inactive for a specified period of time (in this case, for more than 30 minutes).

We then used the data found in Table 5 to calculate the number of requests per operation. This number was used to calculate shopper operations per session, as shown in Table 7.

Table 7: Usage profile calculations for the customer site

Requests per visit (calculated from Table 6)  15.645
Total ASP requests (from Table 5)             23,123,010
Total shopper operations (from Table 5)       22,215,437
Requests per operation                        1.04
Operations per session                        15.02
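These ratios follow directly from the totals in Tables 5 and 6, as the following Python sketch shows. The result differs from Table 7's 15.02 in the last digit because the published intermediate values are rounded.

# Table 7 arithmetic, using the totals from Tables 5 and 6.
total_requests = 24_513_835    # Table 6: total number of requests
total_visits   = 1_566_904     # Table 6: total number of visits
total_asp_reqs = 23_123_010    # Table 5: total ASP requests
total_ops      = 22_215_437    # Table 5: total shopper operations

requests_per_visit = total_requests / total_visits           # 15.645
requests_per_op    = total_asp_reqs / total_ops              # 1.04
ops_per_session    = requests_per_visit / requests_per_op    # ~15.03

print(f"Requests per visit:     {requests_per_visit:.3f}")
print(f"Requests per operation: {requests_per_op:.2f}")
print(f"Operations per session: {ops_per_session:.2f}")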

Calculating Frequencies for Each Shopper Operation

We were then able to complete our usage profile table by calculating the average number of occurrences of each shopper operation during a shopper session, using the percentages calculated in Table 5, as shown in Table 8.

Table 8: Usage profile for the customer site

Shopper Operation  Operations per Session  Percent of Total
Add Item            0.34                     2.27%
Checkout            0.02                     0.12%
Department          4.42                    29.45%
Home                5.48                    36.49%
Product             3.96                    26.40%
Register            0.10                     0.65%
Shopping Basket     0.55                     3.68%
Sign In             0.14                     0.93%
Total              15.02                   100.00%
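Each frequency in Table 8 is the operation's share of the total (from Table 5) multiplied by the 15.02 operations per session from Table 7. A short Python sketch of the calculation:

# Table 8 arithmetic: each operation's share of the total, multiplied by
# the average of 15.02 operations per session.
ops_per_session = 15.02
operations = {                 # operation counts from Table 5
    "Add Item":          505_015,
    "Checkout":           27_538,
    "Department":      6_543_105,
    "Home":            8_105_385,
    "Product":         5_864_317,
    "Register":          145_428,
    "Shopping Basket":   818_064,
    "Sign In":           206_585,
}

total = sum(operations.values())               # 22,215,437
for name, count in operations.items():
    share = count / total
    print(f"{name}: {share * ops_per_session:.2f} per session ({share:.2%})")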

Calculating Shopper Capacity from the Usage Profile

Once we had created our usage profile, we were able to:

  - Project shopper capacity for the site, using the TCA methodology
  - Test and validate the capacity projections by simulating shopper load with the InetMonitor test tool

Projecting Shopper Capacity

Our next step was to use TCA methodology to calculate the cost (in terms of CPU usage) of a single shopper session. First, we measured performance and calculated transaction cost per session for each shopper operation in the usage profile with the following equation:

CPU Cost per Session      =   CPU Cost per Operation   ×   Operations per Session

We then converted cost per session to cost per second with the following equation:

CPU Cost per Second      =   CPU Cost per Session      ÷   747 Seconds per Session 

Finally, we were able to calculate the cost of a shopper session by taking the sum of the costs for each shopper operation. Thus, if the cost per shopper is 1.19352 Mcycles per second, a 400-MHz SSCE server has a theoretical capacity of 335 shoppers:

335.14 shoppers      =   400 Mcycles/sec   ÷   1.19352 Mcycles/sec per shopper

Table 9 illustrates these calculations for each shopper operation, as well as the calculation for total cost per shopper session.

Table 9: Calculating cost per shopper session

Operation        CPU Cost per Operation (Mcycles)  Operations per Session  Cost per Session (Mcycles)  Cost per Second (Mcycles)
Add Item          270.96                           0.34                     92.13                      0.12333
Checkout         1228.98                           0.02                     24.58                      0.03290
Department         47.20                           4.42                    208.15                      0.27865
Home               27.86                           5.48                    152.39                      0.20400
Product            81.71                           3.96                    323.57                      0.43316
Register           64.55                           0.10                      6.46                      0.00865
Shopping Basket   136.64                           0.55                     75.15                      0.10060
Sign In            65.22                           0.14                      9.13                      0.01222
Total                                             15.02                    891.56                      1.19352
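The entire projection can be reproduced in a few lines. The following Python sketch recomputes Table 9 from the TCA costs and the Table 8 frequencies; because the frequencies are rounded to two decimal places, the computed totals differ slightly from the published 891.56 and 1.19352, but the projected capacity still rounds to 335 shoppers.

# Reproducing the Table 9 projection from the TCA measurements.
SESSION_SECONDS = 747          # average session length (12:27)
SERVER_MCYCLES_PER_SEC = 400   # a 400-MHz SSCE server

profile = {
    # operation: (CPU cost per operation in Mcycles, operations per session)
    "Add Item":        (270.96, 0.34),
    "Checkout":        (1228.98, 0.02),
    "Department":      (47.20, 4.42),
    "Home":            (27.86, 5.48),
    "Product":         (81.71, 3.96),
    "Register":        (64.55, 0.10),
    "Shopping Basket": (136.64, 0.55),
    "Sign In":         (65.22, 0.14),
}

cost_per_session = sum(cost * freq for cost, freq in profile.values())
cost_per_second  = cost_per_session / SESSION_SECONDS
capacity         = SERVER_MCYCLES_PER_SEC / cost_per_second

print(f"Cost per session:     {cost_per_session:.2f} Mcycles")
print(f"Cost per second:      {cost_per_second:.5f} Mcycles")
print(f"Theoretical capacity: {capacity:.0f} shoppers")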

Testing and Validating Capacity Projections

We used the InetMonitor test tool to verify the shopper capacity projections generated by the Transaction Cost Analysis methodology, simulating shopper load on the SSCE server. Listing 4 shows an InetMonitor usage profile script based on the usage profile created in this document. One iteration of this script is equivalent to a single shopper performing an average of 15.02 shopper operations over an average period of 12-1/2 minutes.

We then measured resource usage for simulated loads of 50 to 300 shoppers (in increments of 50) to monitor usage trends and to determine whether the SSCE server could support the predicted shopper capacity.

Listing 4: Sample InetMonitor validation script

REM *** Home Page (5.48) ***
LOOP 5
get url:/samplestore/mainstore/default.asp
SLEEP RANDNUMBER(35000,75000)
ENDLOOP
%52 SKIP 2
get url:/samplestore/mainstore/default.asp
SLEEP RANDNUMBER(35000,75000)

REM *** Product Browse (3.96) ***
LOOP 3
GET url:/samplestore/mainstore/product.asp?wpid=1
SLEEP RANDNUMBER(45000,90000)
ENDLOOP
%4 SKIP 2
GET SEQULIST(TABLE_PROD_ID.TXT)
SLEEP RANDNUMBER(45000,90000)

REM *** Department (4.42) ***
LOOP 4
POST SEQULIST(table_dept_id.txt)
SLEEP RANDNUMBER(35000,75000)
ENDLOOP
%58 SKIP 2
POST SEQULIST(table_dept_id.txt)
SLEEP RANDNUMBER(35000,75000)

REM *** Sign In (0.14) ***
%86 SKIP 2
get url:/samplestore/mainstore/sign_in.asp
SLEEP RANDNUMBER(35000,75000)
post sequlist(table_old.txt)
SLEEP RANDNUMBER(35000,75000)

REM *** Add Item (0.34 – 0.02 = 0.32) ***
%68 SKIP 2
POST url:/samplestore/mainstore/product.asp?wpid=1&m=2&Size_10=28&
  style_color_code0=111036022&Quantity0=1&Size_20=28&SHIPTO=0
SLEEP RANDNUMBER(35000,75000)

REM *** Register (0.10) ***
%90 SKIP 2
POST url:/samplestore/mainstore/OrganizerQuickRegister.ASP
SLEEP RANDNUMBER(35000,75000)
GET sequlist(table_new.txt)
SLEEP RANDNUMBER(35000,75000)

REM *** Shopping Basket (0.55) ***
%45 SKIP 2
get url:/samplestore/mainstore/shopping_bag.asp
SLEEP RANDNUMBER(35000,75000)

REM *** Add Item (0.02) + Checkout (0.02) ***
%98 SKIP 12
post url:/samplestore/mainstore/product.asp?wpid=2&m=2&Size_10=28&Size_20=28&
  style_color_code0=111034002&Quantity0=1&SHIPTO=0
SLEEP RANDNUMBER(45000,90000)
get url:/samplestore/mainstore/checkout.asp
SLEEP RANDNUMBER(45000,90000)
get url:/samplestore/mainstore/checkout.asp?checkout=online
SLEEP RANDNUMBER(45000,90000)
post url:/samplestore/mainstore/Order-bill.asp?POSTED=TRUE&shp_ctry_nm=USA&
 bill_ctry_nm=USA&otType=0&bill_fst_nm=Joe&bill_mid_ini=&bill_lst_nm=Johnson&
 bill_addr_ln1_txt=5+5th+St&bill_addr_ln2_txt=&bill_city_nm=Seattle&bill_st_prov_cd=WA&
 bill_addr_pstl_cd=98053&dy_phn_nm=425-882-8080&eml_id=joes@ms.com&eml_freq_cd=00&
 web_mem_psw=&confirm_web_mem_psw=&web_mem_psw_qstn=&web_mem_bpl=
SLEEP RANDNUMBER(45000,90000)
post url:/samplestore/mainstore/Order-ship.asp?otType=0&newRegistration=False&
 POSTED=TRUE&shp_ctry_nm=USA&bill_ctry_nm=USA&ShipToCount=1&NKNMO=you&addr_typ_seq_nbr0=0&
 SAME_ADDR_IND0=checked&shp_fst_nm0=Joe&shp_mid_ini0=&shp_lst_nm0=Johnson&
 shp_addr_ln1_txt0=&shp_addr_ln2_txt0=&shp_city_nm0=&shp_st_prov_cd0=1&shp_addr_pstl_cd0=&
 shp_phn_nm0=425-882-8080&gift_cd0=00&spd_shp_tier_id=E
SLEEP RANDNUMBER(45000,90000)
post url:/samplestore/mainstore/Order-summary.asp?POSTED=TRUE&cc_cd=01&
 cc_number=4111-1111-1111-1111&month=01&year=2001
SLEEP RANDNUMBER(45000,90000)

Appendix

This section contains the SQL Server script files that you can use to add, drop, and update indices:

  - Addindices.sql
  - Dropindices.sql
  - Updateindex.sql

These indices are for the Analysis tables and are used for usage import and report creation.

Addindices.sql

The Addindices.sql script recreates the indices dropped by the Dropindices.sql script, changing the index type from clustered to non-clustered. Cut and paste the following text into an *.sql file to add indices:

ALTER TABLE tblImportCookie
ADD CONSTRAINT xCookieName Primary Key NONCLUSTERED (CookieName)
go

ALTER TABLE tblImportHostname
ADD CONSTRAINT xHostname Primary Key NONCLUSTERED (Hostname)
go

ALTER TABLE tblImportLevel1Dir
ADD CONSTRAINT xDirName1 Primary Key NONCLUSTERED (DirName, Level1DirID)
go

ALTER TABLE tblImportLevel2Dir
ADD CONSTRAINT xDirName2 Primary Key NONCLUSTERED (DirName, Level1DirID)
go

ALTER TABLE tblImportLevel3Dir
ADD CONSTRAINT xDirName3 Primary Key NONCLUSTERED (DirName, Level2DirID)
go

ALTER TABLE tblImportLevel4Dir
ADD CONSTRAINT xDirName4 Primary Key NONCLUSTERED (DirName, Level3DirID)
go

ALTER TABLE tblImportLevel5Dir
ADD CONSTRAINT xDirName5 Primary Key NONCLUSTERED (DirName, Level4DirID)
go

ALTER TABLE tblImportLevel6Dir
ADD CONSTRAINT xDirName6 Primary Key NONCLUSTERED (DirName, Level5DirID)
go

ALTER TABLE tblImportOrganization
ADD CONSTRAINT xDomain Primary Key NONCLUSTERED (DomainName)
go

ALTER TABLE tblImportParameter
ADD CONSTRAINT xParam Primary Key NONCLUSTERED (ParameterName)
go

print "tblImportParameter"

ALTER TABLE tblImportReferrer
ADD CONSTRAINT xURL Primary Key NONCLUSTERED (URL)
go

ALTER TABLE tblImportUserName
ADD CONSTRAINT xUsername Primary Key NONCLUSTERED (Username)
go

ALTER TABLE tblImportVersion
ADD CONSTRAINT xVersion Primary Key NONCLUSTERED (VersionName, ProductID, SecurityID)
go

ALTER TABLE tblInterseFilter
ADD CONSTRAINT xFilterName Primary Key NONCLUSTERED (FilterName)
go

ALTER TABLE tblInterseIP
ADD CONSTRAINT xIP Primary Key NONCLUSTERED (IP)
go

ALTER TABLE tblIntersePath
ADD CONSTRAINT xTableName Primary Key NONCLUSTERED (TableName)
go

ALTER TABLE tblInterseProperty
ADD CONSTRAINT xPropertyName Primary Key NONCLUSTERED (PropertyName)
go

ALTER TABLE tblInterseSiteExt
ADD CONSTRAINT xSiteExtID Primary Key NONCLUSTERED (SiteID)
go

ALTER TABLE tblReferenceAdvertiser
ADD CONSTRAINT xAdvertiser Primary Key NONCLUSTERED (AdvertiserName)
go

ALTER TABLE tblReferenceDomain
ADD CONSTRAINT xDomainName Primary Key NONCLUSTERED (DomainName)
go

ALTER TABLE tblReferenceMediaExt
ADD CONSTRAINT xExt Primary Key NONCLUSTERED (ExtName)
go

ALTER TABLE tblReferenceTop
ADD CONSTRAINT xTop Primary Key NONCLUSTERED (TopName)
go

Dropindices.sql

The Dropindices.sql script drops the clustered indices that map to non-sequential data (URLs, and so forth) so that Addindices.sql can recreate them as non-clustered indices, a more suitable index type for this data. Cut and paste the following text into an *.sql file to drop indices:

ALTER TABLE tblImportCookie DROP CONSTRAINT xCookieName
go

ALTER TABLE tblImportHostname DROP CONSTRAINT xHostname
go

ALTER TABLE tblImportLevel1Dir DROP CONSTRAINT xDirName1
go

ALTER TABLE tblImportLevel2Dir DROP CONSTRAINT xDirName2
go

ALTER TABLE tblImportLevel3Dir DROP CONSTRAINT xDirName3
go

ALTER TABLE tblImportLevel4Dir DROP CONSTRAINT xDirName4
go

ALTER TABLE tblImportLevel5Dir DROP CONSTRAINT xDirName5
go

ALTER TABLE tblImportLevel6Dir DROP CONSTRAINT xDirName6
go

ALTER TABLE tblImportOrganization DROP CONSTRAINT xDomain
go

ALTER TABLE tblImportParameter DROP CONSTRAINT xParam
go

ALTER TABLE tblImportReferrer DROP CONSTRAINT xURL
go

ALTER TABLE tblImportUserName DROP CONSTRAINT xUsername
go

ALTER TABLE tblImportVersion DROP CONSTRAINT xVersion
go

ALTER TABLE tblInterseFilter DROP CONSTRAINT xFilterName
go

ALTER TABLE tblInterseIP DROP CONSTRAINT xIP
go

ALTER TABLE tblIntersePath DROP CONSTRAINT xTableName
go

ALTER TABLE tblInterseProperty DROP CONSTRAINT xPropertyName
go

ALTER TABLE tblInterseSiteExt DROP CONSTRAINT xSiteExtID
go

ALTER TABLE tblReferenceAdvertiser DROP CONSTRAINT xAdvertiser
go

ALTER TABLE tblReferenceDomain DROP CONSTRAINT xDomainName
go

ALTER TABLE tblReferenceMediaExt DROP CONSTRAINT xExt
go

ALTER TABLE tblReferenceTop DROP CONSTRAINT xTop
go

Updateindex.sql

The Updateindex.sql script updates the index statistics after you’ve added or dropped indices. If you don’t run Updateindex.sql, the indices will exist, but the query optimizer won’t use them effectively. (Alternatively, you can run sqlmaint.exe to update all indices.) Cut and paste the following text into an *.sql file to update the statistics:

Update Statistics tblImportCookie
go

Update Statistics tblImportHostName
go

Update Statistics tblImportLevel1Dir
go

Update Statistics tblImportLevel2Dir
go

Update Statistics tblImportLevel3Dir
go

Update Statistics tblImportLevel4Dir
go

Update Statistics tblImportLevel5Dir
go

Update Statistics tblImportLevel6Dir
go

Update Statistics tblImportOrganization
go

Update Statistics tblImportParameter
go

Update Statistics tblImportReferrer
go

Update Statistics tblImportUsername
go

Update Statistics tblImportVersion
go

Update Statistics tblInterseFilter
go

Update Statistics tblInterseIP
go

Update Statistics tblIntersePath
go

Update Statistics tblInterseProperty
go

Update Statistics tblInterseSiteExt
go

Update Statistics tblReferenceAdvertiser
go

Update Statistics tblReferenceDomain
go

Update Statistics tblReferenceMediaExt
go

Update Statistics tblReferenceTop
go

Information in this document, including URL and other Internet web site references, is subject to change without notice.  The example companies, organizations, products, people and events depicted herein are fictitious.  No association with any real company, organization, product, person or event is intended or should be inferred.  Complying with all applicable copyright laws is the responsibility of the user.  Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document.  Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

© 1999-2000 Microsoft Corporation.  All rights reserved.

Microsoft, MS-DOS, MS, Windows, and Windows NT are either registered trademarks or trademarks of Microsoft Corporation in the U.S.A. and/or other countries/regions.

The names of actual companies and products mentioned herein may be the trademarks of their respective owners.

NOTICE REGARDING INFORMATION AND SERVICES AVAILABLE IN THIS DOCUMENT. In no event shall Microsoft, and/or its respective suppliers be liable for any special, indirect or consequential damages or any damages whatsoever resulting from loss of use, data, or profits, whether in action of contract, negligence, or other tortious action, arising out of or in connection with the use or performance of any guidelines, documents, provision of or failure to provide services, or information available from this document.