Publishing-Specific Terms | Meaning |
ASP | ASPs (Active Server Pages) are pages of information transmitted by the Publishing server to the client’s browser. ASPs are the building blocks of Content Management transactions. |
ASP requests/sec | Primary unit of measure for performance (also referred to as ASP throughput) for a Publishing server. Higher ASP throughput for a Publishing site will translate to greater user capacity. |
CISVC | Content Index Service performs the function of indexing published content. |
Latency | Amount of time required by the Publishing server to process a single ASP request. Higher ASP latency values result in slower performance for individual users. |
User Profile | Description of user behavior created by the site builder that is used to project user capacity. A User Profile defines how many users and what types of users connect to the server and the amount of time between each transaction. |
General Terms | Meaning |
Context switching | Rate at which one thread is switched to another (either within a process or across processes). |
Pentium Pro equivalent MHz (PPEM) | Unit of measure for processor work. A 200 MHz Pentium Pro processor delivers 200 PPEMs; a computer with two 200 MHz Pentium Pro processors delivers a maximum of 400 PPEMs. |
The first step in analyzing the performance of this service is to identify the basic types of processes by which users interact with the service. In other words, what kinds of transactions do users perform (for example, browsing documents and submitting new documents) and what are the steps required to complete each type of transaction?
The next step is to replicate each user transaction with a script that is run by the InetMonitor simulator. The user load is gradually increased on the Publishing server (for a particular transaction type) until maximum performance (maximum ASP throughput) is achieved.
To better predict how this service will perform in an actual setting, three separate user profiles are defined for low-end, mid-line, and high-end sites. Each profile identifies the maximum number of concurrent users, the types of users, and how long it typically takes these users to perform transactions.
The scalability of the service must also be considered. If, for example, the service is processor bound, it is important to determine how increasing the number of processors increases capacity. Because tests have shown that memory, disk, and network resources are not limiting factors, this document focuses primarily on how CPU utilization affects performance. Performance is compared for 1-proc, 2-proc, and 4-proc servers to determine the benefits of additional processors.
CPU: | 4 x 200 MHz Pentium Pro |
Memory: | 512 MB of RAM |
Disk: | 1 x 4.3 GB SCSI |
Volumes: | C: (4.0 GB NTFS) |
Network: | 100 Mbps switched Ethernet |
Software: | Microsoft® Internet Information Server (IIS) version 4.0, Microsoft® Site Server version 3.0 |
Note This server is a 4 x 200 MHz Pentium Pro computer (4-proc). However, where noted, 1-proc and 2-proc configurations are also used.
Content Management is the Microsoft® Site Server version 3.0 Publishing feature for publishing documents, images, Web pages, or any other content on the Internet or a departmental intranet. Content Management makes it easy for content authors to post content to the Internet or a local intranet without being familiar with HTML. Documents are posted to the Publishing server and organized by document type. Administrators can then manage the content easily and reliably.
Content Management has three core elements that, when combined, enable you to view and publish content: content types, content attributes, and view pages. These elements are contained in two sample sites (CmSample and FpSample) that are included with Site Server 3.0.
Content management can be administered using the Web-based Administration (WebAdmin) interface. WebAdmin is a browser-based interface that enables you to filter the content you want to work with, create and manage content types and content attributes, and approve content for publication.
Both sample sites, CmSample and FpSample, are designed using Active Server Pages (ASPs). ASPs are interactive Web pages used by content authors and editors to submit, approve and deploy content. Available ASPs include:
The following illustration shows the different stages of Content Management, from the initial submission to administration and publishing.
CmSample is a two-tiered site built using an HTML editor. All pages in the site are accessible from the first page, known as the Welcome page (default.asp). The following list describes the five types of pages within CmSample that can be viewed from the Welcome page.
Page type | Description |
Informational | Used as a header page for a group of produced view pages. |
Produced view | Used to view published content. |
Submit | Used by content authors to post content to a server. |
Approval | Used by editors to review and approve submitted content. |
View | Used by content authors to view, tag, re-tag and delete submitted content. |
Table 1: List of 2nd tier CmSample pages
Name of page | CmSample ASP | Type of page |
Company | Company.asp | Informational |
Press releases | Pressrelease.asp | Produced view |
Case studies | Casestudy.asp | Produced view |
Awards | Awardview.asp | Produced view |
News | News.asp | Informational |
Headlines | Headlineview.asp | Produced view |
Job postings | Jobpostingview.asp | Produced view |
Products | Products.asp | Informational |
Data sheets | Datasheets.asp | Produced view |
White paper | Whitepaper.asp | Produced view |
Specifications | Specificationview.asp | Produced view |
Submit new content | Submit.asp | Submit |
Editorial view | Approve.asp | Approval |
My content | Cpview.asp | View |
Users who choose to view content can browse the informational and produced view pages. Content providers (users who submit content to the server) use the submit page to upload documents to the Site Server Publishing server. They can also use the view page to edit content attributes or delete their content after it has been posted. Content editors use the editorial view page to review content for approval. Content editors have the authority to approve content submitted by content providers. When required, content will not be posted to the server until approved by the content editor.
All CmSample pages are tested for maximum performance using the InetMonitor simulator. Additionally, other types of transactions can be performed within most pages and these transactions need to be tested as well.
Note InetMonitor scripts for each of these transactions can be found in Appendix E.
For example, four separate transactions can be performed (approve, delete, edit, and view) from the editorial view page. Conversely, the view content transaction is performed within many of the pages listed. Viewing content from one page versus another will not alter performance, so the view content transaction need only be tested once.
Table 2: CmSample transactions within 2nd tier pages
Name of page | Transactions |
Press releases | View content |
Case studies | View content |
Awards | View content |
Headlines | View content |
Job postings | View content |
Data sheets | View content |
Whitepaper | View content |
Specifications | View content |
Submit new content | Submit content |
Editorial view | Approve content, Delete content, Edit content, View content |
My content | Delete content, Edit content, View content |
Profiles can be created to simulate potential real-world scenarios for use of the Publishing service. Using the InetMonitor scripts found in Appendix E, behavior of Web users can be simulated using the parameters in Tables 3, 4, and 5. PerfMon counters can then be tracked to measure performance and resource utilization for each scenario.
Table 3: Low end
Total concurrent users | 22 |
Users uploading content | 2 |
Users browsing content | 20 |
Browse time per page | 2.0 minutes |
Time to submit 1 doc | 5.0 minutes |
Table 4: Mid-line
Total concurrent users | 111 |
Users uploading content | 10 |
Users browsing content | 100 |
Users administering site | 1 |
Browse time per page | 2.0 minutes |
Time to submit 1 doc | 5.0 minutes |
Admin time per page | 1.0 minutes |
Table 5: High end
Total concurrent users | 555 |
Users uploading content | 50 |
Users browsing content | 500 |
Users administering site | 5 |
Browse time per page | 2.0 minutes |
Time to submit 1 doc | 5.0 minutes |
Admin time per page | 1.0 minutes |
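As a rough sanity check on these profiles, the steady-state ASP request rate implied by each one can be estimated from the parameters above (users divided by the time between transactions). The following sketch, written in Python for illustration, uses only the values from Tables 3, 4, and 5; the printed rates are estimates, not measurements, but they fall in the same range as the throughput figures reported later in this document.

# Estimate the steady-state ASP request rate implied by each user profile.
# Rate = sum over user types of (number of users / seconds between transactions).
# All values are taken from Tables 3, 4, and 5.
profiles = {
    "low-end":  {"browse": (20, 120), "submit": (2, 300)},
    "mid-line": {"browse": (100, 120), "submit": (10, 300), "admin": (1, 60)},
    "high-end": {"browse": (500, 120), "submit": (50, 300), "admin": (5, 60)},
}

for name, mix in profiles.items():
    rate = sum(users / interval for users, interval in mix.values())
    print(f"{name:8s}: ~{rate:.2f} ASP requests/sec")

# Prints roughly 0.17 (low-end), 0.88 (mid-line), and 4.42 (high-end),
# which is in the same range as the measured throughput in Tables 9-11.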
Based on the hardware configurations and user profiles found in this document, the following assertions can be made about scaling and performance for a Site Server Publishing server with the CmSample site.
Note All statements made here pertain to a 4-proc 200 MHz Pentium Pro server with 512 MB RAM. Other processor configurations (1-proc and 2-proc) have also been tested and any statements pertaining to these configurations are explicitly noted.
For the Microsoft® Site Server 3.0 Publishing service there are two separate measures for determining capacity: Active Server Pages (ASP) performance and document throughput. ASP performance is the primary measure, and is defined here to include ASP throughput (which measures the number of ASP requests processed per second), and ASP latency (which measures the time required to process a single ASP request).
In Chart 1, ASP throughput is compared for 1-proc, 2-proc, and 4-proc server configurations. User load configurations are based on the proportions found in the User Profiles.
Note User profiles are based on a mix of users browsing content, submitting new content, and administering the site. The proportion of users is based on a ratio of 100:10:1. For every 100 users browsing the site, there are 10 users submitting new content, and 1 user administering the site.
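The user load labels used in the charts and tables that follow (100+10+1, 200+20+2, and so on) are generated from this 100:10:1 ratio. A minimal illustration in Python:

# Expand a browsing-user count into the 100:10:1 mix used to label load steps.
def load_mix(browsing_users):
    return browsing_users, browsing_users // 10, browsing_users // 100

for browsers in range(100, 700, 100):
    b, s, a = load_mix(browsers)
    print(f"{b}+{s}+{a} = {b + s + a} concurrent users")
# Prints 100+10+1 = 111 through 600+60+6 = 666.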
Chart 1: ASP throughput for 1-proc, 2-proc, and 4-proc servers
Performance is consistent for all three processor configurations up through 333 users (300+30+3). The 1-proc and 2-proc configurations reach maximum throughput at 444 (400+40+4) users, while the 4-proc configuration reaches maximum throughput at 555 users before performance starts to decline.
In the table below, maximum ASP throughput is compared for each processor configuration. Perfect processor scaling would result in a 100 percent change each time the number of processors is doubled.
Table 6: Comparing ASP throughput for 1-proc, 2-proc, and 4-proc servers
Processors | Maximum ASP requests/sec | Percent change |
1-proc | 2.767 | --- |
2-proc | 3.268 | 18.1% |
4-proc | 4.304 | 31.7% |
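The percent-change column in Table 6 follows from simple arithmetic on the maximum throughput values; the sketch below reproduces it and makes the comparison with perfect (100 percent) scaling explicit.

# Percent change in maximum ASP throughput between processor configurations
# (maximum ASP requests/sec taken from Table 6).
max_asp = {"1-proc": 2.767, "2-proc": 3.268, "4-proc": 4.304}

for prev, curr in [("1-proc", "2-proc"), ("2-proc", "4-proc")]:
    change = (max_asp[curr] - max_asp[prev]) / max_asp[prev] * 100
    print(f"{prev} -> {curr}: {change:.1f}% change")
# Prints 18.1% and 31.7%, well short of the 100% change that perfect
# processor scaling would deliver when the processor count doubles.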
Chart 2 compares ASP latency for the same three processor configurations. Note that performance is once again consistent for all three processor configurations up through 333 users. At 444 users, latency is high for the 1-proc configuration (7.461 seconds per ASP request), indicating that capacity has been reached. This chart also shows that capacity is 444 users for a 2-proc configuration and 555 users for a 4-proc configuration.
Chart 2: ASP latency for 1-proc, 2-proc, and 4-proc servers
A scaling table can be created that projects processor requirements based on peak user load. The proportions of users and the frequencies of transactions (browse transactions once every 2 minutes per user, submit transactions once every 5 minutes per user, and admin transactions once every minute per user) are identical to the values used for the user profiles. For example, the first table entry shows 20 users browsing content (one transaction every 2.0 minutes) and 2 users submitting content (one document every 5.0 minutes).
Table 7: CmSample scaling table
Concurrent users | ASP requests/sec | Processor configuration |
20+2 | 0.157 | 1 x 200 |
100+10+1 | 0.820 | 1 x 200 |
200+20+2 | 1.725 | 1 x 200 |
300+30+3 | 2.462 | 1 x 200 |
400+40+4 | 3.268 | 2 x 200 |
500+50+5 | 4.304 | 4 x 200 |
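A simple lookup based on Table 7 illustrates how the scaling table might be used. The helper below is hypothetical; it assumes the same browse/submit/admin mix and transaction frequencies as the user profiles, with break points taken from the capacity limits observed in Charts 1 and 2.

# Hypothetical helper: project a processor configuration from peak concurrent
# user load, based on the measurements in Table 7. Assumes the same
# browse/submit/admin mix and transaction frequencies as the user profiles.
def recommended_processors(concurrent_users):
    if concurrent_users <= 333:        # up to 300+30+3
        return "1 x 200 MHz Pentium Pro"
    if concurrent_users <= 444:        # up to 400+40+4
        return "2 x 200 MHz Pentium Pro"
    return "4 x 200 MHz Pentium Pro"   # 500+50+5 and above

print(recommended_processors(222))     # 1 x 200 MHz Pentium Pro
print(recommended_processors(444))     # 2 x 200 MHz Pentium Pro
print(recommended_processors(555))     # 4 x 200 MHz Pentium Pro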
The secondary measure of performance for Publishing is document throughput. When a content provider submits a document to the Publishing server, several steps are involved before the content is actually published. First, the document is uploaded from the Web client to a staging area on the server. Next, the Content Management Index Server indexes the content based on the content attributes assigned by the content provider; indexing occurs at 30-second intervals. In the final step, the content is posted on the Publishing site for viewing.
Chart 3 compares document throughput for 1-proc, 2-proc, and 4-proc configurations at different user loads. The data was collected using the InetMonitor simulator while monitoring arrival of submitted documents. The sampling time was five minutes (at one-minute intervals).
Note Document submission from a Web browser uses a posting acceptor on the submit.asp page to upload content to the server. Because InetMonitor cannot be used to simulate the transfer of files from client to server, all tested documents are initially placed in the staging area on the server, bypassing the file transfer process altogether. Therefore, actual document throughput (using a Web browser) will be lower than specified.
Chart 3: Scaling for document throughput
Scaling for document throughput shows similar, although slightly better, results than scaling for ASP performance. The one distinction is that document throughput continues to rise up to 666 users (600+60+6) on a 4-proc server, which strengthens the argument for upgrading processors at high user loads (> 333 users).
Table 8: Scaling for document throughput
Processor configuration | Max docs/sec | Maximum user load | Percent change |
1-proc | 0.100 | 300+30+3 | --- |
2-proc | 0.117 | 400+40+4 | 17.0% |
4-proc | 0.177 | 600+60+6 | 51.4% |
In Chart 4, maximum ASP throughput for each CmSample transaction is shown. Throughput is measured using the InetMonitor scripts found in Appendix E, in conjunction with the PerfMon counter ASP requests per second. User load is increased until maximum ASP throughput is achieved. The sampling time was 50 seconds (using 1-second intervals).
Note ASP throughput readings are generated using the InetMonitor simulator. For the Publishing service, the default configuration requires NTLM authentication, which involves negotiation between client and server, so actual ASP throughput will be slightly higher. If NTLM authentication is turned off, ASP throughput will also be slightly higher.
Note that produced view pages (for example, awards, case studies, data sheets, and so on) show similar performance (approximately 5.9 requests/sec). Informational pages (for example, company, news, products, and so on) all show much higher performance than produced view pages (ranging from 24.9 to 54.2 requests/sec). Produced view pages (as well as approval, submit, and editorial pages) make extensive use of the Content Index Service (CISVC), whereas informational pages (and the default page) do not.
Chart 4: ASP throughput for each CmSample transaction type
Key
awd | = | awards | cas | = | casestudy | dat | = | datasheet |
hed | = | headlines | job | = | jobpostings | prs | = | pressrelease |
spc | = | specifications | wp | = | whitepaper | cmp | = | company |
new | = | news | prd | = | products | def | = | default |
sub | = | submit | edv | = | editorial view | eda | = | editorial approve |
edd | = | editorial delete | del | = | my delete | prp | = | my edit properties |
Resource utilization and performance measurements can be compared for 1-proc, 2-proc, and 4-proc configurations using the InetMonitor user profile scripts in conjunction with the PerfMon performance monitor. Chart 5 is a comparison of data collected for the low-end user profile (20 users browsing content, and 2 users submitting new content) using each processor configuration. The sampling period is 300 seconds (3-second intervals).
Note InetMonitor User Profile scripts can be found in Appendix E.
Note The PerfMon counters listed in Appendix C should be used to track performance and resource utilization.
Chart 5: CmSample low-end profile test results
Based on these results, it can be seen that the low-end profile does not stress resources. For all three processor configurations, CPU utilization is approximately 20 PPEM and context switching is low (less than 1000/sec), indicating efficient use of CPU. ASP throughput (ASP requests/sec) and ASP latency show consistent performance across processor configurations.
Table 9: CmSample low-end profile test results for 1-proc, 2-proc, and 4-proc servers
Process | 1-proc | 2-proc | 4-proc |
CPU utilization (PPEM) | 20.0 | 15.4 | 18.1 |
ASP requests/sec | 0.157 | 0.183 | 0.156 |
ASP latency (sec) | 0.375 | 0.310 | 0.428 |
Context switches | 503 | 541 | 683 |
% proc CISVC | 0.299 | 0.328 | 0.334 |
Submits/min | 0.353 | 0.842 | 0.364 |
The mid-line profile (100 users browsing content, 10 users submitting new content and 1 user administering the site) test results shown in Chart 6 use a sampling period of 300 seconds (3-second intervals).
Chart 6: CmSample mid-line profile test results
Resource utilization remains low (approximately 60-75 PPEM CPU utilization) and ASP performance (ASP throughput and ASP latency) is consistent across processor configurations. Note, however, that at 111 concurrent users, ASP throughput is still less than one ASP request per second. ASP latency is essentially unchanged from the low-end profile measurements.
Table 10: CmSample mid-line profile test results for 1-proc, 2-proc, and 4-proc servers
Process | 1-proc | 2-proc | 4-proc |
CPU utilization (PPEM) | 72.1 | 61.2 | 72.3 |
ASP requests/sec | 0.820 | 0.867 | 0.879 |
ASP latency (sec) | 0.364 | 0.316 | 0.404 |
Context switches | 803 | 1,240 | 1,529 |
% proc CISVC | 1.353 | 1.583 | 1.914 |
Submits/min | 2.250 | 2.000 | 2.000 |
The high-end profile (500 users browsing content, 50 users submitting new content, and 5 users administering the site) test results shown in Chart 7 use a sampling period of 1200 seconds (12-second intervals). The longer sampling period was required because of variations in test results.
Chart 7: CmSample high-end profile test results
Test results show that the high-end user profile is beyond the capacity of the 1-proc and 2-proc configurations. For the 1-proc server, CPU utilization is 100 percent (200 PPEM) and ASP latency is 36.495 seconds. For the 2-proc server, CPU utilization is 98.625 percent (394.5 PPEM) and ASP latency is still beyond threshold at 10.669 seconds. On the 4-proc server, CPU utilization is 58.675 percent (469.4 PPEM) and ASP latency is an acceptable 1.189 seconds.
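The PPEM figures quoted here follow directly from percent CPU utilization and the number of 200 MHz Pentium Pro processors; a short sketch of the conversion:

# Convert percent CPU utilization to Pentium Pro equivalent MHz (PPEM):
# PPEM = (%CPU / 100) * number of processors * 200 MHz per processor.
def ppem(percent_cpu, processors, mhz_per_processor=200):
    return percent_cpu / 100 * processors * mhz_per_processor

print(f"{ppem(100.0, 1):.1f} PPEM")    # 200.0 PPEM (1-proc at 100 percent)
print(f"{ppem(98.625, 2):.1f} PPEM")   # 394.5 PPEM (2-proc figure quoted above)
print(f"{ppem(58.675, 4):.1f} PPEM")   # 469.4 PPEM (4-proc figure quoted above)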
Note that context switching for 2-proc and 4-proc servers is relatively high (> 15,000/sec) which results in a less efficient use of CPU. Because context switches consume CPU cycles, high numbers of context switches significantly impact CPU utilization, reducing the number of cycles available to process ASP requests.
Table 11: CmSample high-end profile test results for 1-proc, 2-proc, and 4-proc servers
Process | 1-proc | 2-proc | 4-proc |
CPU utilization (PPEM) | 200.0 | 394.5 | 469.4 |
ASP requests/sec | 2.509 | 3.055 | 4.304 |
ASP latency (sec) | 36.495 | 10.669 | 1.189 |
Context switches | 3,608 | 19,377 | 18,302 |
% proc CISVC | 4.006 | 8.636 | 9.501 |
Submits/min | 4.000 | 4.800 | 9.140 |
When content providers submit content, it is first indexed before being posted on the Publishing server. During content submission, the content author tags the content with specific attributes (for example, author, title, and so on). The Content Management Index Server then indexes the content based on these attributes. The Index Server performs content indexing at 30-second intervals. The behavior of the Content Index Server impacts performance for posting of content. Content is typically posted 30 to 50 seconds after being submitted by the content author.
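Because indexing runs on a fixed 30-second cycle, part of the posting delay is simply the wait for the next indexing pass. The back-of-the-envelope sketch below models only that wait; the remainder of the observed 30- to 50-second posting latency is indexing and deployment work, which the sketch does not attempt to model.

# Wait for the next Content Index Service pass, given a 30-second interval.
# This models only part of the observed 30-50 second posting latency.
INDEX_INTERVAL = 30.0   # seconds between indexing passes (from the text above)

def wait_for_next_pass(seconds_since_last_pass):
    return INDEX_INTERVAL - (seconds_since_last_pass % INDEX_INTERVAL)

print(wait_for_next_pass(5.0))    # 25.0 seconds
print(wait_for_next_pass(29.0))   # 1.0 second
# On average a document waits about 15 seconds just for the next pass.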
Chart 8: Submit and posting times at varying CPU levels for 4-proc server
In Chart 8, the time required to submit and post a document was measured at varying CPU levels to see whether higher CPU activity has an impact on the performance of the Content Index Server. Performance remains fairly constant except at the highest CPU utilization levels. If CPU utilization remains very high (> 95 percent), the Content Index service could be impaired, resulting in a loss of document throughput. For the Content Index service to operate consistently and effectively, CPU utilization on the Publishing server should remain below 90 percent.
Note Time to submit is defined here as the time required for the submit page to redisplay on the client browser after the user clicks the submit button. Time to submit also varies with the size of the document; larger documents take longer to upload (the submit transaction does not complete until the document is completely uploaded to the server). In this sample case, upload time is not considered, in order to clearly emphasize the impact of CPU utilization on performance.
Chart 9 tracks two separate user profiles. The proportions of users and the frequencies of transactions (browse transactions once every 2 minutes per user, submit transactions once every 5 minutes per user, and admin transactions once every minute per user) are identical to the values used for the user profiles in Chapter 1, except that one profile includes users uploading content and the other does not. For example, the first data point compares CPU utilization for 100 browsers and 1 admin versus 100 browsers, 10 submitters, and 1 admin. This comparison shows the impact of users uploading content on CPU utilization. The average increase in CPU utilization is 65 percent.
Note Both profiles show large increases in CPU utilization at higher numbers of users. This rapid increase in CPU corresponds to a geometric increase in context switching, as shown in Chart 10.
Chart 9: Comparing CPU utilization for two user profiles (4-proc server)
Table 12: CPU utilization (percent) for two user profiles (4-proc server)
Profiles | 100+(10)+1 | 200+(20)+2 | 300+(30)+3 | 400+(40)+4 | 500+(50)+5 |
Without submit | 5.827 | 11.479 | 16.154 | 21.161 | 43.527 |
With submit | 9.042 | 15.830 | 26.350 | 49.880 | 58.670 |
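The 65 percent figure quoted above is the average of the per-load-step increases in Table 12; a quick check:

# Average percent increase in CPU utilization when submitters are added to
# the load (percent CPU values taken from Table 12).
without_submit = [5.827, 11.479, 16.154, 21.161, 43.527]
with_submit    = [9.042, 15.830, 26.350, 49.880, 58.670]

increases = [(w - wo) / wo * 100 for wo, w in zip(without_submit, with_submit)]
print(", ".join(f"{x:.0f}%" for x in increases))             # 55%, 38%, 63%, 136%, 35%
print(f"average: {sum(increases) / len(increases):.0f}%")    # about 65%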
Using the same profiles, the percent CPU utilization used for Content Index Service (CISVC) is compared. As expected, the profile that includes users submitting content makes far greater use of CISVC. Note how similar Chart 10 is to Chart 9, suggesting a close correlation between CISVC and total percent CPU utilization.
Chart 10: Comparing Content Index Service for two user profiles (4-proc server)
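The suggested correlation can be quantified roughly from the 4-proc measurements in Table 13 (total percent CPU versus the percent CPU consumed by CISVC at each load step). This sketch uses only the profile that includes submitters, not both profiles compared in the charts.

# Pearson correlation between total %CPU and CISVC %CPU for the 4-proc
# configuration, using the seven load steps reported in Table 13.
total_cpu = [2.265, 9.042, 15.830, 26.350, 49.880, 58.670, 92.952]
cisvc_cpu = [0.334, 1.914, 2.637, 4.301, 8.363, 9.501, 10.938]

n = len(total_cpu)
mean_x, mean_y = sum(total_cpu) / n, sum(cisvc_cpu) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(total_cpu, cisvc_cpu))
var_x = sum((x - mean_x) ** 2 for x in total_cpu)
var_y = sum((y - mean_y) ** 2 for y in cisvc_cpu)

print(f"correlation: {cov / (var_x ** 0.5 * var_y ** 0.5):.2f}")   # about 0.97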
The same pattern holds for context switching when comparing the two profiles. Large increases in context switching are attributed to rapid increases in CPU utilization because context switches consume CPU cycles.
Chart 11: Comparing context switching for 2 user profiles (4-proc server)
All counters noted can be found in PerfMon. The counters in the ASP and Web objects can be used to capture profile information, as well as usage trends.
Table 13: User profile test data for 1-proc, 2-proc, and 4-proc servers
1-proc | Low-end | Mid-line | High-end | |||
Users | 20+2 | 100+10+1 | 200+20+2 | 300+30+3 | 400+40+4 | 500+50+5 |
%proc | 10.020 | 36.035 | 60.527 | 85.672 | 99.620 | 100.000 |
Req/sec | 0.157 | 0.820 | 1.725 | 2.462 | 2.767 | 2.509 |
Req exec | 0.052 | 0.440 | 1.022 | 2.337 | 7.957 | 9.778 |
Req queued | 0.000 | 0.000 | 0.000 | 0.000 | 7.817 | 112.697 |
Req exec time (ms) | 374 | 364 | 438 | 881 | 4,355 | 4,883 |
Req wait time (ms) | 0.469 | 0.000 | 0.000 | 2.820 | 3,106 | 31,612 |
Context switches | 503 | 803 | 1,171 | 1,550 | 2,092 | 3,608 |
%proc CISVC | 0.299 | 1.353 | 2.450 | 3.249 | 4.260 | 4.006 |
Disk xfers/sec | 0.313 | 1.117 | 1.766 | 2.398 | 2.102 | 2.525 |
Submits/min | 0.353 | 2.250 | 3.750 | 6.000 | 4.000 | 4.000 |
2-proc | Low-end | Mid-line | High-end | |||
Users | 20+2 | 100+10+1 | 200+20+2 | 300+30+3 | 400+40+4 | 500+50+5 |
%proc | 3.848 | 15.300 | 35.460 | 52.583 | 78.131 | 98.623 |
Req/sec | 0.183 | 0.867 | 1.678 | 2.483 | 3.268 | 3.055 |
Req exec | 0.070 | 0.280 | 0.780 | 1.394 | 3.900 | 16.610 |
Req queued | 0.000 | 0.000 | 0.010 | 0.000 | 0.000 | 14.930 |
Req exec time (ms) | 310 | 316 | 420 | 451 | 894 | 5474 |
Req wait time (ms) | 0.000 | 0.160 | 1.100 | 0.636 | 2.060 | 5195.000 |
Context switches | 541 | 1,240 | 2,281 | 3,738 | 11,879 | 19,377 |
%proc CISVC | 0.328 | 1.583 | 4.153 | 4.909 | 8.072 | 8.636 |
Disk xfers/sec | 0.429 | 1.118 | 1.678 | 2.396 | 2.850 | 2.848 |
Submits/min | 0.842 | 2.000 | 4.000 | 7.000 | 7.000 | 4.800 |
4-proc | Low-end | Mid-line | High-end | ||||
Users | 20+2 | 100+10+1 | 200+20+2 | 300+30+3 | 400+40+4 | 500+50+5 | 600+60+6 |
%proc | 2.265 | 9.042 | 15.830 | 26.350 | 49.880 | 58.670 | 92.952 |
Req/sec | 0.156 | 0.879 | 1.772 | 2.590 | 3.389 | 4.304 | 3.495 |
Req exec | 0.000 | 0.350 | 0.596 | 1.110 | 3.380 | 5.890 | 30.677 |
Req queued | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 12.623 |
Req exec time (ms) | 428 | 404 | 357 | 382 | 693 | 1188 | 7966 |
Req wait time (ms) | 0.000 | 0.000 | 0.465 | 0.160 | 1.120 | 1.400 | 4,187 |
Context switches | 683 | 1,529 | 2,612 | 5,279 | 14,917 | 18,302 | 34,425 |
%proc CISVC | 0.334 | 1.914 | 2.637 | 4.301 | 8.363 | 9.501 | 10.938 |
Disk xfers/sec | 0.364 | 2.000 | 3.860 | 5.500 | 7.750 | 9.140 | 10.600 |
Submits/min | 0.273 | 1.092 | 1.884 | 2.020 | 2.487 | 3.222 | 3.013 |
Following are the test scripts used to calculate maximum load for each transaction in the CmSample site. Scripts are run using the InetMonitor simulator.
Award:
REM +++ Browse Awards section +++
USER Administrator hh1234
GET url:/cmsample/awardview.asp
Casestudy:
REM +++ Browse Case Study section +++
USER Administrator hh1234
GET url:/cmsample/casestudyview.asp
Company:
REM +++ Browse Company View section +++
USER Administrator hh1234
GET url:/cmsample/companyview.asp
Datasheet:
REM +++ Browse Datasheets section +++
USER Administrator hh1234
GET url:/cmsample/datasheetview.asp
Editorial Approve:
REM +++ Editorial Approval +++
USER Administrator hh1234
LOOP 1000
POST SEQULIST(cm_approve.txt)
ENDLOOP
LOOP 100000
SLEEP 1000
ENDLOOP
Editorial Delete:
REM +++ Editorial delete +++
USER Administrator hh1234
LOOP 1000
POST SEQULIST(cm_delete.txt)
ENDLOOP
LOOP 100000
SLEEP 1000
ENDLOOP
Editorial View:
REM +++ Editorial View +++
USER Administrator hh1234
GET url:/CmSample/approve.asp
Headlines:
REM +++ Browse Headlines section +++
USER Administrator hh1234
GET url:/cmsample/headlineview.asp
Jobpostings:
REM +++ Browse Job Postings section +++
USER Administrator hh1234
GET url:/cmsample/jobpostingview.asp
My Delete:
REM +++ Delete my documents +++
USER Administrator hh1234
LOOP 1000
POST SEQULIST(cm_mydelete.txt)
ENDLOOP
LOOP 100000
SLEEP 1000
ENDLOOP
My Edit Properties:
REM +++ Edit properties in my documents +++
USER Administrator hh1234
LOOP 1000
GET url:/CmSample/getprops.asp?CMOp=Edit&CMCurrentFile=c%3A%5Cmicrosoft+site+server%5Cdata
%5Cpublishing%5Ccmsample%5Cawards%5Chello%2Etxt%2Estub&CMCategory=Awards
&CMReturnPage=/cmsample/cpview.asp
GET url:/CmSample/awards.asp?CMOp=Edit&CMCurrentFile=c%3A%5Cmicrosoft+site+server%5Cdata
%5Cpublishing%5Ccmsample%5Cawards%5Chello%2Etxt%2Estub&CMCategory=Awards
&CMReturnPage=/cmsample/cpview.asp
GET url:/CmSample/filename.asp?CMOp=Edit&CMCurrentFile=c%3A%5Cmicrosoft+site+server%5Cdata
%5Cpublishing%5Ccmsample%5Cawards%5Chello%2Etxt%2Estub&CMCategory=Awards
&CMReturnPage=/cmsample/cpview.asp
GET url:/CmSample/saveprops.asp?Title=hello+there&Abstract=this+is+a+test%09++
&Topic=TT%3A%5CSampleSite%5CProducts%5CMoonWalker%7CMoon+Walker
&ContentAuthor=John+Smith&PresentedBy=John+Smith
&relatedurl=http%3A%2F%2Fwww.somewhere.com&CMCurrentFile=C%3A%5CMicrosoft+Site+Server
%5Cdata%5CPublishing%5CCmSample%5CUpload%5Chello2.txt&CMCategory=Awards
&CMReturnPage=submit.asp&CMOp=&CMRemainingList=&CMParamNames=&CMParamValues=
&CMForcePublish=
GET url:/CmSample/cpview.asp
ENDLOOP
LOOP 100000
SLEEP 1000
ENDLOOP
News:
REM +++ Browse News section +++
USER Administrator hh1234
GET url:/cmsample/news.asp
Press release:
REM +++ Browse Press Release section +++
USER Administrator hh1234
GET url:/cmsample/pressreleaseview.asp
Products:
REM +++ Browse Products section +++
USER Administrator hh1234
GET url:/cmsample/products.asp
Specifications:
REM +++ Browse Specifications section +++
USER Administrator hh1234
GET url:/cmsample/specificationview.asp
Submit:
REM +++ Submit docs to Headlines section +++
USER Administrator hh1234
LOOP 1000
GET SEQULIST(cm_type1.txt)
GET SEQULIST(cm_type2.txt)
GET SEQULIST(cm_getprop.txt)
GET SEQULIST(cm_headline.txt)
GET SEQULIST(cm_filename.txt)
GET SEQULIST(cm_saveprop.txt)
GET url:/CmSample/submit.asp
ENDLOOP
LOOP 100000
SLEEP 1000
ENDLOOP
View document:
REM +++ View document in awards section +++
USER Administrator hh1234
GET url:/CmSample/awards/hello.txt
White paper:
REM +++ Browse Whitepaper section +++
USER Administrator hh1234
GET url:/cmsample/whitepaperview.asp
Welcome:
REM +++ Home page +++
USER Administrator hh1234
GET url:/cmsample/default.asp
The following information is used by the CmSample transactions. Each section is a list that is accessed sequentially. Each line is identical except that the file name is incremented (from 1 to 1,000).
Cm_approve.txt:
url:/CmSample/Files.asp?PROPERTIES:Op=Approve&ReturnPage=%2FCmSample%2Fapprove.asp
&FilenameList=Headlines%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample
%5Capprove%5Ctest001.txt
url:/CmSample/Files.asp?PROPERTIES:Op=Approve&ReturnPage=%2FCmSample%2Fapprove.asp
&FilenameList=Headlines%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample
%5Capprove%5Ctest002.txt
url:/CmSample/Files.asp?PROPERTIES:Op=Approve&ReturnPage=%2FCmSample%2Fapprove.asp
&FilenameList=Headlines%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample
%5Capprove%5Ctest003.txt
. . .
Cm_delete.txt:
url:/CmSample/Files.asp?PROPERTIES:Op=Delete&ReturnPage=%2Fcmsample%2Fapprove.asp&FilenameList=
Headlines%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample%5Capprove
%5Ctest001.txt
url:/CmSample/Files.asp?PROPERTIES:Op=Delete&ReturnPage=%2Fcmsample%2Fapprove.asp&FilenameList=
Headlines%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample%5Capprove
%5Ctest002.txt
url:/CmSample/Files.asp?PROPERTIES:Op=Delete&ReturnPage=%2Fcmsample%2Fapprove.asp&FilenameList=
Headlines%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample%5Capprove
%5Ctest003.txt
. . .
Cm_filename.txt:
url:/CmSample/FileName.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test001%2Etxt
url:/CmSample/FileName.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test002%2Etxt
url:/CmSample/FileName.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test003%2Etxt
. . .
Cm_getprop.txt:
url:/CmSample/GetProps.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test001%2Etxt
url:/CmSample/GetProps.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test002%2Etxt
url:/CmSample/GetProps.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test003%2Etxt
. . .
Cm_headline.txt:
url:/CmSample/Headlines.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test001%2Etxt
url:/CmSample/Headlines.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test002%2Etxt
url:/CmSample/Headlines.asp?CMReturnPage=submit.asp&CMCategory=Headlines&CMCurrentFile=test003%2Etxt
. . .
Cm_mydelete.txt:
url:/CmSample/Files.asp?PROPERTIES:Op=Delete&ReturnPage=%2FCmSample%2Fcpview.asp&FilenameList=
Awards%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample%5Cawards
%5Ctest001.txt
url:/CmSample/Files.asp?PROPERTIES:Op=Delete&ReturnPage=%2FCmSample%2Fcpview.asp&FilenameList=
Awards%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample%5Cawards
%5Ctest002.txt
url:/CmSample/Files.asp?PROPERTIES:Op=Delete&ReturnPage=%2FCmSample%2Fcpview.asp&FilenameList=
Awards%7Cc%3A%5Cmicrosoft+site+server%5Cdata%5Cpublishing%5Ccmsample%5Cawards
%5Ctest003.txt
. . .
Cm_saveprop.txt:
url:/CmSample/SaveProps.asp?Description=New+stuff&Abstract=Lots+of+new+stuff&Topic=Widget+Maker
&ContentAuthor=John+Smith&Editor=John+Smith&CMCurrentFile=C%3A%5CMicrosoft+Site+Server%5Cdata
%5CPublishing%5CCmSample%5CUpload%5Ctest001.txt&CMCategory=Headlines&CMReturnPage=
submit.asp&CMOp=&CMRemainingList=&CMParamNames=&CMParamValues=&CMForcePublish=
url:/CmSample/SaveProps.asp?Description=New+stuff&Abstract=Lots+of+new+stuff&Topic=Widget+Maker
&ContentAuthor=John+Smith&Editor=John+Smith&CMCurrentFile=C%3A%5CMicrosoft+Site+Server%5Cdata
%5CPublishing%5CCmSample%5CUpload%5Ctest002.txt&CMCategory=Headlines&CMReturnPage=
submit.asp&CMOp=&CMRemainingList=&CMParamNames=&CMParamValues=&CMForcePublish=
url:/CmSample/SaveProps.asp?Description=New+stuff&Abstract=Lots+of+new+stuff&Topic=Widget+Maker
&ContentAuthor=John+Smith&Editor=John+Smith&CMCurrentFile=C%3A%5CMicrosoft+Site+Server%5Cdata
%5CPublishing%5CCmSample%5CUpload%5Ctest003.txt&CMCategory=Headlines&CMReturnPage=
submit.asp&CMOp=&CMRemainingList=&CMParamNames=&CMParamValues=&CMForcePublish=
. . .
Cm_type1.txt:
url:/CmSample/Type.asp?CMCategory=&CMReturnPage=submit%2Easp&CMForcePublish=&CMFilename=
C:\InetMonitor\cm\files\test001.txt
url:/CmSample/Type.asp?CMCategory=&CMReturnPage=submit%2Easp&CMForcePublish=&CMFilename=
C:\InetMonitor\cm\files\test002.txt
url:/CmSample/Type.asp?CMCategory=&CMReturnPage=submit%2Easp&CMForcePublish=&CMFilename=
C:\InetMonitor\cm\files\test003.txt
. . .
Cm_type2.txt:
url:/CmSample/Type.asp?Op=&CMCategory=Headlines&CMCurrentFile=test001.txt&CMReturnPage=submit.asp
url:/CmSample/Type.asp?Op=&CMCategory=Headlines&CMCurrentFile=test002.txt&CMReturnPage=submit.asp
url:/CmSample/Type.asp?Op=&CMCategory=Headlines&CMCurrentFile=test003.txt&CMReturnPage=submit.asp
. . .
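Lists such as these can be generated with a short script. The following Python sketch reproduces the Cm_filename.txt pattern shown above; the other lists follow the same approach with their own URL templates.

# Generate a sequential list file (1,000 entries, file name incremented on
# each line). The URL template is copied from the Cm_filename.txt example.
TEMPLATE = ("url:/CmSample/FileName.asp?CMReturnPage=submit.asp"
            "&CMCategory=Headlines&CMCurrentFile=test{num:03d}%2Etxt")

with open("cm_filename.txt", "w") as f:
    for i in range(1, 1001):
        f.write(TEMPLATE.format(num=i) + "\n")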
The following scripts are used to test the three user profiles discussed in Chapter 1.
Scenario Admin:
REM +++ SiteServer Publishing: CmSample Admin +++
USER Administrator hh1234
GET url:/SiteServer/Admin/Publishing/Default.asp
GET url:/SiteServer/Admin/Titlebar.asp
SLEEP RANDNUMBER(30000,90000)
GET url:/SiteServer/Admin/Publishing/Menu.asp
SLEEP RANDNUMBER(30000,90000)
GET url:/SiteServer/Admin/Publishing/GetStarted.asp
SLEEP RANDNUMBER(30000,90000)
GET url:/SiteServer/Admin/Publishing/SvrProp.asp
SLEEP RANDNUMBER(30000,90000)
GET url:/SiteServer/Admin/Publishing/SvrLog.asp?crsServerName=pt022
SLEEP RANDNUMBER(30000,90000)
Scenario Browse:
REM +++ SiteServer Publishing: CmSample Browse +++
USER Administrator hh1234
GET url:/cmsample/default.asp
SLEEP RANDNUMBER(60000,180000)
GET url:/cmsample/companyview.asp
SLEEP RANDNUMBER(60000,180000)
GET url:/cmsample/datasheetview.asp
SLEEP RANDNUMBER(60000,180000)
GET url:/CmSample/datasheets/widgetoverview.doc
SLEEP RANDNUMBER(60000,180000)
GET url:/cmsample/headlineview.asp
SLEEP RANDNUMBER(60000,180000)
GET url:/cmsample/jobpostingview.asp
SLEEP RANDNUMBER(60000,180000)
GET url:/CmSample/jobpostings/engineer.doc
SLEEP RANDNUMBER(60000,180000)
GET url:/CmSample/specificationview.asp
SLEEP RANDNUMBER(60000,180000)
GET url:/CmSample/specifications/architecture.doc
SLEEP RANDNUMBER(60000,180000)
GET url:/cmsample/whitepaperview.asp
SLEEP RANDNUMBER(60000,180000)
Scenario Submit:
REM +++ SiteServer Publishing: CmSample Submit +++
USER Administrator hh1234
SLEEP RANDNUMBER(30000,90000)
GET url:/cmsample/submit.asp
SLEEP RANDNUMBER(30000,90000)
GET SEQULIST(cm_type1.txt)
SLEEP RANDNUMBER(30000,90000)
GET SEQULIST(cm_type2.txt)
GET SEQULIST(cm_getprop.txt)
SLEEP RANDNUMBER(30000,90000)
GET SEQULIST(cm_headline.txt)
GET SEQULIST(cm_filename.txt)
GET SEQULIST(cm_saveprop.txt)
GET url:/CmSample/submit.asp
SLEEP RANDNUMBER(30000,90000)
Information in this document, including URL and other Internet web site references, is subject to change without notice. The entire risk of the use or the results of the use of this resource kit remains with the user. This resource kit is not supported and is provided as is without warranty of any kind, either express or implied. The example companies, organizations, products, people and events depicted herein are fictitious. No association with any real company, organization, product, person or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.
© 1999-2000 Microsoft Corporation. All rights reserved.
Microsoft, ActiveX, Windows and Windows NT are either registered trademarks or trademarks of Microsoft Corporation in the U.S.A. and/or other countries/regions.
The names of actual companies and products mentioned herein may be the trademarks of their respective owners.