Recommended Workloads
Given the evolving nature of the Internet and of Web servers, it is difficult to determine an exact workload. Some quantitative studies evaluate the load and traffic on servers, but exact workload scenarios remain hard to emulate. You can, however, use the Web Capacity Analysis Tool (WCAT) to emulate repetitious page fetches in a random, distributed fashion.
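If WCAT is not at hand, a small client of your own can approximate the same behavior. The following Python sketch (not WCAT itself) repeatedly fetches pages from several worker threads, choosing a page at random on each request so the load is spread across the document set; the host name and page paths are hypothetical placeholders for your own test content.

```python
# Generic load-generation sketch (not WCAT): repeatedly fetch pages from
# multiple threads, picking a page at random so requests are distributed
# across the document set. Host name and paths are hypothetical.
import random
import threading
import urllib.request

BASE_URL = "http://testserver.example.com"               # hypothetical test server
PAGES = ["/1k.htm", "/8k.htm", "/64k.htm", "/512k.htm"]  # sample static pages
REQUESTS_PER_WORKER = 100
WORKERS = 8

def worker() -> None:
    for _ in range(REQUESTS_PER_WORKER):
        page = random.choice(PAGES)                      # random, distributed fetch
        with urllib.request.urlopen(BASE_URL + page) as resp:
            resp.read()                                  # drain the response body

threads = [threading.Thread(target=worker) for _ in range(WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```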
It is wise to model workloads on data from many different real-world Web servers, including the HTTP proxy servers used by corporations. Gathering this information from access servers is logical because it represents the client end of access to HTTP servers. Determine the workload mix based on your server's anticipated use.
The following is a rough outline of a recommended workload:
- A series of page transfers of files of different sizes, ranging from 512 bytes through 1 MB.
- A weighted mix of page transfers with file sizes from 512 bytes through 1 MB, in single and multiple directories. The workload can vary in the size of the data set present, ranging from 0.5 MB to 200 MB (see the sketch following this list).
- A mix of pages with files and ISAPI transfers.
- A mix of pages with files and CGI transfers.
- A mix of pages with files, ISAPI, and CGI transfers.
- ISAPI with ODBC and database integration.
- SSL and Private Communication Technology (PCT) encrypted data transfers using the above combinations.
- Persistent-connection (HTTP keep-alive) page transfers using the above combinations.
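One way to approximate the weighted-mix and persistent-connection items is sketched below in Python: a single HTTP/1.1 keep-alive connection is reused while file sizes are selected according to weights. The host name, file names, and weights are hypothetical placeholders to adapt to your own data set; swapping HTTPConnection for HTTPSConnection would exercise the SSL case in the same way.

```python
# Sketch of a weighted workload mix over a persistent (keep-alive) connection.
# File names and relative weights are hypothetical; small pages are weighted
# more heavily, as access logs typically suggest.
import random
import http.client

HOST = "testserver.example.com"                      # hypothetical test server
# (path, relative weight) pairs spanning roughly 512 bytes through 1 MB
MIX = [("/512b.htm", 50), ("/8k.htm", 30), ("/64k.htm", 15), ("/1m.htm", 5)]
PATHS = [p for p, _ in MIX]
WEIGHTS = [w for _, w in MIX]

conn = http.client.HTTPConnection(HOST)              # one keep-alive connection
for _ in range(200):
    path = random.choices(PATHS, weights=WEIGHTS, k=1)[0]
    conn.request("GET", path)
    conn.getresponse().read()                        # read fully before reusing the connection
conn.close()
```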