Managing the Migration Process
When the detailed design is complete, you’re ready to validate it. Validating the design consists of performing unit, integration, and application testing to ensure that the proposed system functions as expected and required. This testing is usually conducted in a lab environment similar to the one depicted in “Assess Resource Needs” earlier in this section.
Testing can be divided into four basic categories:
The testing process is iterative rather than linear. For example, you might discover a problem during an integration test, correct it, test the correction, and then return to unit testing to find out whether the correction has changed your results. In addition, you can often run two or more test procedures concurrently in order to complete testing sooner.
For more information about testing and debugging applications, see “Developing Web Applications” in this book. In addition, see test planning information at http://www.microsoft.com/msf/.
For more information about improving server performance, see “Monitoring and Tuning Your Server” in this book. To develop your own test plans, you can use the “Sample Test Plan” found in the Testplan.doc file on the Resource Kit companion CD.
Testing at microsoft.com
The microsoft.com Web site provides a good example of the testing process for new servers and applications. Before adding a new IIS 5.0 production Web server, the microsoft.com team monitors and tunes a replica of the new server in a lab environment for three months, using a hardware and software configuration that exactly replicates the production Web server that will exist on the microsoft.com Web site. Next, the team moves the server to the Microsoft intranet for a two-week period of further evaluation and tuning. After the tuning period, the team replicates it to a non-mission-critical server on microsoft.com, running it in debug mode for one or two days to resolve any new problems that arise. Finally, the team replicates the server to the actual production hardware and makes it available site-wide.

During the entire test process, the team monitors server performance continuously by using the HTTP Monitoring Tool, and applies the optimization techniques described in a white paper available at http://www.microsoft.com/ms.htm.

Another important concern for the microsoft.com team is ASP performance. To ensure a good user experience, the test team inspects all new ASP pages before they’re published on the Microsoft Web site, noting the number of elements within each page that have the potential to block performance, including objects such as Microsoft® ActiveX® Data Objects (ADO) and the ASP FileSystemObject. The test team then uses WCAT to test all pages that contain more than a certain threshold number of potentially troublesome elements, scoring each page on Requests per Second, Maximum Response Time, and Average Response Time. If a page receives a poor score, the test team requires the page’s owner to make improvements that enhance performance.
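A simple script can approximate the inspection step described above. The following sketch is written in Python purely for illustration; the object list and the threshold value are assumptions made for the example, not the microsoft.com team’s actual criteria. It scans a directory of .asp files, counts references to potentially blocking objects such as ADO and FileSystemObject, and flags pages that exceed the threshold as candidates for a WCAT run.

    # Illustrative sketch only: flags ASP pages with many potentially
    # blocking elements. The object names and the threshold are
    # assumptions for the example, not the team's actual criteria.
    import re
    from pathlib import Path

    # Object references treated here as potentially blocking (assumed list).
    BLOCKING_PATTERNS = [
        re.compile(r"ADODB\.\w+", re.IGNORECASE),                   # ADO objects
        re.compile(r"Scripting\.FileSystemObject", re.IGNORECASE),  # FileSystemObject
    ]

    THRESHOLD = 3  # hypothetical cutoff; pages above it get a WCAT test

    def count_blocking_elements(text):
        """Count occurrences of potentially blocking object references."""
        return sum(len(pattern.findall(text)) for pattern in BLOCKING_PATTERNS)

    def pages_to_test(root):
        """Return (path, count) for every ASP page over the threshold."""
        flagged = []
        for page in Path(root).rglob("*.asp"):
            count = count_blocking_elements(page.read_text(errors="ignore"))
            if count > THRESHOLD:
                flagged.append((str(page), count))
        return sorted(flagged, key=lambda item: item[1], reverse=True)

    if __name__ == "__main__":
        for path, count in pages_to_test("./wwwroot"):
            print(f"{count:3d} blocking elements  {path}")

A script like this only identifies which pages to submit for load testing; the scores themselves (Requests per Second, Maximum Response Time, and Average Response Time) come from the WCAT runs described above.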