How We Conducted the Tests
We ran the Visual Studio benchmark tests on a Pentium II/400 with 128 MB of RAM.

We installed two copies (dual boot) of Windows NT Server 4.0 with Service Pack 3. On the first copy of NT, we installed Visual Studio 97 (the 5.0 tools) and Visual Studio 97 SP3; on the other, we installed Visual Studio 6.0 (release version). We used the GetTickCount API to time each benchmark, and any preparatory work a benchmark needed was excluded from its timed result. To keep the tests fair, we disabled all services that would normally start automatically (such as IIS) so that background CPU usage stayed low. We then ran each test 10 times and used the average of those 10 runs as the basis for the benchmarks presented here, which further mitigated the effect of any remaining background activity.
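The sketch below shows the general shape of this timing approach. It is only an illustration; PrepareBenchmark and RunBenchmark are hypothetical stand-ins for the actual test code, which is not reproduced here. Preparatory work runs outside the timed region, the timed body executes 10 times, and the average is reported.

#include <windows.h>
#include <stdio.h>

// Hypothetical stand-ins for the real test code, which is not shown here.
void PrepareBenchmark() { /* preparatory work, excluded from the timing */ }
void RunBenchmark()     { Sleep(50); /* placeholder workload */ }

int main()
{
    const int RUNS = 10;
    DWORD total = 0;

    PrepareBenchmark();                  // setup is not counted

    for (int i = 0; i < RUNS; i++)
    {
        DWORD start = GetTickCount();    // milliseconds of system uptime
        RunBenchmark();
        total += GetTickCount() - start; // elapsed time for this run
    }

    printf("Average: %lu ms over %d runs\n", total / RUNS, RUNS);
    return 0;
}

Built as a Win32 console program, this gives millisecond readings with the roughly 10-15 ms resolution of GetTickCount, which is adequate for tests that run for seconds.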

The database server was another Pentium II/400 with 128 MB of RAM, running SQL Server 7.0 (beta). To reduce the effect of caching performed by the data access drivers, we ran each database test only once, after a fresh reboot of the client machine. For example, the first run of the DAO test took roughly 100 milliseconds to establish a connection; subsequent runs established one almost instantly. Caching by data access drivers can benefit some real-world applications, but it tells you little about how the tools perform relative to one another, and averaging cached and uncached runs would skew the results because you wouldn't be measuring the same thing every time.
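To illustrate how connection-establishment time can be isolated from the rest of a database test, the sketch below times a single connection with GetTickCount. It uses ODBC's SQLDriverConnect purely as an example of a timed connect; it is not the DAO test itself, and the server name, database, and credentials in the connection string are hypothetical.

#include <windows.h>
#include <sql.h>
#include <sqlext.h>
#include <stdio.h>

// Link with odbc32.lib.
int main()
{
    SQLHENV env = SQL_NULL_HENV;
    SQLHDBC dbc = SQL_NULL_HDBC;

    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);

    // Hypothetical connection string; substitute your own server and database.
    SQLCHAR connStr[] =
        "DRIVER={SQL Server};SERVER=dbserver;DATABASE=pubs;UID=sa;PWD=;";

    // Time only the connection handshake -- the part that driver caching
    // makes nearly free on every run after the first.
    DWORD start = GetTickCount();
    SQLRETURN rc = SQLDriverConnect(dbc, NULL, connStr, SQL_NTS,
                                    NULL, 0, NULL, SQL_DRIVER_NOPROMPT);
    DWORD elapsed = GetTickCount() - start;

    if (SQL_SUCCEEDED(rc))
    {
        printf("Connection established in %lu ms\n", elapsed);
        SQLDisconnect(dbc);
    }
    else
    {
        printf("Connection failed\n");
    }

    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}

Run immediately after a reboot, a program like this shows the full connection cost; run again without rebooting, the reported time drops to nearly zero, which is why we rebooted before each database measurement.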