The metric published for this test (transactions per second) established a baseline measurement, a necessary task for any performance evaluation. But this metric gave PT application testers no way of knowing how far along the data path each transaction traveled. Some transactions remained on the client (load-generating) computers only, others traveled between those computers and IIS on the server computer, and still others spanned the entire data path to the database on the server computer. In other words, the metric provided too little detail to help developers improve the performance of the PT application.
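As a rough illustration of how such a baseline figure can be derived, the following sketch totals completed transactions over the elapsed time of a run. It is a hypothetical TypeScript example; the record structure and sample figures are assumptions, not part of the PT test harness.

```typescript
// Hypothetical sketch: computing a transactions-per-second baseline from
// recorded transaction completion times. Names and data are illustrative,
// not taken from the PT application's actual test tools.

interface TransactionRecord {
  startMs: number; // wall-clock start of the transaction (ms)
  endMs: number;   // wall-clock end of the transaction (ms)
}

// Overall throughput: completed transactions divided by the elapsed
// time of the whole run, in seconds.
function transactionsPerSecond(records: TransactionRecord[]): number {
  if (records.length === 0) return 0;
  const runStart = Math.min(...records.map(r => r.startMs));
  const runEnd = Math.max(...records.map(r => r.endMs));
  const elapsedSeconds = (runEnd - runStart) / 1000;
  return elapsedSeconds > 0 ? records.length / elapsedSeconds : 0;
}

// Example: three transactions completed over a two-second window -> 1.5 TPS.
const sample: TransactionRecord[] = [
  { startMs: 0, endMs: 800 },
  { startMs: 500, endMs: 1600 },
  { startMs: 900, endMs: 2000 },
];
console.log(transactionsPerSecond(sample)); // 1.5
```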
In the future, testers hope to measure performance by tier or even at specific points on the data path. For the PT application, this path involves the following technologies, starting at the client computer and ending with the server computer: HTTP, JavaScript, XML, XML Data Source Object (XMLDSO), IIS and ASP, CSS, RDS (part of ADO), and Microsoft SQL Server.
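One way testers might eventually take measurements at specific points on this path is to record elapsed time at the client while the server tier reports its own processing time back to the caller. The sketch below assumes a response header named x-server-ms that the server code would have to set itself; it is an illustration, not an existing IIS or ASP feature.

```typescript
// Hypothetical sketch: split a transaction's total time into server
// processing time (reported by the server in an assumed "x-server-ms"
// header) and the remainder, a rough estimate of network plus client work.

async function timeRequest(
  url: string
): Promise<{ totalMs: number; serverMs: number; clientAndNetworkMs: number }> {
  const start = Date.now();
  const response = await fetch(url);
  await response.text(); // force the response body to be read
  const totalMs = Date.now() - start;

  // The server tier would need to set this header itself; it is not
  // provided automatically by IIS or any other web server.
  const serverMs = Number(response.headers.get("x-server-ms") ?? 0);

  return { totalMs, serverMs, clientAndNetworkMs: totalMs - serverMs };
}
```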
The first goal of future testing would thus be to identify the technologies that consume the most time while processing transactions. That time may not indicate a bottleneck; it may simply be data transfer from SQL Server, for example. In that case, developers might decide to optimize data requests or increase the SQL Server data cache. Doing so may minimize or prevent what could become bottlenecks at higher workloads. Another possible technique would be to run SQL Server and IIS on physically separate computers.
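If per-tier timings were collected, a simple aggregation could show which technology accounts for the most time. The sketch below is hypothetical; the tier names and figures are placeholders rather than measured results.

```typescript
// Hypothetical sketch: given per-tier timings collected for many transactions,
// total the time spent in each tier and rank the tiers from slowest to fastest.

type TierTimings = Record<string, number>; // tier name -> milliseconds

function rankTiersByTime(transactions: TierTimings[]): [string, number][] {
  const totals: Record<string, number> = {};
  for (const t of transactions) {
    for (const [tier, ms] of Object.entries(t)) {
      totals[tier] = (totals[tier] ?? 0) + ms;
    }
  }
  // Sort descending so the most time-consuming tier comes first.
  return Object.entries(totals).sort((a, b) => b[1] - a[1]);
}

// Example: SQL Server data transfer dominates, suggesting where to look first.
console.log(rankTiersByTime([
  { client: 20, iisAsp: 35, sqlServer: 120 },
  { client: 25, iisAsp: 40, sqlServer: 140 },
]));
// [["sqlServer", 260], ["iisAsp", 75], ["client", 45]]
```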
Currently, most test tools cannot pinpoint the location of bottlenecks along this path, although some may soon have that ability.
The PT application will likely include a User application to accompany the Admin application. Although the two will have similar designs, they will not be identical (the User application will probably include role-based security), so future tests should be able to contrast the performance of each.
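Such a comparison might be expressed as a relative throughput difference between runs of the two applications against the same workload. The sketch below uses placeholder figures only.

```typescript
// Hypothetical sketch: contrasting Admin and User application results from
// the same workload. The numbers here are placeholders, not measured data.

interface RunResult {
  app: string;
  transactionsPerSecond: number;
}

// Report the comparison run's throughput relative to the baseline run.
function contrast(baseline: RunResult, comparison: RunResult): string {
  const delta =
    ((comparison.transactionsPerSecond - baseline.transactionsPerSecond) /
      baseline.transactionsPerSecond) * 100;
  return `${comparison.app} vs ${baseline.app}: ${delta.toFixed(1)}% throughput difference`;
}

console.log(contrast(
  { app: "Admin", transactionsPerSecond: 50 },
  { app: "User", transactionsPerSecond: 46 },
)); // "User vs Admin: -8.0% throughput difference"
```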