The current iteration was dedicated to clean-up and performance improvements. We released TP 2.1 and TP 2.2, so it was time to refactor some areas and improve performance.
The main change was the removal of Ayende Generics. Combined with several other targeted improvements, this gave a significant performance boost.
Test Configuration
The test configuration is not an enterprise setup, but rather a small-to-average company setup. The database and application are on the same server, which is a bad idea for a large number of concurrent users. In fact, if your target is about 100 concurrent users in TargetProcess, you should set up the database and application on separate servers.
The tests themselves are fairly extreme. There are three user types: Developer, QA, and Project Manager. QA users browse and add bugs; Developers add comments and time records and browse user stories and ToDo lists; PMs browse various reports. The test user population is about 20% PMs, 40% QA, and 40% Developers.
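The role mix above can be sketched as a simple weighted assignment of virtual users. This is a hypothetical illustration, not the actual load-test script; the role names, weights, and action lists come from the description above, while the function names are made up for this sketch.

```python
import random

# User mix from the test description: 20% PM, 40% QA, 40% Developer.
ROLE_WEIGHTS = {"PM": 0.2, "QA": 0.4, "Developer": 0.4}

# Actions each role performs, per the scenario description.
ROLE_ACTIONS = {
    "QA": ["browse bugs", "add bug"],
    "Developer": ["add comment", "add time record",
                  "browse user stories", "browse ToDo"],
    "PM": ["browse reports"],
}

def assign_roles(num_users, weights=ROLE_WEIGHTS, seed=42):
    """Randomly assign a role to each virtual user according to the mix."""
    rng = random.Random(seed)
    roles = list(weights)
    probs = list(weights.values())
    return [rng.choices(roles, weights=probs)[0] for _ in range(num_users)]

if __name__ == "__main__":
    users = assign_roles(100)
    for role in ROLE_WEIGHTS:
        print(role, users.count(role))
```

A real load-test tool would then have each virtual user loop over its role's actions with think time between requests.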
The test duration is 10 minutes, during which about 350 bugs and 400 time records are added to the database (quite a lot, as you can imagine).
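A quick back-of-the-envelope calculation shows the write rate these numbers imply (the figures are taken directly from the test description above):

```python
# ~350 bugs and ~400 time records created during a 10-minute run.
DURATION_MIN = 10
BUGS = 350
TIME_RECORDS = 400

writes_per_min = (BUGS + TIME_RECORDS) / DURATION_MIN  # 75.0
writes_per_sec = writes_per_min / 60                   # 1.25

print(f"{writes_per_min:.0f} writes/min, {writes_per_sec:.2f} writes/sec")
```

So the workload sustains roughly 75 new entities per minute, on top of all the read/browse traffic.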
Server configuration: AMD Athlon 64 X2 3800+ CPU, 2 GB RAM.
Results
The chart below shows average page time (in seconds) for TP 2.1, TP 2.2, and TP 2.3.
The chart demonstrates the greater scalability of TP 2.3: response times for 10, 20, and 30 users are almost the same.
TP 2.1 and TP 2.2 did not work with 50 or 100 concurrent users (CPU usage hit 100% and the server could barely process requests). TP 2.3 handles 50 and 100 concurrent users without much trouble.
CPU usage is quite good, bearing in mind that the database and IIS are on the same server.
We also measured average CPU usage on the application server with the database and IIS on separate servers: it is about 50% for 100 concurrent users.