Following a load test, I often need to perform some additional analysis on the web server logs. It is not practical to use any commercial tools, and the free tools are all aimed at people who want graphs and statistics for entire weeks or months, rather than a few hours of high load. So far, I have usually been forced into a roll-your-own solution.
While it is easy to create graphs with Microsoft Excel, its 65,536-row limit makes analysing any non-trivial load futile unless the log files have been filtered before importing them. Even with filtering, it is hard to do anything useful with such a small number of records.
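As a minimal sketch of that pre-filtering step, the following keeps only the lines of a combined-format access log that fall inside the load-test window. The timestamp layout and the parsing approach are assumptions about a typical Apache-style log; adjust them to match your server's configuration.

```python
from datetime import datetime

# Timestamp format used inside the [...] field of a combined-format
# access log, e.g. [10/Oct/2006:13:55:36 +0000]. This is an assumption;
# check it against your own log files.
LOG_TIMESTAMP = "%d/%b/%Y:%H:%M:%S"

def in_window(line, start, end):
    """Return True if the log line's timestamp lies in [start, end)."""
    try:
        # Take the text between '[' and the first space: the timestamp
        # without its timezone offset.
        stamp = line.split("[", 1)[1].split()[0]
        when = datetime.strptime(stamp, LOG_TIMESTAMP)
    except (IndexError, ValueError):
        return False  # malformed line: drop it
    return start <= when < end

def filter_log(lines, start, end):
    """Keep only the lines whose timestamp falls in the test window."""
    return [line for line in lines if in_window(line, start, end)]
```

The result of `filter_log(open("access.log"), start, end)` can then be written out and imported into Excel without hitting the row limit, provided the window is short enough.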
Microsoft’s Log Parser tool allows you to perform SQL queries on web log files. If you know exactly what you want, this is an extremely powerful tool. Having a large amount of memory will dramatically improve query times for large files, as will truncating the log files to just the period you are interested in. I found that debugging my queries could be a little painful.
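For example, a query along these lines counts hits per minute, using Log Parser's QUANTIZE function to bucket timestamps into 60-second intervals. This is a sketch rather than a tested command: the input format switch and field names (`date`, `time`, `ex*.log`) assume W3C extended logs and may need adjusting for your setup.

```shell
# Hits per minute from W3C extended log files (sketch; use -i:IISW3C
# for IIS's own log format, and adjust field names to match your logs).
LogParser.exe -i:W3C "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 60) AS Minute, COUNT(*) AS Hits FROM ex*.log GROUP BY Minute ORDER BY Minute"
```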
The industrial strength solution is always to just dump the log files into a database and analyse them there.
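A minimal sketch of that approach with SQLite, which is enough for one-off analysis. The schema and the pre-parsed records are my own invention for illustration; in practice you would parse each log line into a row first.

```python
import sqlite3

def load_and_count(records):
    """Load (minute, url, status) tuples into an in-memory SQLite
    database and return hits per minute, letting SQL do the work."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE hits (minute TEXT, url TEXT, status INTEGER)")
    db.executemany("INSERT INTO hits VALUES (?, ?, ?)", records)
    return db.execute(
        "SELECT minute, COUNT(*) FROM hits GROUP BY minute ORDER BY minute"
    ).fetchall()
```

Once the rows are in, any ad-hoc question (slowest URLs, error rates per minute, and so on) is just another query, which is the real advantage over spreadsheet-based analysis.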
Maybe you’ll also want to try Webalizer.
Thanks for your comment.
Webalizer is one of the pieces of software I was referring to when I said “…and the free tools are all aimed at people who want graphs and statistics for entire weeks or months, rather than a few hours of high load”.
Webalizer can be configured to display an hourly report, but it shows statistics for the entire hour rather than the hits for every minute within the hour.
I use Webalizer to analyse the traffic to my own website, but it is not the right tool for someone who is doing a load test.