Getting at Requests Per Second (RPS)

Part 3 of the Demystifying SharePoint Performance Management series


by Paul Culmsee on 11/26/2014




Hi and welcome back to this series aimed at making SharePoint performance management a little more digestible. In my first article in this series, we examined the difference between lead and lag indicators, and in my second article we looked specifically at the lead indicator of Requests Per Second (RPS) and its various opportunities and issues. In this article we are actually going to do some real work at – wait for it – the command line! As a result, the collective heart rates of my business-oriented readers – avid users of the “I’m business, not technical” cliché – will start to rise, since anything that involves a command line is shrouded in mystique, fear, uncertainty and doubt.

For the tech types reading this article, please excuse the verbosity of what I go through here. I need to keep the business types from freaking out.

Okay… so in the last article I said that despite its issues in terms of being a reliable indicator of future performance needs, RPS has the advantage that it can be derived from your existing deployment. This is because the information needed is captured in web server (IIS) logs over time. Having this past performance means you have a lag indicator view of RPS, which can be used as a basis to understand what the future might look like with more confidence than some arbitrary “must handle x RPS.”

Now just because RPS is held inside web server log files, that does not mean it is easy to get to. In this post, I will outline the three steps needed to manipulate log files to extract that precious RPS goodness. The utility that we are going to use to do this is Log Parser.

Now a warning here: this article assumes your existing deployment runs on Microsoft’s IIS platform version 7 (the web server platform that underpins SharePoint 2010) or newer. If you are running one of the myriad other portal/intranet platforms out there, you will have to treat this post as a guide and adjust it to your circumstances.

Step 1: Getting Log Parser

Installing Log Parser is easy. Just install version 2.2 as you would any other tool. It will run on pretty much any Windows operating system. Once installed, it will likely reside in the C:\Program Files (x86)\Log Parser 2.2 folder. (Or C:\Program Files\Log Parser 2.2 if you have an older, 32-bit PC).

There you go business types – that wasn’t so hard, was it?

Step 2: Getting your web server logs

After the relative ease of getting Log Parser installed, we now need the logs themselves to play with. We are certainly not going to mess with a production system so we will need to copy the log files for your current portal to the PC where you installed Log Parser. If you do not have access to these log files, call your friendly neighborhood systems administrator to get them for you. If you have access (or do not have a friendly neighborhood systems administrator), then you will need to locate the files you need. Here’s how:

Assuming you have access to your web front-end server(s), you can load Internet Information Services (IIS) Manager from Start->All Programs->Administrative Tools on the server. Using this tool we can find out the location of the IIS log files, as well as the specific logs we need. By default, IIS logs are stored in C:\inetpub\logs\LogFiles, but it is common for this location to have been changed. To confirm it in IIS Manager, click on the server name in the left pane, then click on the “Logging” icon in the right pane. In the example below, we can see that the IIS log files live in the G:\LOGS\IIS folder (I always move the log files off C:\ as a matter of principle). While you are there, pay special attention to the fairly nondescript “Use local time for file naming and rollover” tick box. We are going to return to that later…

Click Logging to determine the location of log files

The directory of the IIS log files

Okay, so we know where the log files live; now let’s work out the subfolder for the specific site. Back in the left-hand pane, expand “Sites” and find the web site you want to profile for RPS. When you have found it, select it, then find the “Advanced Settings” link and click it.

Click Advanced Settings to find the web site you want to profile for RPS

On the next screen you will see the ID of the site. It will be a large number – something like 1045333159. Take note of this ID, because all IIS logs for this site will be stored in a folder with “W3SVC” prepended to this ID (e.g., W3SVC1045333159). Thus the folder we are looking for is G:\LOGS\IIS\W3SVC1045333159.

Copy the contents of this folder to the computer where you have installed Log Parser. (In my example below, I copied the logs to E:\LOGS\IIS_WWW\W3SVC1045333159 on a test server.)

The ID helps you figure out which folder IIS logs for this site are stored in

Use a folder on your local machine to store the IIS log files for the site you want to analyze
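If you prefer scripting to clicking, the folder-naming rule above is easy to sketch. Here is a small illustrative Python snippet (the log root and site ID are the example values from the screenshots – substitute your own):

```python
from pathlib import Path

# Hypothetical values for illustration -- substitute your own log root
# (from the Logging screen) and site ID (from Advanced Settings).
log_root = Path(r"G:\LOGS\IIS")
site_id = 1045333159

# IIS names each site's log folder "W3SVC" followed by the site ID.
site_folder = log_root / f"W3SVC{site_id}"
print(site_folder.name)  # W3SVC1045333159

# These are the .log files to copy to the machine running Log Parser.
for log_file in sorted(site_folder.glob("*.log")):
    print(log_file.name)
```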

Step 3: Preparing the log files

Now that we have our log files copied to our PC, we can start doing some Log Parser magic. Unfortunately, the default IIS log file format does not make RPS reporting particularly easy, so we have to process the raw logs into a file that is easier to work with. Now business people – stay with me here… the payoff is worth the command line pain you are about to endure! :-)

First up, we will make use of the excellent work of Mike Wise (you can find his original document here), who created a script for Log Parser that processes all of the log files and creates a single (potentially very large) file that:

  • Includes a new field which is the time of the day converted into seconds.
  • Splits the date and timestamp up into individual bits (day, month, hour, minute, etc.). This makes it easier to do consolidated reports.
  • Excludes 401 authentication requests. (Way back in part 1 I noted that Microsoft excludes authentication traffic from RPS.)
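To make that transformation concrete, here is a rough Python sketch of what happens to each log entry (the log line is a made-up example – real IIS entries carry many more fields – but the seconds/minutes arithmetic and the 401 filter mirror the script):

```python
from datetime import datetime

# A hypothetical, cut-down IIS W3C log line for illustration
# (fields: date time cs-method cs-uri-stem sc-status).
line = "2011-11-15 14:22:07 GET /sites/intranet/default.aspx 200"

date_s, time_s, method, uri, status = line.split()

if int(status) != 401:  # exclude authentication requests, as the script does
    ts = datetime.strptime(f"{date_s} {time_s}", "%Y-%m-%d %H:%M:%S")
    secs = ts.hour * 3600 + ts.minute * 60 + ts.second  # second of the day
    minu = ts.hour * 60 + ts.minute                     # minute of the day
    print(secs, minu)  # 51727 862
```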

I have pasted a modified version of Mike’s Log Parser script below, but before you go and copy it into Notepad, make sure you check two really important things.

  1. Be sure to change the path in the second last line of the script to the folder where you copied your IIS logs to. In my case it was E:\LOGS\IIS_WWW\W3SVC1045333159\*.log.
  2. Check whether IIS is saving your log files using UTC timestamps or local timestamps. (Now you know why I told you to make special note of the “Use local time for file naming and rollover” tick box earlier.) If the box is unticked, the logs are in UTC time and you should use the first script pasted below. If it is ticked, the logs are in local time and the second script should be used.
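Why does the tick box matter? If the logs are in UTC, every timestamp has to be shifted to local time before the seconds arithmetic means anything, which is what to_localtime() does in the first script. A rough Python equivalent of that shift (the Perth time zone is just an assumption for illustration – use whatever zone your server lives in):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A UTC timestamp as IIS would log it when the tick box is unticked.
utc_ts = datetime(2011, 11, 15, 0, 35, 34, tzinfo=timezone.utc)

# Shift to server-local time, as to_localtime() does in the UTC script.
# "Australia/Perth" (UTC+8, no DST) is an illustrative assumption.
local_ts = utc_ts.astimezone(ZoneInfo("Australia/Perth"))
print(local_ts.strftime("%Y-%m-%d %H:%M:%S"))  # 2011-11-15 08:35:34
```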

UTC Script

select EXTRACT_FILENAME(LogFilename), LogRow, date, time, cs-method, cs-uri-stem, cs-username, c-ip,
cs(User-Agent), cs-host, sc-status, sc-substatus, sc-bytes, cs-bytes, time-taken,

add(
    add(
        mul(3600,to_int(to_string(to_localtime(to_timestamp(date,time)),'hh'))),
        mul(60,to_int(to_string(to_localtime(to_timestamp(date,time)),'mm')))
    ),
    to_int(to_string(to_localtime(to_timestamp(date,time)),'ss'))
) as secs,

add(
    mul(60,to_int(to_string(to_localtime(to_timestamp(date,time)),'hh'))),
    to_int(to_string(to_localtime(to_timestamp(date,time)),'mm'))
) as minu,

to_int(to_string(to_localtime(to_timestamp(date,time)),'yy')) as yy,
to_int(to_string(to_localtime(to_timestamp(date,time)),'MM')) as mo,
to_int(to_string(to_localtime(to_timestamp(date,time)),'dd')) as dd,
to_int(to_string(to_localtime(to_timestamp(date,time)),'hh')) as hh,
to_int(to_string(to_localtime(to_timestamp(date,time)),'mm')) as mi,
to_int(to_string(to_localtime(to_timestamp(date,time)),'ss')) as ss,
to_lowercase(EXTRACT_PATH(cs-uri-stem)) as fpath,
to_lowercase(EXTRACT_FILENAME(cs-uri-stem)) as fname,
to_lowercase(EXTRACT_EXTENSION(cs-uri-stem)) as fext

from e:\logs\iis_www\W3SVC1045333159\*.log

where sc-status<>401 and date BETWEEN TO_TIMESTAMP(%startdate%, 'yyyy-MM-dd') and TO_TIMESTAMP(%enddate%, 'yyyy-MM-dd')

Local Time Script

select EXTRACT_FILENAME(LogFilename), LogRow, date, time, cs-method, cs-uri-stem, cs-username, c-ip,
cs(User-Agent), cs-host, sc-status, sc-substatus, sc-bytes, cs-bytes, time-taken,

add(
    add(
        mul(3600,to_int(to_string(to_timestamp(date,time),'hh'))),
        mul(60,to_int(to_string(to_timestamp(date,time),'mm')))
    ),
    to_int(to_string(to_timestamp(date,time),'ss'))
) as secs,

add(
    mul(60,to_int(to_string(to_timestamp(date,time),'hh'))),
    to_int(to_string(to_timestamp(date,time),'mm'))
) as minu,

to_int(to_string(to_timestamp(date,time),'yy')) as yy,
to_int(to_string(to_timestamp(date,time),'MM')) as mo,
to_int(to_string(to_timestamp(date,time),'dd')) as dd,
to_int(to_string(to_timestamp(date,time),'hh')) as hh,
to_int(to_string(to_timestamp(date,time),'mm')) as mi,
to_int(to_string(to_timestamp(date,time),'ss')) as ss,
to_lowercase(EXTRACT_PATH(cs-uri-stem)) as fpath,
to_lowercase(EXTRACT_FILENAME(cs-uri-stem)) as fname,
to_lowercase(EXTRACT_EXTENSION(cs-uri-stem)) as fext

from e:\logs\iis_www\W3SVC1045333159\*.log

where sc-status<>401 and date BETWEEN TO_TIMESTAMP(%startdate%, 'yyyy-MM-dd') and TO_TIMESTAMP(%enddate%, 'yyyy-MM-dd')

After choosing the appropriate script and modifying the second last line, save this file into the Log Parser installation folder and call it GETSECONDS.TXT.

For the three readers who *really* want to know, the key part of what this does is to take the timestamp of each log entry and turn it into what second of the day it is and what minute of the day it is. So assuming the timestamp is 8:35am at the 34 second mark, the formula effectively adds together:

  • Seconds of the day: 8 * 3600 (since there are 3600 seconds in an hour) + 35 * 60 (60 seconds in a minute) + 34 seconds = 30934 seconds
  • Minutes of the day: 8 * 60 (60 minutes in an hour) + 35 minutes = 515 minutes
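The same arithmetic, written out as a runnable Python check of the 08:35:34 example above:

```python
# The worked example above: a timestamp of 08:35:34.
hh, mm, ss = 8, 35, 34

secs = hh * 3600 + mm * 60 + ss  # 28800 + 2100 + 34 = second of the day
minu = hh * 60 + mm              # 480 + 35 = minute of the day

print(secs, minu)  # 30934 515
```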

Now that we have our GETSECONDS.TXT script ready, let’s use Log Parser to generate the file that we will use for reporting. Open a command prompt (for later versions of Windows, make sure it is an administrator command prompt) and change directory to the Log Parser installation location.

C:\Program Files (x86)\Log Parser 2.2>

Now decide a date to report on. In my example, the logs go back two years and I only want 15 November 2011. The format for the dates MUST be “yyyy-mm-dd” (for example, 2011-11-15).

Type in the following command (substituting whatever date range interests you):

logparser -i:IISW3C file:GetSeconds.txt?startdate='2011-11-15'+enddate='2011-11-15' -o:csv >e:\temp\LogWithSeconds.csv

  • The -i parameter specifies the type of input file. In this case, the input is IISW3C (the IIS web log format).
  • The ?startdate parameter specifies the start date you want to process.
  • The +enddate parameter specifies the end date you want to process.
  • The -o parameter specifies the type of output file. In this case, the output is CSV format.
  • The >e:\temp\LogWithSeconds.csv redirection saves the CSV output into a file called LogWithSeconds.csv.

So, depending on how many log files were in your logs folder, things may take a while to process. Be patient here… after all, it might be processing years of log files (and now you know why we didn’t do this on a production install!). Also be warned: the resulting LogWithSeconds.csv will be very, very big if you specified a wide date range. Whatever you do, do not open this file with Notepad if it’s large! In the next article I’ll show you additional Log Parser scripts to interrogate this file.
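To give a taste of why the secs column is the payoff: once every request carries a second-of-the-day value, the RPS for any given second is simply a count of the rows sharing that value. A minimal Python sketch over a tiny made-up sample of LogWithSeconds.csv (only the secs column matters here):

```python
import csv
import io
from collections import Counter

# An in-memory stand-in for LogWithSeconds.csv, with made-up rows.
sample = io.StringIO(
    "fname,secs,minu\n"
    "u_ex111115.log,30934,515\n"
    "u_ex111115.log,30934,515\n"
    "u_ex111115.log,30934,515\n"
    "u_ex111115.log,30935,515\n"
)

# RPS for a given second = number of requests sharing that secs value.
hits_per_second = Counter(int(row["secs"]) for row in csv.DictReader(sample))
peak_rps = max(hits_per_second.values())
print(hits_per_second[30934], peak_rps)  # 3 3
```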

Conclusion

Right! If you got this far and you’re normally not a command line kind of person – well done! If you are a developer, thanks for sticking with me. You should have a newly minted file called LogWithSeconds.csv, and you are ready to do some interrogation of it. In the next article, I will outline some more Log Parser scripts that generate some useful information!



