Analyse Logs using MS Log Parser

With iShare it can be useful to extract information from the IIS logs to count visitor numbers to a particular page and so on. Microsoft's Log Parser is an excellent choice for this kind of thing.

MS Log Parser may be installed using the Astun 3rd Party installer or downloaded from the Microsoft web site:

“Log parser is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory.”

Basically, Log Parser allows you to query a set of files as though they were a database table, using SQL queries.
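As an illustration of the "files as a table" idea, here is a minimal Python sketch (not Log Parser itself) that treats whitespace-delimited log lines as rows; the field layout and sample lines are hypothetical:

```python
# Minimal sketch of the "log file as a database table" idea.
# The field layout below is a hypothetical subset of the W3C format.
FIELDS = ["date", "cs-username", "cs-uri-stem", "sc-status"]

SAMPLE_LOG = """\
2023-01-01 alice /index.aspx 200
2023-01-01 bob /about.aspx 404
2023-01-02 alice /index.aspx 200
"""

# Roughly: SELECT cs-uri-stem FROM log WHERE sc-status = 200
rows = [dict(zip(FIELDS, line.split())) for line in SAMPLE_LOG.splitlines()]
hits = [r["cs-uri-stem"] for r in rows if r["sc-status"] == "200"]
print(hits)  # ['/index.aspx', '/index.aspx']
```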

The following are a few examples from Log Parser itself to show something of the variety of things that it can do.

View the Entire List

Use the following to view the entire list of Log Parser's built-in examples:

LogParser -h EXAMPLES

Create a Chart

To create a chart containing the top 20 URLs in the "" web site (assumed to be logging in the W3C log format):

LogParser "SELECT TOP 20 cs-uri-stem, COUNT(*) AS Hits INTO MyChart.gif FROM
 <> GROUP BY cs-uri-stem ORDER BY Hits DESC"
 -chartType:Column3D -groupSize:1024x768

The charting output requires a copy of Microsoft Office to be installed locally.

Get Unique Visitors

Here we have created two queries: the first extracts the information into a temporary file, and the second groups and sorts the output.

User names and date
C:\Program Files (x86)\Log Parser 2.2>logparser -i:W3C -o:W3C "SELECT DISTINCT date, cs-username INTO tmp.txt FROM 'D:\Essex-LogFiles\Web-LogFiles\LogFiles\W3SVC1\*.log' WHERE cs-username <> NULL"
Group and count by date
C:\Program Files (x86)\Log Parser 2.2>logparser -i:W3C -o:W3C "SELECT date, COUNT(cs-username) AS uniquevisitors INTO unique.txt FROM tmp.txt GROUP BY date"
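The same two-step logic can be sketched in Python; the sample rows below are hypothetical stand-ins for parsed log entries:

```python
# Two-step unique-visitor count, mirroring the two queries above.
# Hypothetical (date, username) pairs stand in for the parsed log rows.
rows = [
    ("2023-01-01", "alice"),
    ("2023-01-01", "alice"),   # duplicate visit on the same day
    ("2023-01-01", "bob"),
    ("2023-01-02", "alice"),
    ("2023-01-01", None),      # anonymous request, filtered out
]

# Step 1: SELECT DISTINCT date, cs-username ... WHERE cs-username <> NULL
distinct = {(d, u) for d, u in rows if u is not None}

# Step 2: SELECT date, COUNT(cs-username) ... GROUP BY date
unique_visitors = {}
for d, _ in distinct:
    unique_visitors[d] = unique_visitors.get(d, 0) + 1

print(sorted(unique_visitors.items()))  # [('2023-01-01', 2), ('2023-01-02', 1)]
```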

Print the 10 largest files on the C: drive

LogParser "SELECT TOP 10 * FROM C:\*.* ORDER BY Size DESC" -i:FS

Create an XML Report

To create an XML report file containing logon account names and dates from the Security Event Log messages:

LogParser "SELECT TimeGenerated AS LogonDate, EXTRACT_TOKEN(Strings, 0, '|')
 AS Account INTO Report.xml FROM Security WHERE EventID NOT IN (541;542;543)
 AND EventType = 8 AND EventCategory = 2"

Send Error entries to a SYSLOG server

To send error entries in the IIS log to a SYSLOG server you could use the following:

LogParser "SELECT TO_TIMESTAMP(date,time), CASE sc-status WHEN 500 THEN
 'emerg' ELSE 'err' END AS MySeverity, s-computername AS MyHostname,
 cs-uri-stem INTO @myserver FROM <1> WHERE sc-status >= 400" -o:SYSLOG
 -severity:$MySeverity -hostName:$MyHostname

List fields extracted from a CSV file

 LogParser -h -i:CSV myfile.csv -headerRow:on
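In Python, a similar listing of a CSV file's fields can be done with the standard csv module; the file content inlined here is hypothetical:

```python
# Listing the fields of a CSV file, similar in spirit to
# "LogParser -h -i:CSV myfile.csv -headerRow:on".
import csv
import io

# Hypothetical CSV content standing in for myfile.csv.
data = io.StringIO("date,cs-username,sc-status\n2023-01-01,alice,200\n")
reader = csv.DictReader(data)
print(reader.fieldnames)  # ['date', 'cs-username', 'sc-status']
```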

Practical Examples

Obviously, as we are principally interested in extracting information about website visits, we will probably be using -i:IISW3C to specify the input format (the default log format for IIS 6).

See below for a couple of practical examples; the command-line help (logparser -h) gives more information on the format and further examples.

List Authenticated users

To create a CSV file listing all users logging on via a particular page:

LogParser Syntax:
"c:\program files\log parser 2.2\logparser" -i:IISW3C "SELECT TO_UPPERCASE(cs-username) 
AS User, COUNT(cs-username) AS Visits INTO user.csv 
FROM C:\Windows\System32\LogFiles\W3SVC1\*.log 
WHERE cs-uri-stem = '/isharemaps/getsecurity.aspx' 
GROUP BY TO_UPPERCASE(cs-username) HAVING (Visits > 0)" -o:CSV 


Here's an explanation of the statement used to do the data extraction:

SELECT TO_UPPERCASE(cs-username) AS User, 
COUNT(cs-username) AS Visits 

Sets the columns to extract. TO_UPPERCASE is a Log Parser function that is pretty self-explanatory.

COUNT is a SQL aggregate function and requires a GROUP BY clause (see below).

We use AS to change the name of the output column to something clearer.

INTO user.csv 

Log Parser allows you to specify a filepath as the SQL output…

FROM C:\Windows\System32\LogFiles\W3SVC1\*.log 

…and as the input. This is the path to where the log files are stored. Please see the topic on how to Find IIS Log Files for details.

WHERE cs-uri-stem = '/isharemaps/getsecurity.aspx' 

WHERE filters the rows returned from the input 'table'.

cs-uri-stem is the URL without the host address or querystring (anything after the ?).
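A Python sketch of the same idea, using a hypothetical URL:

```python
# cs-uri-stem is the path portion of the request: no host, no querystring.
from urllib.parse import urlsplit

url = "http://example.com/isharemaps/getsecurity.aspx?mapsource=MyHouse"  # hypothetical
stem = urlsplit(url).path
print(stem)  # /isharemaps/getsecurity.aspx
```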


GROUP BY TO_UPPERCASE(cs-username)

GROUP BY creates the set that COUNT aggregates over.

HAVING (Visits > 0)

HAVING is like WHERE, but applied to the output table.

Visits is an aggregated column, so it has to be used here rather than in the WHERE clause.
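Putting the clauses together, the whole pipeline can be sketched in Python; the parsed rows and usernames below are hypothetical:

```python
# The SELECT / WHERE / GROUP BY / HAVING pipeline from the query above,
# run over hypothetical parsed log rows.
rows = [
    {"cs-username": "Alice", "cs-uri-stem": "/isharemaps/getsecurity.aspx"},
    {"cs-username": "alice", "cs-uri-stem": "/isharemaps/getsecurity.aspx"},
    {"cs-username": "bob",   "cs-uri-stem": "/other.aspx"},
]

# WHERE cs-uri-stem = '/isharemaps/getsecurity.aspx'
matching = [r for r in rows if r["cs-uri-stem"] == "/isharemaps/getsecurity.aspx"]

# GROUP BY TO_UPPERCASE(cs-username), COUNT(cs-username) AS Visits
visits = {}
for r in matching:
    user = r["cs-username"].upper()
    visits[user] = visits.get(user, 0) + 1

# HAVING (Visits > 0) -- a filter on the aggregated column
result = {user: n for user, n in visits.items() if n > 0}
print(result)  # {'ALICE': 2}
```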

List Popular pages

The following query gets all ASPX pages, ranks them in order of the number of requests for each and splits out the page from the page location to give a more nicely formatted result:

LogParser Syntax:
logparser -i:IISW3C "SELECT SUBSTR(cs-uri-stem, 0, LAST_INDEX_OF(cs-uri-stem,'/')) 
AS location, SUBSTR(cs-uri-stem, ADD(LAST_INDEX_OF(cs-uri-stem,'/'),1) , 
STRLEN(cs-uri-stem)) as page, COUNT(*) AS requests 
FROM *.log GROUP BY cs-uri-stem HAVING SUBSTR(cs-uri-stem, LAST_INDEX_OF(cs-uri-stem,'.'), 
STRLEN(cs-uri-stem))  = '.aspx' ORDER BY requests DESC"


SELECT SUBSTR(cs-uri-stem, 0, LAST_INDEX_OF(cs-uri-stem,'/')) 
AS location, SUBSTR(cs-uri-stem, ADD(LAST_INDEX_OF(cs-uri-stem,'/'),1), 
STRLEN(cs-uri-stem)) as page, 
COUNT(*) AS requests

SUBSTR and LAST_INDEX_OF are Log Parser functions, used here to split the URL.
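The same split can be sketched in Python, with a hypothetical cs-uri-stem value:

```python
# Splitting cs-uri-stem into location and page, as the SUBSTR /
# LAST_INDEX_OF expressions above do.
stem = "/isharemaps/mapspage.aspx"  # hypothetical value

i = stem.rfind("/")        # LAST_INDEX_OF(cs-uri-stem, '/')
location = stem[:i]        # SUBSTR(cs-uri-stem, 0, i)
page = stem[i + 1:]        # SUBSTR(cs-uri-stem, ADD(i, 1), STRLEN(cs-uri-stem))
print(location, page)  # /isharemaps mapspage.aspx
```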

COUNT(*) counts the number of occurrences of each 'cs-uri-stem' (thanks to the GROUP BY clause)

FROM *.log 

Reads all .log files in the current directory

GROUP BY cs-uri-stem

Used to define the sets for aggregate functions in the SELECT clause

HAVING SUBSTR(cs-uri-stem, LAST_INDEX_OF(cs-uri-stem,'.'), 
STRLEN(cs-uri-stem)) = '.aspx' 

Filters for only those requests that end in .aspx

ORDER BY requests DESC

Ranks the results

Repeat parsing

All of the above is fine for a one-off query, but what if you want to save queries or run them at regular intervals?

Batch files

As a command line program, perhaps the easiest way to store queries or use them in Scheduled Tasks is simply to save commands like the above into Windows batch files. This also allows you to pass parameters to the query using batch file parameters (%1, %2, etc.).
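As a sketch, a batch file wrapping the earlier authenticated-users query might look like the following; the install path is taken from the example above, and %1 supplies the log directory (both are assumptions about your environment):

```bat
@echo off
rem users.bat -- pass the log directory as the first argument, e.g.
rem   users.bat C:\Windows\System32\LogFiles\W3SVC1
"c:\program files\log parser 2.2\logparser" -i:IISW3C "SELECT TO_UPPERCASE(cs-username) AS User, COUNT(cs-username) AS Visits INTO user.csv FROM %1\*.log WHERE cs-uri-stem = '/isharemaps/getsecurity.aspx' GROUP BY TO_UPPERCASE(cs-username)" -o:CSV
```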

SQL files

Alternatively, it is possible to get Log Parser to use a saved SQL query from a file instead of the command line, e.g.:

logparser file:myquery.sql

This can take input parameters so the following (from the command-line help - logparser -h) is possible:

LogParser file:myQuery.sql?myInput=C:\temp\ex*.log+myOutput=results.csv

In the SQL file (which simply contains a statement like the ones shown above) the parameters are referenced by putting the name between percent symbols; in the above example the file might contain …FROM %myInput%…. The advantages of this approach are that the SQL file lets you break the query into more human-readable blocks, and that the query is no longer limited by the maximum length of the command-line input.
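For example, myQuery.sql might contain something like the following; the choice of fields is an assumption, not from the original:

```sql
-- myQuery.sql: the input and output locations are supplied on the
-- command line as %myInput% and %myOutput%.
SELECT date, COUNT(*) AS Hits
INTO %myOutput%
FROM %myInput%
GROUP BY date
ORDER BY date
```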

Extracting Address Search statistics

LogParser Syntax
LogParser -i:IISW3C -rtp:-1 "SELECT date, COUNT(*)
 into E:\Astun\Tools\LogParser\AddressSearches.txt
 from C:\Windows\System32\LogFiles\W3SVC1\*.log
 where cs-uri-query LIKE '%action=SetAddress%'
 group by date
 order by date"

Extracting Daily Counts of iShare Maps tabs / Mapsources

In the following example each mapsource (My House, My Nearest and My Maps) uses a distinct .map file.

SELECT Count(*), date, EXTRACT_TOKEN(cs-uri-query, 0, '&') as map_file FROM '[LOGFILEPATH]' 
where cs-uri-stem = '/mapserver/ms64' and (map_file like '' or map_file like '' or map_file like '')
Group by map_file, date
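As an aside, EXTRACT_TOKEN(cs-uri-query, 0, '&') simply takes everything before the first '&'; in Python terms, with a hypothetical querystring:

```python
# EXTRACT_TOKEN(cs-uri-query, 0, '&') returns the first '&'-delimited token.
cs_uri_query = "map=myhouse.map&action=SetAddress"  # hypothetical querystring
map_file = cs_uri_query.split("&")[0]
print(map_file)  # map=myhouse.map
```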