Wasn't sure how many of you have been using Log Parser for your IIS logs, but I thought I'd write this up quickly in case you want to import your IIS logs into a database. It works quite well, especially when you're trying to track down a hack attempt, a DNN error, etc.
Download Log Parser and install it on your machine. After installing, add its install folder (e.g. "C:\Program Files (x86)\Log Parser 2.2") to your system PATH so you can run it from a command prompt in whatever folder you're currently in.
Once it's installed and on your PATH, open a command prompt and enter the following:
LogParser "SELECT * INTO %TABLE% FROM \\%webserver%\c$\inetpub\logs\LogFiles\W3SVC1\u_ex120911*.log, \\%webserver%\c$\inetpub\logs\LogFiles\W3SVC1\u_ex120912*.log" -i:W3C -o:SQL -server:%DBSERVER% -database:%DATABASE% -driver:"SQL Server" -username:%SQLUSER% -password:%PASSWORD% -createTable:ON -clearTable:ON
%TABLE% - the table name that will hold your log information.
%webserver% - the name of the web server you want the logs pulled from. In the above example, I'm going directly to the log files folder via the administrative C$ share.
%DBSERVER% - the name or IP address of the SQL Server hosting the logs database.
%DATABASE% - the name of the database you want the logs inserted into.
%SQLUSER% - the SQL user Log Parser uses to insert the data. Obviously, this user needs the proper permissions.
%PASSWORD% - the SQL user's password.
-createTable:ON - automatically creates %TABLE% when Log Parser runs.
-clearTable:ON - truncates any existing data in %TABLE% when Log Parser runs.
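For reference, here's what the command looks like with the placeholders filled in. All of the names below (WEB01, SQL01, WebLogs, IISLogs, loguser, and the password) are made-up example values; substitute your own:

```shell
LogParser "SELECT * INTO IISLogs FROM \\WEB01\c$\inetpub\logs\LogFiles\W3SVC1\u_ex120911*.log" -i:W3C -o:SQL -server:SQL01 -database:WebLogs -driver:"SQL Server" -username:loguser -password:P@ssw0rd -createTable:ON -clearTable:ON
```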
If you look at the above example, you'll notice it imports all the entries from the 11th (u_ex120911*.log) and the 12th (u_ex120912*.log). How long it takes depends on how many entries there are; it took upwards of 10-15 minutes to complete for about a full week of logs. I have Task Scheduler configured to run this every morning to import the prior night's logs.
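If you want to schedule it the same way, something like the PowerShell sketch below works for the nightly run. It builds yesterday's log file name from the date (the u_exYYMMDD naming is the IIS default) and then calls Log Parser; the server, database, table, and credential names are example values, not anything from the command above:

```powershell
# import-logs.ps1 - nightly import sketch; WEB01/SQL01/WebLogs/IISLogs/loguser are examples
$stamp = (Get-Date).AddDays(-1).ToString("yyMMdd")   # yesterday, e.g. 120911
$logs  = "\\WEB01\c`$\inetpub\logs\LogFiles\W3SVC1\u_ex$stamp*.log"
& LogParser "SELECT * INTO IISLogs FROM $logs" -i:W3C -o:SQL `
    -server:SQL01 -database:WebLogs -driver:"SQL Server" `
    -username:loguser -password:P@ssw0rd -createTable:ON -clearTable:ON
```

Point a daily Task Scheduler task at the script and it will pick up the prior night's logs without you having to edit the date in the file name each day.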
Another example I use is the one below. It parses the logs and returns every IP address/URL combination with more than 20 hits in a day. This is good for tracking down illegal crawlers, spiders, etc.
LogParser "SELECT DISTINCT date, cs-uri-stem, c-ip, Count(*) AS Hits INTO %TABLE% FROM \\%webserver%\c$\inetpub\logs\LogFiles\W3SVC1\u_ex120911*.log, \\%webserver%\c$\inetpub\logs\LogFiles\W3SVC1\u_ex120912*.log GROUP BY date, c-ip, cs-uri-stem HAVING Hits>20 ORDER BY Hits Desc" -i:W3C -o:SQL -server:%DBSERVER% -database:%DATABASE% -driver:"SQL Server" -username:%SQLUSER% -password:%PASSWORD% -createTable:ON -clearTable:ON
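Once the data is in SQL Server, you can slice it further with plain T-SQL. A quick sketch, assuming the table was created as IISLogs by the first command; in my experience Log Parser keeps the hyphenated W3C field names as column names, so they need square brackets (the table name and IP address here are just examples):

```sql
-- Drill into everything a suspicious IP requested
SELECT date, time, [cs-uri-stem], [sc-status]
FROM IISLogs
WHERE [c-ip] = '203.0.113.50'
ORDER BY date, time;
```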