
BlueDragon Developer's Journal


Monitoring Your ColdFusion Environment With the Free Log Parser Toolkit

Solve common challenges in ways that no other monitoring feature in CF, or even the dedicated monitoring tools, can match

First note the use of the "group by application" clause, which changes the SQL to group all the records of the same application together. Then note the use in the SELECT clause of "COUNT(*) as NumErrors," which will count how many records there are in each such group (by application), reporting it as a NumErrors column. Finally, note the addition of application to the Select list to show the name of each application. The result of this might be:

Application                 NumErrors
--------------------------- --------
<NULL> 473
cfadmin 9
App1 1
test_admin 60
demoapps 3
ContactManager 1

Of course, if we wanted the list to be sorted either by the application names or the number of errors, we could add an "Order by" to the SQL.
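
Putting those pieces together, a sketch of the full grouped command, with an added ORDER BY, might look like this (assuming the same log path and CSV input format used in the other examples):

logparser "select application, COUNT(*) as NumErrors from
C:\CFusionMX\logs\application.log where severity='Error'
group by application order by NumErrors desc" -i:csv

The "desc" keyword sorts from most to fewest errors; you could instead order by application to sort alphabetically by application name.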

Creating an Output File
The examples to this point have simply output the results to the screen (command line), but you may prefer to write the results to a file instead. You do this using an INTO clause, which is placed at the end of the SELECT (and before the FROM), to name a file, such as:

logparser "select * into test.txt from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv

Notice here that I've switched back to the earlier example of listing all the errors from the application.log file. Note as well that the INTO clause can specify a full or relative path (or a UNC path) for the file name; if none is provided, the file will simply be stored in the current directory where the command is executed.

The file output format will default to what has been shown on screen to this point, which Log Parser calls the "NAT" format (its native, tabular format). But as mentioned at the outset, you can also cause the output (on screen or to a file) to be any of several other formats, such as CSV, TSV, and XML. For instance, you could change the output file to an XML file:

logparser "select * into test.xml from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv -o:xml

Creating an Output File Per Application: Finally Error Files Per App!
A unique feature of the Log Parser tool helps resolve a problem that has long plagued CF administrators. If you have multiple applications and development teams on one server, you (or they) have probably wished you could create a separate error log file for each application. Well, Log Parser makes this a snap, using a feature called "multiplexing."

It's really as simple as using wildcard characters in the INTO clause, which then correspond to the first column listed in the SELECT clause. Continuing the example above, recall that it's listing all the columns for all the messages that reflect an error. We can alter that to name the application column first, and while we're at it we may as well limit it to just that column, the filename, message, and date and time, as in:

logparser "select application,filename,message,date,time into errors_*.txt
from C:\CFusionMX\logs\application.log where severity='Error' " -i:csv -o:csv

Using the data in the log file reflected so far, where there were errors for applications such as "App1" and "demoapps," this would create a file such as errors_App1.txt, with the errors for App1, and errors_demoapps.txt, with the errors for demoapps. You'll notice that I've changed the output format to CSV. For some reason, the default NAT format isn't supported for multiplexing, though TSV, XML, and others are. See the product docs for more information. This is really a very powerful mechanism. Imagine that after creating each app-specific file, you could go further and send each file to the appropriate application owner.

In fact, you can go further still to create a directory for each application instead. The secret is that you may use the * anywhere in the path. Just as the command above will create new files even if they didn't exist, you could have the tool create new directories as well. Consider the following:

logparser "select application,filename,message,date,time into *\errors.txt
from C:\CFusionMX\logs\application.log where severity='Error' " -i:csv -o:csv

Note the use of "*\errors.txt" for the Into clause. This will create a new directory for each application and store the file as errors.txt. (You could certainly repeat the * again if you wanted the application name in the file name too.) Imagine how you could now give the owners of each application access to "their" directory. And again, the path you name for the output directory could use a UNC path to put the output on some shared server.

(As an aside regarding multiplexing, note that the first column listed is used only for determining the file name and doesn't appear in the result. If you really want it in the result, simply list the column twice.)

But How Do I Get Only the Latest Log Entries?
We're about out of space, but I'd like to demonstrate just one more really compelling feature. Regarding the last example, you may be thinking, "this all sounds kind of interesting, but it will stink as the logs grow and grow in size. I wish I could just get the latest log entries." That's a very logical concern. And like the old steak knife commercials on TV, the Log Parser tool makes me want to say, "but wait, there's more!"

The tool has a feature called checkpointing, which tracks the row number in the log file where processing stopped, so that the next run picks up from there. It's really as simple as naming a checkpoint file. Consider the following:

logparser "select application,application,filename,message,date,time into *\errors_*.txt from
C:\CFusionMX\logs\application.log where severity='Error'" -i:csv -o:csv -iCheckpoint:test.ipc

Note the specification of the "-iCheckpoint" argument, naming a file, in this case "test.ipc." Again, the checkpoint file specification can include a path so it can be put anywhere, or it will default to the directory where the logparser command is executed. The feature tracks where log processing has stopped and where subsequent log analysis should start. It can be used whether you're outputting the results to screen or file.

There's one thing to think about regarding use of this feature with the multiplexing (or indeed any log file output) operation: it will overwrite any previously existing file. This operation may make most sense when you're running the operation periodically and immediately sending the logfile somewhere else. That way, the recipient will get a new file whenever there's new content of interest to them. Nifty!

Other Log Parser Features: Too Many to Name
I could go on. Clearly, this is a powerful tool, and we've only scratched the surface. I've provided several resources to help you learn more and see more examples. Consider briefly the following additional features:
- Beyond CF log files, the tool can analyze Web server logs, the IIS metabase, the registry, the file system, Windows event logs, Active Directory, and so on
- Input files can be pointed to not only on local and UNC paths, but also specified as URLs. So, for example, you can use this tool to read remote CSV and XML files (including RSS feeds)
- You can pipe the output of command-line tools to be processed as input to the tool
- You can store the report output in a database
- You can generate charts from the tool
- Besides simple tabular output to the screen, there is also an available Windows datagrid
- You can choose to store the SQL in a file and point to it from the command line, and even pass in parameters to such a file
- You can name more than one file in the FROM (filename,filename), and the tool will automatically connect the files together (a "union" in SQL terms)
- Since you can read in XML files, consider that this lets you perform SQL against the XML, which CFML can't currently do
- The SQL grammar available is quite substantial. I've only used the simplest examples
- It also offers many useful SQL functions, including ones unique to the tool, like PropCount and PropSum
- The -h command-line option also offers help on SQL grammar and shows examples as well
- And this is just some of what it does
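
As an example of the multiple-file ("union") feature in the list above, here's a sketch, assuming a second ColdFusion log such as exception.log exists alongside application.log:

logparser "select * from
C:\CFusionMX\logs\application.log, C:\CFusionMX\logs\exception.log
where severity='Error'" -i:csv

Log Parser reads both files as if they were one, so the WHERE clause applies across the combined rows.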

Also, I can think of additional CF-specific kinds of analysis, of various input sources, that I'd like to elaborate on. For instance, I've only shown working with the application.log file, and only the logs in the \cfusionmx\logs directory. Note that there are many other log files in ColdFusion that could be useful to analyze. (You can even use Log Parser's ability to query the file system to report what log files exist.) Of course, it's often useful to analyze Web server logs (whether IIS, Apache, or the built-in Web server in CFMX) to help identify performance problems.
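
For instance, a sketch of such a file-system query, using Log Parser's FS input format (and assuming the same logs directory used throughout), might be:

logparser "select Name, Size from C:\CFusionMX\logs\*.log" -i:FS

This lists each log file and its size, which can help you decide which logs are worth analyzing further.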

Beyond that, there's data in the event logs (related to running CF services), as well as possibly data in the registry (such as client variables, perhaps existing unexpectedly). And since the tool can read XML files generated and available via a URL, it could be used to monitor and report on the output of both SeeFusion and FusionReactor, which can output XML files. I'll offer a URL at the end to name a place where I plan to elaborate on such opportunities as an extension to this article.

Resources for Learning More
Clearly, there's much more to learn. Perhaps the most important resource to start with is the tool itself, available for download from Microsoft. The tool includes a help file with considerable extra detail. Beyond that, there's also an "unofficial" Web site that includes a knowledge base, forums, and links to articles.

But the best resource of all, to really learn and appreciate the tool, is the book Microsoft Log Parser Toolkit, available from Syngress. Since it's a couple of years old, the book is also available used at Amazon. Even so, it's an awesome resource. You'll learn things you never dreamed were possible. You may enjoy the following articles:


  • "How Log Parser 2.2 Works"
  • "Analyze Web Stats with Log Parser"
  • "What's New in Log Parser 2.2"

Finally, as I alluded to above, I plan to create an area on my Web site to expand on this discussion and elaborate on more CF-specific aspects of using Log Parser. Look for that there; I welcome your feedback.

Editorial Note: Since writing this article, the author has significantly updated the resource he referred to, with links to more about Log Parser.

More Stories By Charlie Arehart

A veteran ColdFusion developer since 1997, Charlie Arehart is a long-time contributor to the community and a recognized Adobe Community Expert. He's a certified Advanced CF Developer and Instructor for CF 4/5/6/7 and served as tech editor of CFDJ until 2003. Now an independent contractor living in Alpharetta, GA, Charlie provides high-level troubleshooting/tuning assistance and training/mentoring for CF teams. He helps run the Online ColdFusion Meetup (an online CF user group), is a contributor to the CF8 WACK books by Ben Forta, and is frequently invited to speak at developer conferences and user groups worldwide.


    Most Recent Comments
    Charlie Arehart 11/14/09 12:33:00 PM EST

    I will also note that since I wrote the article, I did go on to update significantly that resource I referred to, with links to more about Log parser, here.

    Charlie Arehart 03/02/08 09:47:14 PM EST

    You will now find the Log Parser tool available at the MS site.

    John 02/28/08 09:19:21 AM EST

    A similar tool for unix and apache logs is aql[1]. While nowhere near as adaptable, it is quick and simple. Also good on Unix is logwatch[2].


    charlie arehart 01/06/07 11:38:26 AM EST

    Thanks, Stefan. Glad to hear it was valuable for you. I always welcome feedback, good or bad. :-)

    Stefan le Roux 11/13/06 06:07:24 AM EST

    Lovely, I once wrote an application to read log files line by line, but the Log Parser really makes it easy to run most types of queries against these files without first having to write it to a database.