Tools Tips and Tricks #4: RML Utilities

It is time again for your friendly neighborhood (or virtual neighborhood) SQL CSS person to provide you with the next set of tips for the various tools that he uses in his day-to-day work. I probably use RML Utilities, or a database created by it, at least once a day and sometimes even more. Not kidding!

This tool has become synonymous in CSS with the analysis of Profiler Traces collected from customer environments. The majority of you might be familiar with SQL Nexus, which calls RML Utilities in the background to load the collected Profiler Traces into a SQL Server database; the actual loading is done by an executable named ReadTrace.exe. In this post, I will talk about a few conditions under which you should call ReadTrace.exe directly instead of invoking it through SQL Nexus.

[Screenshot: SQL Nexus import options, including the per-SPID .trc file breakdown]

As you can see above, SQL Nexus gives you the option of breaking down the activity of each SPID into individual .trc files, but those files can only be written to the %TEMP%\RML folder on your machine. The previous server that I had didn't have sufficient disk space to hold over 100 GB of data in the TEMP directory. Yes, we do analyze data which runs into a few hundred GBs at times!

That leaves you with two options: change the %TEMP% environment variable on your machine to point to a different drive, or use ReadTrace.exe directly and generate the .trc files in whatever path you need.
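If you go the environment variable route, one quick, session-scoped way to do it is to redirect TEMP and TMP in a command prompt and launch SQL Nexus from that same prompt. A sketch, assuming D:\BigTemp is a folder on a drive with enough free space and that sqlnexus.exe is reachable from the prompt:

rem Redirect the temporary folders for this command prompt session only
set TEMP=D:\BigTemp
set TMP=D:\BigTemp
sqlnexus.exe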

Another thing that cannot be controlled from SQL Nexus is the ability to add SPID, hostname, and application name filters while importing the data.

Some of the ReadTrace defaults that apply to any profiler trace import are shown below:

-o Output files are generated in the current directory
-S Connect to the default SQL Server instance on the machine
-d Load profiler data into the PerfAnalysis database
-E Use Windows authentication for the connection
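This means that if all of those defaults suit you, a minimal invocation only needs the input trace file (the path below is just an illustration):

readtrace -I"D:\RML\RMLUtil_demo.trc"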

Here are a few scenarios where I use ReadTrace.exe directly.

Analyzing a deadlock scenario where I know the Session IDs involved

Sometimes the deadlock graph is not sufficient to figure out why a lock is still held by a session and why that session acquired the lock on the object in the first place. At that point, you need to track down the sequence of statements executed by the session. In this case, I use the following command to import the data:

readtrace -S. -E -ddbRMLUtilDemo
-I"D:\RML\RMLUtil_demo.trc" -A"!SQLCMD" -s53 -s55 -MF -o"D:\RML\output" -f

The above command imports the trace files into a database named dbRMLUtilDemo on the default instance on the machine, with SPID filters of 53 and 55, and excludes events generated with the application name SQLCMD. Additionally, individual .trc files are generated containing the activity of SPIDs 53 and 55.
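Once the import completes, the statement sequence for those sessions can be pulled out of the imported database. Here is a minimal sketch, assuming the standard ReadTrace.tblBatches and ReadTrace.tblUniqueBatches tables created by the import (column names can vary slightly between RML versions):

-- List the batches executed by the two sessions in time order
USE dbRMLUtilDemo;
GO
SELECT b.Session, b.BatchSeq, b.StartTime, b.EndTime, ub.NormText
FROM ReadTrace.tblBatches AS b
INNER JOIN ReadTrace.tblUniqueBatches AS ub
    ON ub.HashID = b.HashID
WHERE b.Session IN (53, 55)
ORDER BY b.StartTime, b.BatchSeq;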

Generating .RML files to be used by OStress

Use the -o parameter to change the output path, and use the -s, -A and -H filters to apply the necessary SPID, application name, and host name filters respectively. Here is an example that applies filters for SPIDs 51, 52, and 53 and excludes events generated by the application name PSSDIAG or the hostname MyLaptop:

readtrace -S. -E -ddbRMLUtilDemo
-I"D:\RML\RMLUtil_demo.trc" -A"!SQLCMD" –s51 –s52 -s53 -o"D:\RML\output" –H"!MyLaptop" –A"!PSSDIAG"

The command line window and the ReadTrace.log file available in the output directory will confirm whether the filters were applied correctly. Below is a screenshot of the command line window showing that the filters were applied correctly:

[Screenshot: ReadTrace command line output confirming the applied filters]

Handling MARS sessions

ReadTrace has limited ability to process sessions that use MARS (Multiple Active Result Sets). You have to use -f and -T35 to import data from trace files which contain MARS sessions. ReadTrace uses the formula ((Session ID * 10) + BatchID) so that each MARS batch appears as a separate session. For example, if you see Session IDs 1620, 1621 and so on in the ReadTrace/SQL Nexus database but cannot find such sessions in the DMV outputs captured during that time, then the activity corresponds to Session ID 162, which was using MARS.
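A quick way to make sense of such a value is to reverse the formula. A small sketch:

-- Decode the composite id that ReadTrace stores for MARS activity
DECLARE @composite int = 1621;               -- value seen in the ReadTrace database
SELECT @composite / 10 AS OriginalSessionID, -- 162
       @composite % 10 AS BatchID;           -- 1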

Note: Remember to always use a server-side trace whenever possible so that the profiler trace output is captured to a local disk.

In case you want more information on the above, the PDF help files for RML Utilities are located by default in the <system drive>:\Program Files\Microsoft Corporation\RMLUtils\Help folder on a machine where RML Utilities is installed.

RML Utilities Download
http://blogs.msdn.com/b/psssql/archive/2008/11/12/cumulative-update-1-to-the-rml-utilities-for-microsoft-sql-server-released.aspx

Keep following the tag "Tools Tips and Tricks" for more posts on the various tools that we use. That's it for today!


Tools Tips and Tricks #3: Custom Rowsets using SQL Nexus

This post is part three of the Tools Tips and Tricks series that I started last week. In my T-SQL Tuesday post for this month, I already explained my inclination to import collected data into a SQL Server database. I shall touch upon this yet again, this time through the use of SQL Nexus. I am going to use a small script to collect data about the user requests which are executing queries against a particular SQL Server instance. The script that I used to collect the data is shown below:


set nocount on
go

-- Sample the active requests every 5 seconds until the session is stopped
while (1=1)
begin

-- Rowset identifier that the SQL Nexus custom rowset importer will look for
print '-- RequestsRowset'

select GETDATE() as runtime, a.session_id as session_id,
a.start_time as start_time,
a.[status] as [status],
a.command as command,
a.database_id as database_id,
b.objectid
from sys.dm_exec_requests a
cross apply sys.dm_exec_sql_text(sql_handle) b
where session_id <> @@spid

-- Flush the output buffer so the rows are written out immediately
raiserror ('', 0, 1) with nowait

waitfor delay '00:00:05'
end

I have stored the output captured by the above query in a file called CustomRowset.OUT. Using the Edit Custom Rowset option in the SQL Nexus UI (available in the third expandable tab, named Data, on the left hand side), I can pull up a dialog where I specify the table name into which the data needs to be imported (tbl_RequestsExample in this case) and the identifier for the data which needs to be imported (-- RequestsRowset in this case). I was executing a WAITFOR DELAY command from another session while the above script was capturing data.

I then import the data into a SQL Nexus database using the Import option. Once this is done, I can query the database and look into the results which were imported; a sample query is shown below.

You can extend this functionality to any degree you want and even combine multiple T-SQL commands to capture different result sets in the same loop. You just need to make sure that the rowset identifier for each query result set is unique.
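As an example, here is a minimal query over the imported table, using the table name chosen above; the columns are the ones selected by the collection script:

-- Trend the captured samples in time order
SELECT runtime, session_id, [status], command, database_id
FROM tbl_RequestsExample
ORDER BY runtime, session_id;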

Additionally, if you attempt to capture output which has columns with a data type length greater than 8000, then the import will fail with the following error in the SQL Nexus log file:

SQLNexus Information: 0 : RowsetImportEngine Error: An unexpected error has occurred:

System.Data.SqlClient.SqlException: The size (8192) given to the column 'query_text' exceeds the maximum allowed for any data type (8000).

SQL Nexus 3.0 doesn't give you the option to add your own column data types using the UI; the custom rowset form treats all columns as varchar. If you want to change this behavior, then modify C:\Users\<user name>\AppData\Roaming\sqlnexus\TextRowsetsCustom.xml directly to add or modify the data types that you want.

Where is this helpful?
Let's say I decided to collect the output of a customized T-SQL script for an extended period of time. Instead of scouring the .txt or .out file manually using a text editor, I can import the data into a table and then run queries on it to save myself a hair-raising experience and valuable time!

How do I make sure that the data is imported correctly?
1. If the result sets collected contain data larger than varchar(8000), make sure to modify TextRowsetsCustom.xml before you import the data.
2. Give each rowset that you collect a unique identifier. You don't want the importer to mix-n-match the data you are importing.
3. Add a runtime column using GETDATE() or a variable for scripts capturing data in a loop, to ensure that you can track the trend easily without having to second-guess.
4. All columns that are collected in the result set(s) have to be named.
5. Avoid CR/LFs in the result set, i.e. don't use CHAR(13) in your T-SQL script while capturing the data, as this seriously confuses the importer, which treats CR/LFs as an end-of-row indicator.
6. Either capture the data directly into a file by running the data collection script from Management Studio, or use the sqlcmd -W parameter if you are capturing the data using sqlcmd; -W ensures that trailing spaces are removed from the result sets (see the example below).
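For point 6, a sample sqlcmd invocation would look like the following; the server name and file names are illustrative, with CustomRowset.sql standing in for a file containing the collection script above:

sqlcmd -S MyServer -E -i CustomRowset.sql -o CustomRowset.OUT -W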

That is all I have for today. Happy customizing and importing!

Two starts, one recompile and a confusing place

I used to be a regular viewer of the TV series "Two Guys, A Girl and A Pizza Place" co-starring Ryan Reynolds. This blog post's title is a play on that name. In the past week, I have heard about this phenomenon of "one ghost Stmt:Starting event" more than thrice, which is what triggered this blog post.

When you analyze such a trace file using RML Utilities, you will find an entry in the tblStatements table showing a statement for a particular Batch Sequence (column name: BatchSeq) which has a Start Time but no End Time. Additionally, the AttnSeq column for that row will have a NULL value, indicating that the statement neither timed out nor was cancelled due to a server- or client-side error.
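If you want to hunt down such rows yourself, here is a sketch, again assuming the standard ReadTrace schema (column names can vary between RML versions):

-- Find "ghost" statement rows: started, never completed, no attention event
SELECT s.Session, s.BatchSeq, s.StartTime, s.EndTime, s.AttnSeq
FROM ReadTrace.tblStatements AS s
WHERE s.EndTime IS NULL
  AND s.AttnSeq IS NULL;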

So what is this ghost entry all about? If you manually dig through the profiler trace, you will find that the statement in question has a Recompile event associated with it. This is by design. When a stored procedure statement encounters a Recompile event, it has an SP:StmtStarting event associated with it but no corresponding SP:StmtCompleted event. This can lead to confusion during trace file analysis.

Here is an example illustrating this. I have a stored procedure whose definition is as follows:

CREATE PROC usp_recompile
AS
SELECT * FROM tbl1 OPTION (RECOMPILE)

The RECOMPILE hint is added to forcibly induce a recompile of the stored procedure when it is executed. If you look at the profiler events captured below during the stored procedure's execution, you will see that there is an SP:StmtStarting event followed by an SP:Recompile and an SP:StmtRecompile event. Then there is another SP:StmtStarting event, indicating that the recompile operation is complete and the execution of the SP statement has started.
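If you want to reproduce this yourself, here is a minimal sketch; tbl1 is just a stand-in table, and the procedure should be executed while the profiler trace is running:

-- tbl1 only needs to exist so the procedure has something to select from
CREATE TABLE tbl1 (col1 int);
GO
EXEC usp_recompile;
GO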

[Screenshot: profiler output showing SP:StmtStarting, SP:Recompile and SP:StmtRecompile, followed by a second SP:StmtStarting]

Hope this helps clear up the confusion around this topic.

How to use Reporter to analyze SQL Profiler traces

I had a question on the SQL Nexus discussion forum regarding the following error while using Reporter:

When I run Reporter, with the default settings (i.e. DB=PerfAnalysis), I get an error :
Server: (local) Error: 4060 Severity: 11 State: 1 Line: 65536 Source: .Net SqlClient Data Provider
Cannot open database "PerfAnalysis" requested by the login. The login failed.

When you launch Reporter, which is part of the RML Utilities installation, PerfAnalysis is the default database that Reporter will attempt to connect to. See the illustration below:

[Screenshot: Reporter connection dialog with PerfAnalysis as the baseline database]

You will need to change the baseline database to the database into which you imported the profiler traces using SQL Nexus. In my case, I had imported the captured SQL Profiler traces into a database called nexus_blocking.

I shall cover a walkthrough on importing data into SQL Nexus sometime later.

Zeollar presentation on SQL Nexus



Here is the recording of the webcast that I presented today! Feel free to contact me through this blog or to post on the SQL Nexus discussion forum.

In case you are not able to view the presentation here, you can view it directly on the Zeollar site.