Tools Tips and Tricks #7: PsExec as parent and ProcMon as child

A new week begins and the Tools, Tips and Tricks series continues. Today I shall talk about using PsExec (part of the PsTools package) to collect data from a remote box. PsExec lets you execute processes on other systems without having to manually install client software.

The scenario that I will discuss today is collection of a Process Monitor trace on a server without logging onto the machine. This is quite useful if you need to collect a Process Monitor trace from one of the passive nodes while running SQL Server 2005 setup.

The .cmd file that I am using in this case has the following commands:


REM location of the Process Monitor executable on the remote server
set PM=D:\ProcMon\procmon.exe

REM start the trace minimized, suppress the EULA prompt and write events to a local backing file
start %PM% /quiet /minimized /AcceptEULA /backingfile D:\ProcMon\notepad.pml

REM wait until the Process Monitor instance is ready
%PM% /waitforidle

REM crude wait of roughly ten seconds while the trace captures activity
ping 127.0.0.1 -n 10 -w 100

REM stop the trace and close Process Monitor
%PM% /terminate

I am running the .cmd file to spawn Process Monitor on a different server and capture a trace using a local file as the backing file. The reason for running Process Monitor from the command line is that, while setup is running, it is recommended that you do not log into the passive nodes using Terminal Services or in console mode.

The command that I used to execute the commands present in the .cmd file is

PsExec \\<server name> -c RemoteProcMon.cmd

If you do not prefer the ping method to simulate waits, you can use the SLEEP utility available with the Windows Server 2003 Resource Kit, which takes the number of milliseconds to wait as a parameter. If you need to set filters for Process Monitor, run the tool in GUI mode, set the filters and save them, then exit Process Monitor and run the above command from the command line. Alternatively, you can create a configuration file and use the /LoadConfig command-line switch for Process Monitor.
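For instance, if you saved your filters to a configuration file named ProcMonConfig.pmc (an illustrative name and path), the start line in the .cmd file above could be changed as follows, with everything else staying the same:

start %PM% /quiet /minimized /AcceptEULA /LoadConfig D:\ProcMon\ProcMonConfig.pmc /backingfile D:\ProcMon\notepad.pml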

Again the possibilities are endless and you can extend this by using the command line options for PsExec and Process Monitor.

Use the options mentioned in Tools Tips and Tricks #1: Process Monitor so that the filters are correctly set and the page file doesn't get overwhelmed if you are capturing data for long periods or on servers where there is a high amount of activity.

Tools Tips and Tricks #6: Custom Reports in SQL Nexus

Here is another post in the Tools Tips and Tricks series, this time about a SQL Nexus feature which is not widely used. Yes, SQL Nexus is one of my favorite tools (provided it is used correctly). Automation of analysis helps get the picture quickly, but the correlation of the data still has to be drawn by the person performing the analysis.

Today I shall show you how you can use the custom reports feature by creating simple RDL files in Visual Studio and getting SQL Nexus to use them.

As you can see in the screenshot below, I have a few reports which are not part of the standard installation of SQL Nexus available on the CodePlex site.

I had built this report ages ago, when RML Utilities didn't have the option of drilling through the Interesting Events and fetching the sub-class reason for an Interesting Event.

[Screenshot: custom reports listed in the SQL Nexus reports pane]

The report structure itself is quite simple. It uses the shared data source "sqlnexus.rds" so that the database context can switch when you change the database name using the drop-down menu in the SQL Nexus tool. I then created a report action to jump to a child report called "Event Drilldown.rdlc". This gives you a view as shown in the screenshot below, which shows on which database the AutoStats event was fired and what the reason for AutoStats kicking in was.

[Screenshot: Event Drilldown report showing the database and sub-class reason for the AutoStats event]

Once you have the custom reports built, you can drop them in the following folder: %appdata%\SQLNexus\Reports. SQL Nexus will automatically pick up these reports when it is launched.
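For example, assuming the two report files are sitting in a local folder (the source folder and the first file name below are illustrative), copying them into place is as simple as:

copy "D:\Reports\Interesting Events.rdlc" "%appdata%\SQLNexus\Reports\"
copy "D:\Reports\Event Drilldown.rdlc" "%appdata%\SQLNexus\Reports\"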

The two files that are used in the above example can be downloaded from here. You can dissect the structure of the reports to understand what queries were used to fetch the data for the reports shown above.

The ReadTrace_Main report for RML Utilities currently provides this drill-down functionality through the "Interesting Events" link on the main dashboard under "Additional Reports".

Have a good weekend and stay tuned to this series for more tips and tricks next week!

Download link for RDL files: Client Report Definition (.rdlc) Files

Tools Tips and Tricks #5: SQLDIAG and RANU

During the course of the Tools Tips and Tricks series, I had blogged about how to connect to a RANU instance and how to find out the SQL Express user instances active on the machine. Today, I shall discuss how to use SQLDIAG to collect data from a SQL Express user instance.

Since a SQL Express User Instance involves the dynamic spawning of a new instance and connections are allowed locally only using named pipes, normal instance discovery doesn’t work. If you need to collect diagnostic data using SQLDIAG for such an instance, you need to take the following steps:

1. Start the user instance by instantiating a connection to the user instance from the application that uses it.
2. Use any method mentioned in my previous post to get the instance GUID and the named pipe string.
3. Construct a named pipe alias using SQL Server Configuration Manager (see screenshot below). Use SQL Native Client Configuration 32-bit or 64-bit as appropriate for your version of SQL Express.

[Screenshot: creating the named pipe alias in SQL Server Configuration Manager]

The Alias Name is in the form <server name>\<user instance GUID>. The pipe name is the one you obtained for the user instance that you are interested in monitoring. Leave the Server name blank.
4. Once the alias is created, test whether you can connect to it locally from the machine that has SQL Express installed on it using SQLCMD -S .\BA78E627-AD14-4 -E.
5. Now, in the SQLDIAG.XML configuration file that you are using, specify the server name and instance name as follows:

<Machine name=".">
….
<Instance name="BA78E627-AD14-4" windowsauth="true" ssver="10.5" user="">

After this, you can run SQLDIAG from the machine which has SQL Express installed on it, since user instances don't accept remote connections.
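A minimal sketch of that invocation, assuming the configuration file is named SQLDIAG.XML and sits in the current directory (the output path below is illustrative), would be:

sqldiag /I SQLDIAG.XML /O D:\SQLDIAG\Output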

Check back tomorrow for another new trick for tools that we use!

Tools Tips and Tricks #4: RML Utilities

It is time again for your friendly neighborhood (or virtual neighborhood) SQL CSS person to provide you with the next set of tips for the various tools that he uses in his day-to-day work. I probably use this tool, RML Utilities, or the database created by it at least once a day and sometimes even more. Not kidding!!

This tool has become synonymous with the analysis of profiler traces collected from customer environments in CSS. The majority of you might be familiar with the tool SQL Nexus, which calls RML Utilities in the background to load the collected profiler traces into a SQL Server database. This is done by calling an executable, ReadTrace.exe. In this post, I will talk about a few conditions under which you should call ReadTrace.exe directly instead of invoking it through SQL Nexus.

[Screenshot: SQL Nexus options for the ReadTrace import]

As you can see above, SQL Nexus provides you with the option of breaking down the activity of each SPID into individual .trc files, but they can only be directed to the %TEMP%\RML folder on your machine. The previous server that I had didn't have sufficient disk space to hold over 100GB of data in the TEMP directory. Yes, we do analyze data which runs into a few hundred GBs at times!

Now you have two options: change the %TEMP% environment path on your machine to a different drive, or use ReadTrace.exe to generate the .trc files in the required path.

Another option that cannot be controlled from SQL Nexus is the ability to add SPID, Hostname and Application filters while importing the data.

Some of the ReadTrace defaults, which apply to any profiler trace import, are shown below:

-o Generate output files in the current directory
-S Connect to the default SQL Server instance on the machine
-d Load profiler data into the PerfAnalysis database
-E Use Windows authentication for the connection
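Relying on those defaults, the simplest possible import only needs the input trace file (the path below is illustrative):

readtrace -I"D:\RML\MyTrace.trc"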

Here are a few scenarios where I use ReadTrace.exe directly….

Analyzing a deadlock scenario where I know the Session IDs involved

Sometimes the deadlock graph is not sufficient to figure out why a lock is still held by that session and why that session acquired the lock on the object in the first place. At that point, you need to track down the sequence of statements executed by the session. In this case, I will use the following command to import the data.

readtrace -S. -E -ddbRMLUtilDemo
-I"D:\RML\RMLUtil_demo.trc" -A"!SQLCMD" -s53 -s55 -MF -o"D:\RML\output" -f

The above command will import the trace files into a database named dbRMLUtilDemo on the default instance on the machine, with SPID filters of 53 and 55, and exclude events generated with the application name SQLCMD. Additionally, individual .trc files will be generated containing the activity of SPIDs 53 and 55.

Generating .RML files to be used by OStress

Use the -o parameter to change the output path and use the -s, -A and -H filters to apply the necessary SPID, application name and host name filters respectively. Below is an example that applies filters for SPIDs 51, 52 and 53 and excludes events generated by the application names SQLCMD and PSSDIAG or by the host name MyLaptop.

readtrace -S. -E -ddbRMLUtilDemo
-I"D:\RML\RMLUtil_demo.trc" -A"!SQLCMD" -s51 -s52 -s53 -o"D:\RML\output" -H"!MyLaptop" -A"!PSSDIAG"

The command line window and the ReadTrace.log file available in the output directory will confirm whether the filters were applied correctly or not. Below is a screenshot of the command line window showing that the filters were applied correctly:

[Screenshot: ReadTrace command-line output confirming that the filters were applied]

Handling MARS sessions

ReadTrace has limited processing ability for handling sessions using MARS (Multiple Active Result Sets). You have to use -f and -T35 to import data from trace files which contain MARS sessions. ReadTrace uses the formula ((Session Id * 10) + BatchId) so that each MARS session appears as a separate session. E.g. if you see Session IDs 1620, 1621 and so on in the ReadTrace/SQL Nexus database but cannot find such sessions in the DMV outputs captured during that time, then the activity corresponds to Session ID 162, which was using MARS.
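As a sketch, reusing the paths and database name from the earlier example, a MARS-aware import could look like this:

readtrace -S. -E -ddbRMLUtilDemo -I"D:\RML\RMLUtil_demo.trc" -o"D:\RML\output" -f -T35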

Note: Remember to always use a server-side trace whenever possible and capture the profiler trace output to a local disk.

The PDF help files for RML Utilities are located in the <system drive>:\Program Files\Microsoft Corporation\RMLUtils\Help folder by default on a machine where RML Utilities is installed, in case you want more information on the above.

RML Utilities Download
http://blogs.msdn.com/b/psssql/archive/2008/11/12/cumulative-update-1-to-the-rml-utilities-for-microsoft-sql-server-released.aspx

Keep following the tag “Tools tips and Tricks” for more posts on the various tools that we use. That’s it for today!

Tools Tips and Tricks #3: Custom Rowsets using SQL Nexus

This post is part three of the Tools Tips and Tricks series that I started last week. In my T-SQL Tuesday post for this month, I already explained how I always have the inclination to import collected data into a SQL Server database. I shall touch upon this yet again, but this time through the use of SQL Nexus. I am going to use a small script to collect data for the user requests which are executing queries against a particular SQL Server instance. The script that I used to collect the data is shown below:


set nocount on
go

while (1=1)
begin

    -- rowset identifier that the SQL Nexus custom rowset importer looks for
    print '-- RequestsRowset'

    select GETDATE() as runtime, a.session_id as session_id,
        a.start_time as start_time,
        a.[status] as [status],
        a.command as command,
        a.database_id as database_id,
        b.objectid
    from sys.dm_exec_requests a
    cross apply sys.dm_exec_sql_text(sql_handle) b
    where session_id <> @@spid

    -- flush the output buffer so the rows are written out immediately
    raiserror ('', 0, 1) with nowait

    -- capture a snapshot every 5 seconds
    waitfor delay '00:00:05'
end

I have stored the output captured by the above query in a file called CustomRowset.OUT. Using the Edit Custom Rowset option in the SQL Nexus UI (available in the third expandable tab, named Data, on the left-hand side), I can pull up a UI where I can specify the table name into which the data needs to be imported (tbl_RequestsExample in this case) and the identifier for the data which needs to be imported (-- RequestsRowset in this case). I was executing a WAITFOR DELAY command from another session while the above script was capturing data. I then import the data into a SQL Nexus database using the Import option. Once this is done, I can query the database and look into the results which were imported (see screenshot below).

You can extend this functionality to any degree you want and even combine multiple T-SQL commands to capture different result sets in the same loop. You just need to make sure that the rowset identifier for each query result set is unique.
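As an illustration, once the import has completed, a query like the one below (using the tbl_RequestsExample table and the column names from the collection script above) can be run against the SQL Nexus database to look at the captured activity over time:

select runtime, session_id, start_time, [status], command, database_id, objectid
from tbl_RequestsExample
order by runtime, session_id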

Additionally, if you attempt to capture outputs which have columns with data type length greater than 8000, then the import will fail with the following error in the SQL Nexus log file:

SQLNexus Information: 0 : RowsetImportEngine Error: An unexpected error has occurred:

System.Data.SqlClient.SqlException: The size (8192) given to the column ‘query_text’ exceeds the maximum allowed for any data type (8000).

SQL Nexus 3.0 doesn’t give you the option to add your own column data types using the UI.  Using the form (shown on the left) will treat all columns as varchar. If you want to change this behavior, then modify C:\Users\<user name>\AppData\Roaming\sqlnexus\TextRowsetsCustom.xml directly to add or modify the data types that you want.

Where is this helpful?
Let's say I decided to collect the output of a customized T-SQL script for an extended period of time. Instead of scouring the .txt or .out file manually using a text editor, I can import the data into a table and then run queries on it, saving myself a hair-raising experience and valuable time!

How do I make sure that the data is imported correctly?
1. If you have data which is larger than varchar(8000) in the result sets collected, then make sure to modify the TextRowsetsCustom.xml before you import the data.
2. Give each rowset that you collect a unique identifier. You don't want the importer to mix-n-match the data you are importing.
3. Add a runtime column using GETDATE() or a variable for scripts capturing data in a loop to ensure that you can track the trend easily without having to second-guess.
4. All columns that are collected in the result set(s) have to be named.
5. Avoid CR/LFs in the result set, i.e. don't use CHAR(13) in your T-SQL script while capturing the data, as the importer treats CR/LFs as end-of-row indicators and gets seriously confused.
6. Either capture the data directly into a file by running the data collection script from Management Studio, or use the sqlcmd -W parameter if you are capturing the data using sqlcmd; -W ensures that trailing spaces are removed from the result sets (see the example after this list).
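As a sketch of point 6, assuming the collection script above is saved as CollectRequests.sql (the file name and server name below are illustrative), the sqlcmd invocation could look like this; stop it with Ctrl+C once you have captured enough data:

sqlcmd -S MyServer -E -W -i CollectRequests.sql -o CustomRowset.OUT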

That is all I have for today. Happy customizing and importing!