Running SQL Nexus using Command Line Parameters

SQL Nexus is a tool that helps you identify the root cause of SQL Server performance issues. It loads and analyzes performance data collected by SQLDiag and PSSDiag, and it can dramatically reduce the amount of time you spend manually analyzing data. One lesser-known fact is that SQL Nexus also lets you run its reports using command line arguments and even import diagnostic data automatically. The command line help reference is shown below. You can find more details on GitHub.

sqlnexus.exe [ [ /S [ /E | /Uuser /Ppwd ] [/D"database"] ] | [/C"connstr"] ] [/Iinputpath] [/Ooutputpath] [/Rreport] [/X] [/Q] [/N]

/S"server"      Specifies a SQL Server name to connect to.
/D"database"    Database to connect to
/E              Log in to SQL using Windows/integrated security
/Uuser          Specifies a SQL (non-Windows) login name
/Ppassword      Specifies the password for a SQL (non-Windows) login
/C"connstring"  Specifies the SqlClient ConnectionString to use (can be used instead of /S /E).
/I"path"        Import SQL diagnostic data from this path
/R"report"      Specifies a report file name to run.
/O"path"        Specifies an export path for reports executed via /R. Also this is where the sqlnexus.000.log would get created
/X              Exit after importing (/I) or exporting (/O) the specified report (/R)
/Vparam=value   Specify the value of a form parameter
/Q              Quiet Mode - minimize windows in console mode
/N              Create a new SQLNexus database before importing (drop existing).

Automating Report Execution

The first scenario I want to outline is automating report execution. This is especially useful when you need to run the reports repeatedly, for example after performance test runs or during some other iterative activity that requires analyzing successive sets of diagnostic data.

This can be achieved using the command line parameters shown below:

sqlnexus.exe /S"." /X /D"sqlnexus" /R"\Summary.rdl" /O""

As you can see from the screenshot below, a standard report, "Bottleneck Analysis_C.xls", was exported along with a non-standard report that I created, Summary.xls, in the same folder. This also gives you the opportunity to create your own performance reports and schedule them using the SQL Nexus executable.

Report Output

After you have the exported reports, you could also write additional automation to email the reports as required.
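If Database Mail is configured on the SQL Server instance, even the emailing step can stay inside SQL Server. The snippet below is a minimal sketch of mine: the mail profile, recipient, and attachment path are placeholders, not values from this post.

exec msdb.dbo.sp_send_dbmail
@profile_name = 'DBA Mail Profile', -- placeholder profile name
@recipients = 'dba-team@contoso.com', -- placeholder recipient
@subject = 'SQL Nexus - exported reports',
@body = 'Attached: report exported via sqlnexus.exe /R /O.',
@file_attachments = 'C:\temp\reports\Summary.xls' -- a file under the path passed to /O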

Automating Data Import

SQL Nexus also allows you to perform data imports automatically using the command line parameters. The command line shown below imports data from the specified folder into a SQL Server database called sqlnexustest on the default SQL Server instance installed on the machine.

sqlnexus.exe /S"." /X /D"sqlnexustest" /I"C:\temp\output"

If the database does not exist on the SQL Server instance, it will be created. The data import uses the SQL Nexus settings that were saved the last time the UI was used. So if you disabled the profiler trace import from the UI and saved your settings, the command line execution will not import the profiler traces.
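If you want to verify that an unattended import actually loaded data, a quick query such as the one below can help. This is a minimal sketch of mine; it assumes the collection included the Perf Stats output so that the tbl_Requests table (described in the next post below) exists in the target database.

use sqlnexustest
go
select min(runtime) as first_snapshot,
max(runtime) as last_snapshot,
count(*) as request_rows
from tbl_Requests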


T-SQL Tuesday #49: Why was SQL waiting?

A lot of my customer engagements involve identifying waits experienced by SQL Server queries in a production environment. Since this month’s topic is WAITS… I see this as a good opportunity to talk about how I go about analyzing the collected data to identify blocking chains.

Before I start rambling off, a note of thanks to my friend Robert Davis [B|T] for hosting this month’s T-SQL Tuesday! He couldn’t have chosen a better topic! 🙂

In this post, I will be talking about how to slice and dice the diagnostic data collected by SQL Perf Stats script that is imported into a SQL Nexus database.

The tables of interest in the SQL Nexus database are:

1. tbl_Requests – This table holds the combined output of the snapshots collected by the Perf Stats script from the sys.dm_exec_requests, sys.dm_exec_sessions, sys.dm_exec_connections and sys.dm_os_tasks DMVs.

2. tbl_NotableActiveQueries – This table holds the output of the sys.dm_exec_sql_text and sys.dm_exec_query_stats snapshots collected by the Perf Stats script.

The file that needs to be present when SQL Nexus imports the collected diagnostic data is <server name>__SQL_2008_Perf_Stats_Startup.OUT. The plethora of information available in this file can be mind-boggling at times, so pushing all of it into a database makes a lot of sense.
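To get oriented after an import, a quick look at how the two tables line up can help. This is a minimal sketch of mine; the join columns are the same runtime and session_id keys used by the blocking script later in this post.

select r.runtime,
count(*) as request_rows,
count(q.session_id) as rows_with_statement_text
from tbl_Requests r
left outer join tbl_NotableActiveQueries q
on q.runtime = r.runtime and q.session_id = r.session_id
group by r.runtime
order by r.runtime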

The first order of the day is to determine whether any blocking was actually experienced during the data collection. The query below returns the collected snapshots in which blocking was detected.

select runtime, count(*) as blocked_processes
from tbl_requests
where blocking_session_id > 0 and session_id <> blocking_session_id
group by runtime
having count(*) > 1
order by runtime

The next order of the day is to find out which sessions were blocked. The SQL Nexus reports do a good job of drilling down into each blocking session observed, but if you want a collated view of all the blocking along with the list of head blockers, the following T-SQL script can prove useful, especially when the .OUT file being imported is a few hundred MBs in size! It helps you slice and dice the MBs or GBs of data imported into the SQL Nexus database tables and get a consolidated view of the blocking chains.

/*
This Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of the Sample Code.

Author: Amit Banerjee

Modified: December 10, 2013

Description:

This T-SQL script extracts information about the blocking chains from the SQL Nexus database

*/

set nocount on
print '================== Blocking Analysis ====================='
print ''
print 'Runtime Information'
select min(runtime) as [Start Time], max(runtime) as [End Time] from tbl_Requests

if ((select count(*) from tbl_requests where blocking_session_id <> 0) > 0)
begin
print 'Blocking found'
print ''
print '============= Blocking Chains (Runtimes considered where blocked process count > 1) ==================='
-- Show a summary of the # of blocked sessions for every snapshot collected by the PerfStats script
select runtime,count (*) as blocked_processes
from tbl_requests
where blocking_session_id > 0 and session_id <> blocking_session_id
group by runtime
having count(*) > 1 -- The threshold can be changed as appropriate
order by runtime

print ''
print ''
print '================= Head Blocker Information ==================='
if (object_id('tmp_runtimes') is not null)
begin
drop table tmp_runtimes
end

-- Create a table to store the runtimes
CREATE TABLE tmp_runtimes(
[row_num] [bigint] IDENTITY(1,1) NOT NULL,
[runtime] datetime
) ON [PRIMARY]

-- Create a list of the distinct runtimes
insert into tmp_runtimes
select distinct (runtime)
from tbl_requests
where blocking_session_id <> 0
order by runtime

-- Start a while loop to fetch the data for the head blocker and blocked sessions
declare @count int,@loop int,@runtime datetime
select @count = count(*) from tmp_runtimes
set @loop = 1
while (@loop <= @count)
begin
select @runtime = runtime from tmp_runtimes where row_num = @loop

print '~~~~~~~~~~~~~~~ Runtime: ' + convert(varchar(23), @runtime, 121)
print ''
print '~~~ Head Blocker'

select a.session_id,a.ecid,a.blocking_ecid,wait_type,wait_duration_ms,command,b.dbname,b.procname,b.stmt_text
from tbl_requests a
left outer join tbl_notableactivequeries b
on a.runtime = b.runtime and a.session_id = b.session_id
where blocking_session_id = 0 and a.runtime = @runtime and a.session_id in
(select blocking_session_id
from tbl_requests where blocking_session_id <> 0 and runtime = @runtime)

print ''
print '~~~ Blocked Session Information'
select *
from tbl_requests a
left outer join tbl_notableactivequeries b
on a.runtime = b.runtime and a.session_id = b.session_id
where a.blocking_session_id <> 0
and a.runtime = @runtime
order by a.session_id,a.blocking_session_id

set @loop = @loop + 1

end

drop table tmp_runtimes

end

else
begin
print 'No Blocking found'
end

set nocount off

With the help of the above output, you can get a collated view of all the blocking chains encountered during the data collection period. CSS engineers at Microsoft use SQL Nexus extensively for analyzing diagnostic data collected for SQL Server performance issues. I hope you find this script useful in analyzing blocking chains, and do keep the feedback coming in. The above script is available for download on SkyDrive as well.

Please note that the PerfStats script collects data at 10-second intervals by default! If your blocking chains last for shorter durations, you will need to change the waitfor delay interval. If your blocking lasts less than a second, then data collection using the DMVs would not be of much help and you would need to go down the path of Extended Events tracing or other more invasive forms of data collection (like the XPerf utility).

Tools Tips and Tricks #6: Custom Reports in SQL Nexus

Here is another post in the Tools Tips and Tricks series, this time about a SQL Nexus feature that is not widely used. Yes, SQL Nexus is one of my favorite tools (provided it is used correctly). Automating the analysis helps you get the picture quickly, but the correlation of the data still has to be drawn by the person performing the analysis.

Today I shall show you how to use the custom reports feature by creating simple RDL files in Visual Studio and getting SQL Nexus to use them.

As you can see in the screenshot below, I have a few reports which are not part of the standard installation of SQL Nexus available on the CodePlex site.

I built this report ages ago, when RML Utilities didn’t have the option of drilling through the Interesting Events and surfacing the sub-class reason for each Interesting Event.


The report structure itself is quite simple. It uses the shared data source "sqlnexus.rds" so that the database context can switch when you change the database name using the drop-down menu in the SQL Nexus tool. I then created an Action Event to jump to a child report called "Event Drilldown.rdlc". This gives you a view like the one in the screenshot below, showing on which database the AutoStats event was fired and what the reason was for AutoStats kicking in.


Once you have the custom reports built, you can drop them in the following folder: %appdata%\SQLNexus\Reports. SQL Nexus will automatically pick up these reports when it is launched.

The two files that are used in the above example can be downloaded from here. You can dissect the structures of the report to understand what queries were used to fetch the data for the reports shown above.

The ReadTrace_Main report for RML Utilities now provides this drill-down function through the “Interesting Events” link on the main dashboard under “Additional Reports”.

Have a good weekend and stay tuned to this series for more tips and tricks next week!

Download link for RDL files: Client Report Definition (.rdlc) Files

Tools Tips and Tricks #3: Custom Rowsets using SQL Nexus

This post is part three of the Tools Tips and Tricks series that I started last week. In my T-SQL Tuesday post for this month, I explained how I always have the inclination to import collected data into a SQL Server database. I shall touch upon this yet again, but this time through the use of SQL Nexus. I am going to use a small script to collect data about the user requests executing queries against a particular SQL Server instance. The script that I used to collect the data is shown below:


set nocount on
go

while (1=1)
begin

-- Rowset identifier: the importer uses this marker to map the
-- result set that follows to a target table
print '-- RequestsRowset'

select GETDATE() as runtime,a.session_id as session_id,
a.start_time as start_time,
a.[status] as [status],
a.command as command,
a.database_id as database_id,
b.objectid
from sys.dm_exec_requests a
cross apply sys.dm_exec_sql_text(sql_handle) b
where session_id <> @@spid

-- An empty RAISERROR ... WITH NOWAIT flushes the output buffer
-- so each snapshot is written out immediately
raiserror ('', 0, 1) with nowait

-- Capture interval between snapshots
waitfor delay '00:00:05'
end

I have stored the output captured by the above query in a file called CustomRowset.OUT. Using the Edit Custom Rowset option in the SQL Nexus UI (available in the third expandable tab, named Data, on the left hand side), I can pull up a UI where I specify the table name into which the data needs to be imported (tbl_RequestsExample in this case) and the identifier for the data which needs to be imported (-- RequestsRowset in this case). I was executing a WAITFOR DELAY command from another session while the above script was capturing data. I then import the data into a SQL Nexus database using the Import option. Once this is done, I can query the database and look into the results which were imported (see screenshot below). You can extend this functionality to any degree you want and even combine multiple T-SQL commands to capture different result sets in the same loop. You just need to make sure that the rowset identifier for each query result set is unique, as in the sketch below.
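Here is a minimal sketch of that multi-rowset pattern. The second rowset, its identifier, and the columns chosen from sys.dm_os_schedulers are illustrative additions of mine, not part of the original collection script:

set nocount on
go

while (1=1)
begin

-- First rowset identifier and its result set
print '-- RequestsRowset'
select GETDATE() as runtime, session_id, [status], command
from sys.dm_exec_requests
where session_id <> @@spid

-- Second rowset identifier; must be unique so the importer
-- maps this result set to its own table
print '-- SchedulersRowset'
select GETDATE() as runtime, scheduler_id,
current_tasks_count, runnable_tasks_count
from sys.dm_os_schedulers
where scheduler_id < 255 -- exclude hidden/internal schedulers

raiserror ('', 0, 1) with nowait

waitfor delay '00:00:05'
end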

Additionally, if you attempt to capture outputs which have columns with data type length greater than 8000, then the import will fail with the following error in the SQL Nexus log file:

SQLNexus Information: 0 : RowsetImportEngine Error: An unexpected error has occurred:

System.Data.SqlClient.SqlException: The size (8192) given to the column 'query_text' exceeds the maximum allowed for any data type (8000).

SQL Nexus 3.0 doesn’t give you the option to add your own column data types using the UI. Using the form (shown on the left) will treat all columns as varchar. If you want to change this behavior, modify C:\Users\<user name>\AppData\Roaming\sqlnexus\TextRowsetsCustom.xml directly to add or modify the data types that you want.

Where is this helpful?
Let’s say I decided to collect the output of a customized T-SQL script for an extended period of time. Instead of scouring the .txt or .out file manually in a text editor, I can import the data into a table and then run queries on it, saving myself a hair-raising experience and valuable time!

How do I make sure that the data is imported correctly?
1. If any result set contains data larger than varchar(8000), make sure to modify TextRowsetsCustom.xml before you import the data.
2. Give each rowset that you collect a unique identifier. You don’t want the importer to mix-n-match the data you are importing.
3. Add a runtime column using GETDATE() or a variable in scripts that capture data in a loop, so that you can track the trend easily without having to second-guess.
4. All columns collected in the result set(s) have to be named.
5. Avoid CR/LFs in the result set, i.e. don’t use CHAR(13) in your T-SQL script while capturing the data, as the importer treats CR/LFs as end-of-row indicators and this seriously confuses it.
6. Either capture the data directly into a file by running the data collection script from Management Studio, or use the sqlcmd -W parameter if you are capturing the data using sqlcmd. -W ensures that trailing spaces are removed from the result sets.

That is all I have for today. Happy customizing and importing!

SQL Nexus: What StartTimeInterval?

When I get asked the same question more than 3 times in less than 48 hours, I guess it’s time to create some documentation around the error.

The issue is that when you click on the UniqueStatements link in the ReadTrace_Main report in SQL Nexus, you get an exception.

If you collect a profiler trace, you will find that the error is raised while executing the following command:

exec ReadTrace.spReporter_StmtTopN @StartTimeInterval, @EndTimeInterval, @TopN, @Filter1, @Filter2, @Filter3, @Filter4, @Filter1Name, @Filter2Name, @Filter3Name, @Filter4Name

The error message would be:

Error: 137, Severity: 15, State: 2
Must declare the scalar variable "@StartTimeInterval".

The above error is also reported in the exception message raised by SQL Nexus. The issue is not with SQL Nexus but with the RML Utilities report. You can run into this issue under the following conditions:

1. There are no entries in the readtrace.tblStatements and readtrace.tblTimeIntervals tables (a quick check for this is sketched after this list).
2. You are using an older version of ReadTrace (RML Utilities).
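To check for the first condition, you can run a quick query like the one below against the SQL Nexus database. This is a minimal sketch; if either count is zero, the report has no time intervals or statements to parameterize the drill-down with.

select (select count(*) from readtrace.tblStatements) as statement_rows,
(select count(*) from readtrace.tblTimeIntervals) as time_interval_rows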

I was able to reproduce the issue with ReadTrace.exe build 9.01.0109.

The issue stems from the fact that when you click on the UniqueStatements report link and no parameters are specified, the Start Time and End Time to be analyzed are picked up from tblTimeIntervals. I was able to reproduce the issue on the same SQL Nexus database using one version of the ReadTrace reports but not with the other. You can send an email to sstlbugs@microsoft.com and request the latest build of RML Utilities, attaching the ReadTrace and SQL Nexus log files. The logs can be retrieved by clicking on the hyperlinks in SQL Nexus shown in the picture on the right.
