Archive for the ‘Tools’ Category

Running SQL Nexus using Command Line Parameters


SQL Nexus is a tool that helps you identify the root cause of SQL Server performance issues. It loads and analyzes performance data collected by SQLDiag and PSSDiag, and it can dramatically reduce the amount of time you spend manually analyzing the data. One of the lesser known facts is that SQL Nexus also allows you to run its reports from the command line and even import diagnostic data automatically. The command line help reference is shown below:



/Sserver - Specifies a SQL Server name to connect to

/D"database" - Database to connect to

/E - Log in to SQL Server using Windows/integrated security

/Uuser - Specifies a SQL (non-Windows) login name

/Ppassword - Specifies the password for a SQL (non-Windows) login

"/Cconnstring" - Specifies the SqlClient ConnectionString to use (can be used instead of /S /E)

"/Ipath" - Import SQL diagnostic data from this path

"/Rreport" - Specifies a report file name to run

"/Opath" - Specifies an export path for reports executed via /R

/X - Exit after importing (/I) or exporting (/O) the specified report (/R)

/Vparam=value - Specify the value of a form parameter

Automating Report Execution

The first scenario I want to outline is automating report execution. This is especially useful when you need to run the reports repeatedly, for example after performance test runs or during any other iterative activity that requires analyzing multiple sets of diagnostic data.

This can be achieved using the command line parameters shown below:

sqlnexus.exe /S"." /X /D"sqlnexus" /R"<full or relative path>\Summary.rdl" /O"<path where the log file and report should be exported>"

As you can see from the screenshot below, I have a standard report, "Bottleneck Analysis_C.xls", that was exported, along with a non-standard report I created, Summary.xls, in the same folder. This also gives you the opportunity to create your own performance reports and schedule them using the SQL Nexus executable.

Report Output

After you have the exported reports, you could also write additional automation to email the reports as required.
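
For example, if Database Mail is configured on the instance, a small T-SQL step scheduled after the export can send the files out. The sketch below is only an illustration: the mail profile name, recipient and attachment path are hypothetical values, not something SQL Nexus creates for you.

-- Minimal sketch: e-mail an exported report using Database Mail (sp_send_dbmail).
-- The profile name, recipient and attachment path are hypothetical; replace them with your own.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name     = 'SQLNexusMailProfile',
    @recipients       = 'dba-team@contoso.com',
    @subject          = 'SQL Nexus report export',
    @body             = 'Latest Summary report exported via sqlnexus.exe /R /O attached.',
    @file_attachments = 'C:\temp\reports\Summary.xls';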

Automating Data Import

SQL Nexus also allows you to perform data imports automatically using command line parameters. The command line shown below imports data from the specified folder into a SQL Server database called sqlnexustest on the default SQL Server instance installed on the machine.

sqlnexus.exe /S"." /X /D"sqlnexustest" /I"C:\temp\output"

If the database does not exist on the SQL Server instance, it will be created. The data import uses the SQL Nexus settings that were saved the last time the UI was used. So if you disabled the profiler trace import from the UI and saved your settings, the command line execution will not import the profiler traces.
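
Once the import finishes, it is easy to confirm what actually landed in the database. The sketch below simply lists the tables created in the target database; the exact table names depend on which collectors were enabled in the PSSDiag/SQLDiag capture, so treat it as a quick sanity check rather than a fixed list.

-- Quick sanity check of what the command line import created.
-- The tables present vary with the collectors enabled during data capture.
USE sqlnexustest;
SELECT name AS TableName, create_date
FROM sys.tables
ORDER BY create_date DESC;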

Posted May 12, 2015 by Amit Banerjee in Tools


WOOT: Default Trace and Power View


I have been working on building visualizations for various kinds of analysis that I perform for my customers. One such useful visualization was the use of Power View for analyzing the data available in the SQL Server Default Trace. The query below lets you retrieve all the information in the default traces. This same query is used to populate the Power Pivot table in the Excel file.


declare @enable int

-- Check whether the default trace is enabled
select top 1 @enable = convert(int, value_in_use) from sys.configurations where name = 'default trace enabled'

if @enable = 1 -- default trace is enabled
begin

    declare @d1 datetime;
    declare @diff int;
    declare @curr_tracefilename varchar(500);
    declare @base_tracefilename varchar(500);
    declare @indx int;

    -- Locate the folder of the current default trace file and point to the base rollover file (log.trc)
    select @curr_tracefilename = path from sys.traces where is_default = 1;

    set @curr_tracefilename = reverse(@curr_tracefilename)
    select @indx = PATINDEX('%\%', @curr_tracefilename)
    set @curr_tracefilename = reverse(@curr_tracefilename)
    set @base_tracefilename = LEFT(@curr_tracefilename, len(@curr_tracefilename) - @indx) + '\log.trc';

    -- Read all default trace files and decode the event class and category names
    select EventCat.name as Category, EventID.name as EventName, Events.*
    from ::fn_trace_gettable(@base_tracefilename, default) Events
    inner join sys.trace_events EventID
        on Events.EventClass = EventID.trace_event_id
    inner join sys.trace_categories EventCat
        on EventID.category_id = EventCat.category_id

end

Once I had all the trace data available in my Power Pivot table, I created calculated columns for Day, Hour and Minute. With the data readily available, I went about creating the main dashboard, which provides a view of all the events that occurred along with a timeline view. All this took me less than five minutes after I had finished writing the query! Now I have an interactive report that I can use for various kinds of analysis.
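
If you prefer to derive the Day, Hour and Minute buckets in the query itself rather than as Power Pivot calculated columns, the SELECT list of the query above can be extended along these lines (a sketch based on the StartTime column returned by fn_trace_gettable):

-- Sketch: compute the Day/Hour/Minute buckets in T-SQL instead of Power Pivot.
-- This extends the SELECT list of the default trace query shown above.
select EventCat.name as Category, EventID.name as EventName,
       DATEPART(day, Events.StartTime) as [Day],
       DATEPART(hour, Events.StartTime) as [Hour],
       DATEPART(minute, Events.StartTime) as [Minute],
       Events.*
from ::fn_trace_gettable(@base_tracefilename, default) Events
inner join sys.trace_events EventID
    on Events.EventClass = EventID.trace_event_id
inner join sys.trace_categories EventCat
    on EventID.category_id = EventCat.category_id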

The screenshot below shows that there was only one event raised for the Server event category, and the actual time of occurrence is shown in the line graph. A simple mouse-over on the data point gives you the exact details. Now isn't that a simple way to track down events?

[Screenshot: Power View dashboard showing default trace events by category and time]

I will provide a final version of the Excel sheet once I have completed the other dashboards and sanitized the information available in the Power Pivot table.

PowerView and System Health Session – System


Previous posts in this series:

PowerView and System Health Session–CPU health

PowerView and System Health Session–Scheduler Health

PowerView and System Health Session–SQL Memory Health

PowerView and System Health Session– IO Health

In the last post in this series, I explained how to retrieve I/O statistics from the System Health Session data. In this post, I will describe how to build a dashboard using the SYSTEM component of the sp_server_diagnostics output. This view will help DBAs track various errors that can send their blood pressure shooting to abnormal levels. The SYSTEM component tracks errors such as non-yielding conditions, latch-related warnings, inconsistent pages detected and access violations for the SQL Server instance.

Armed with this information in a Power Pivot table, I created two calculated columns, DAY and HOUR, based on the time the event was reported. After that I created KPIs on the maximum number of non-yielding conditions, latch-related warnings, inconsistent pages and access violations reported.

With the Power Pivot data in place, I created a new Power View sheet which tracks these KPIs for each day and hour. The screenshot below shows the final view.

The first section is a 100% stacked bar graph showing the various errors that were reported each day. There is a slicer for Day which allows you to filter the data quickly.

The second section of the report is a matrix which shows the KPI status for each day, with drill-down capability to the hour.

The third section of the report shows a card view with the actual number of issues reported for each event at a particular time.

As usual the Excel sheet is available on SkyDrive at: http://sdrv.ms/10O0udO

Issue Statistics

The query to fetch the data required to build this report is available below.


SET NOCOUNT ON

-- Fetch data only for SQL Server 2012 (or later) instances
IF (SUBSTRING(CAST(SERVERPROPERTY('ProductVersion') AS varchar(50)), 1, CHARINDEX('.', CAST(SERVERPROPERTY('ProductVersion') AS varchar(50))) - 1) >= 11)
BEGIN

    -- Get UTC time difference for reporting event times local to server time
    DECLARE @UTCDateDiff int = DATEDIFF(mi, GETUTCDATE(), GETDATE());

    -- Store the XML data retrieved from the system_health session ring buffer in a temp table
    SELECT TOP 1 CAST(xet.target_data AS XML) AS XMLDATA
    INTO #SystemHealthSessionData
    FROM sys.dm_xe_session_targets xet
    JOIN sys.dm_xe_sessions xe
        ON (xe.address = xet.event_session_address)
    WHERE xe.name = 'system_health'
        AND xet.target_name = 'ring_buffer';

    -- Parse the XML data and provide the required values in the form of a table
    ;WITH CTE_HealthSession (EventXML) AS
    (
        SELECT C.query('.') EventXML
        FROM #SystemHealthSessionData a
        CROSS APPLY a.XMLDATA.nodes('/RingBufferTarget/event') as T(C)
    )
    SELECT
        DATEADD(mi, @UTCDateDiff, EventXML.value('(/event/@timestamp)[1]', 'datetime')) as [Event Time],
        EventXML.value('(/event/data/text)[1]', 'varchar(255)') as Component,
        EventXML.value('(/event/data/value/system/@latchWarnings)[1]', 'bigint') as [Latch Warnings],
        EventXML.value('(/event/data/value/system/@isAccessViolationOccurred)[1]', 'bigint') as [Access Violations],
        EventXML.value('(/event/data/value/system/@nonYieldingTasksReported)[1]', 'bigint') as [Non Yields Reported],
        EventXML.value('(/event/data/value/system/@BadPagesDetected)[1]', 'bigint') as [Bad Pages Detected],
        EventXML.value('(/event/data/value/system/@BadPagesFixed)[1]', 'bigint') as [Bad Pages Fixed]
    FROM CTE_HealthSession
    WHERE EventXML.value('(/event/@name)[1]', 'varchar(255)') = 'sp_server_diagnostics_component_result'
        AND EventXML.value('(/event/data/text)[1]', 'varchar(255)') = 'SYSTEM'
    ORDER BY [Event Time];

    DROP TABLE #SystemHealthSessionData

END
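
If you want to cross-check the KPI figures outside Power Pivot, the parsed output can be aggregated by day and hour directly in T-SQL. The sketch below assumes the result of the query above has been saved into a table named #SystemComponentResults (a hypothetical name, for example via SELECT ... INTO); it reports the maximum value of each counter per day and hour, mirroring the KPIs described earlier.

-- Sketch: cross-check the KPI values per day and hour in T-SQL.
-- #SystemComponentResults is a hypothetical table holding the output of the query above.
SELECT
    DATEPART(day, [Event Time])  AS [Day],
    DATEPART(hour, [Event Time]) AS [Hour],
    MAX([Latch Warnings])        AS [Max Latch Warnings],
    MAX([Access Violations])     AS [Max Access Violations],
    MAX([Non Yields Reported])   AS [Max Non Yields Reported],
    MAX([Bad Pages Detected])    AS [Max Bad Pages Detected],
    MAX([Bad Pages Fixed])       AS [Max Bad Pages Fixed]
FROM #SystemComponentResults
GROUP BY DATEPART(day, [Event Time]), DATEPART(hour, [Event Time])
ORDER BY [Day], [Hour];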
