WOOT: Schema Changes History Report on Power View


The last post in this series talked about using Power View to analyze the data stored in SQL Server's default trace. I decided to take this a step further by creating the Schema Changes History report from the data retrieved from the default trace. The advantage of a report built in Power View is that it gives you the interactivity that the standard report lacks.

The way I created this report was to filter the data in the Power Pivot table on EventClass IDs 46, 47 and 164, so that I only look at the create, drop and alter commands that the default trace tracks. After that I created a table with a tile on the Database Name and a 100% Stacked Bar Chart to show the activity at a database level.
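If you prefer to push that filter into the source query instead of filtering inside the Power Pivot model, a minimal sketch would be to add a WHERE clause to the default trace query from the previous post (the @base_tracefilename variable is the one built in that query, which is reproduced further down this page):

select EventCat.name as Category, EventID.name as EventName, Events.*
from sys.fn_trace_gettable(@base_tracefilename, default) Events
inner join sys.trace_events EventID
on Events.EventClass = EventID.trace_event_id
inner join sys.trace_categories EventCat
on EventID.category_id = EventCat.category_id
-- 46 = Object:Created, 47 = Object:Deleted, 164 = Object:Altered
where Events.EventClass in (46, 47, 164)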

I also had to create linked tables to provide the Object Type and the Event Class Name that you see in the table below.
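A sketch of the queries that such linked lookup tables could be sourced from, assuming you want to drive them off the trace catalog views:

-- Event class names for every trace event
select trace_event_id, name as EventClassName
from sys.trace_events;

-- Friendly names for the ObjectType trace column values
select distinct sv.subclass_value as ObjectType, sv.subclass_name as ObjectTypeName
from sys.trace_subclass_values sv
inner join sys.trace_columns tc
on sv.trace_column_id = tc.trace_column_id
where tc.name = 'ObjectType';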

I will provide a final version of the Excel sheet once I have completed the other dashboards and sanitized the information available in the Power Pivot table.

[Screenshot: SchemaChangesHistory dashboard in Power View]

Previous Post in the Series:

Default Trace Dashboard
https://troubleshootingsql.com/2013/09/26/woot-default-trace-and-power-view/


WOOT: Default Trace and Power View


I have been working on building visualizations for the various kinds of analysis that I perform for my customers. One such useful visualization is the use of Power View to analyze the data available in the SQL Server default trace. The query below retrieves all the information from the default trace files, and the same query is used to populate the Power Pivot table in the Excel file.


declare @enable int

-- Check whether the default server-side trace is enabled
select top 1 @enable = convert(int, value_in_use)
from sys.configurations
where name = 'default trace enabled'

if @enable = 1 -- default trace is enabled
begin

declare @curr_tracefilename varchar(500);
declare @base_tracefilename varchar(500);
declare @indx int;

-- Path of the trace file the default trace is currently writing to
select @curr_tracefilename = path from sys.traces where is_default = 1;

-- Strip the rollover file name and point at log.trc so that
-- fn_trace_gettable reads the complete set of default trace files
set @curr_tracefilename = reverse(@curr_tracefilename);
select @indx = patindex('%\%', @curr_tracefilename);
set @curr_tracefilename = reverse(@curr_tracefilename);
set @base_tracefilename = left(@curr_tracefilename, len(@curr_tracefilename) - @indx) + '\log.trc';

-- Return every default trace event along with its category and event name
select EventCat.name as Category, EventID.name as EventName, Events.*
from sys.fn_trace_gettable(@base_tracefilename, default) Events
inner join sys.trace_events EventID
on Events.EventClass = EventID.trace_event_id
inner join sys.trace_categories EventCat
on EventID.category_id = EventCat.category_id

end

Once I had all the trace data available in my Power Pivot table, I created calculated columns for Day, Hour and Minute. With all the data readily available, I went about creating the main dashboard which provides a view of all the events that occurred along with a timeline view. All this took me less than 5 minutes after I had finished writing the query! Pretty quick. Now I have an interactive report that I can use for performing various kinds of analysis.
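If you would rather compute those columns in the source query instead of as Power Pivot calculated columns, a sketch using the StartTime trace column (and the @base_tracefilename variable from the query above) would be:

select convert(date, StartTime) as [Day],
datepart(hour, StartTime) as [Hour],
datepart(minute, StartTime) as [Minute],
*
from sys.fn_trace_gettable(@base_tracefilename, default);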

The screenshot below shows that there was only one event raised for the Server event category, and the actual time of occurrence is shown in the line graph. A simple mouse-over on the point will give you the exact details. Now isn't that a simple way to track down events!

[Screenshot: Default trace dashboard in Power View with the event timeline]

I will provide a final version of the Excel sheet once I have completed the other dashboards and sanitized the information available in the Power Pivot table.

Tools Tips and Tricks #4: RML Utilities


It is time again for your friendly neighborhood (or virtual neighborhood) SQL CSS person to provide you with the next set of tips for the various tools that he uses in his day-to-day work. I probably use this tool, RML Utilities, or the database it creates at least once a day and sometimes even more. Not kidding!!

This tool has become synonymous in CSS with the analysis of Profiler traces collected from customer environments. Most of you might be familiar with the tool SQL Nexus, which calls RML Utilities in the background to load the collected Profiler traces into a SQL Server database; the actual import is done by an executable, ReadTrace.exe. In this post, I will talk about a few conditions when you should call ReadTrace.exe directly instead of invoking it from SQL Nexus.

[Screenshot: SQL Nexus trace import options]

As you can see above, SQL Nexus gives you the option of breaking down the activity of each SPID into individual .trc files, but they can only be directed to the %TEMP%\RML folder on your machine. The previous server that I had didn't have sufficient disk space to hold over 100 GB of data in the TEMP directory. Yes, we do analyze data which runs into a few hundred GBs at times!

Now you have two options: change the %TEMP% environment path on your machine to a different drive, or use ReadTrace.exe directly to generate the .trc files in the path you need.
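As an illustration (the drive and folder names here are hypothetical), you could redirect TEMP for the command prompt that launches SQL Nexus, or skip SQL Nexus and point ReadTrace at an output folder of your choice with -o:

rem Redirect TEMP/TMP for this command prompt, then launch SQL Nexus from it
set TEMP=D:\RMLTemp
set TMP=D:\RMLTemp

rem Or call ReadTrace directly and let it write its output elsewhere
readtrace -S. -E -dPerfAnalysis -I"D:\RML\RMLUtil_demo.trc" -o"D:\RML\output" -MF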

Another option that cannot be controlled from SQL Nexus is the ability to add SPID, Hostname and Application filters while importing the data.

Some of the ReadTrace defaults, which apply to any profiler trace import, are shown below:

-o Output files to be generated in the current directory
-S Connect to the default SQL Server instance on the machine
-d Load profiler data into the PerfAnalysis database
-E Use Windows authentication while connecting
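With those defaults in place, a minimal invocation (the trace file path below is just an illustration) can be as simple as:

readtrace -I"D:\RML\RMLUtil_demo.trc"

which imports the trace into the PerfAnalysis database on the local default instance using Windows authentication, writing the output files to the current directory.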

Here are a few scenarios where I use ReadTrace.exe directly:

Analyzing a deadlock scenario where I know the Session IDs involved

Sometimes the deadlock graph is not sufficient to figure out why a lock is still held by that session and why that session acquired the lock on the object in the first place. At that point, you need to track down the sequence of statements executed by the session. In this case, I will use the following command to import the data.

readtrace -S. -E -ddbRMLUtilDemo
-I"D:\RML\RMLUtil_demo.trc" -A"!SQLCMD" -s53 -s55 -MF -o"D:\RML\output" -f

The above command will import the trace files into a database dbRMLUtilDemo on the default instance on the machine with a SPID filter of 53 and 55 and exclude events generated with the application name SQLCMD. Additionally, individual .trc files will be generated containing the activity of SPIDs 53 and 55.

Generating .RML files to be used by OStress

Use the -o parameter to change the output path, and use the -s, -A and -H filters to apply the necessary SPID, application name and host name filters respectively. Below is an example that applies filters for SPIDs 51, 52 and 53 and excludes events generated by the application name PSSDIAG or the hostname MyLaptop.

readtrace -S. -E -ddbRMLUtilDemo
-I"D:\RML\RMLUtil_demo.trc" -A"!SQLCMD" -s51 -s52 -s53 -o"D:\RML\output" -H"!MyLaptop" -A"!PSSDIAG"

The command line window and the ReadTrace.log file available in the output directory will confirm whether the filters were applied correctly. Below is a screenshot of the command line window showing that the filters were applied correctly:

[Screenshot: ReadTrace command line output showing the filters being applied]

Handling MARS sessions

ReadTrace has limited processing ability for sessions using MARS (Multiple Active Result Sets). You have to use -f and -T35 to import data from trace files which contain MARS sessions. ReadTrace uses the formula ((Session ID * 10) + BatchID) so that each MARS batch appears as a separate session. E.g. if you see Session IDs 1620, 1621 and so on in the ReadTrace/SQL Nexus database but cannot find such a session in the DMV outputs captured during that time, then the activity corresponds to Session ID 162, which was using MARS.
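A quick way to translate such a synthetic session number back into the original session and batch, shown purely to illustrate the formula above:

-- 1621 = (162 * 10) + 1, i.e. Session ID 162, BatchID 1
select 1621 / 10 as OriginalSessionID, 1621 % 10 as BatchID;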

Note: Remember to always use a server-side trace whenever possible, capturing the profiler trace output to a local disk.

The PDF help files for RML Utilities are located in the <system drive>:\Program Files\Microsoft Corporation\RMLUtils\Help folder by default on a machine where RML Utilities is installed, in case you want more information on the above.

RML Utilities Download
http://blogs.msdn.com/b/psssql/archive/2008/11/12/cumulative-update-1-to-the-rml-utilities-for-microsoft-sql-server-released.aspx

Keep following the tag "Tools Tips and Tricks" for more posts on the various tools that we use. That's it for today!