Default Trace–Performance Issues

The default trace in SQL Server 2005 and above tracks a number of events that can be very useful for identifying areas of improvement. The events that I will be concentrating on are listed below; a query to pull them out of the default trace follows the list.

1. Missing Column Statistics – This event class indicates that column statistics that could have been useful to the optimizer are not available. Without them the optimizer can make an incorrect cardinality estimate and choose a less efficient query plan than expected. This event is produced only when the option to auto-create statistics is turned off.

2. Missing Join Predicate – This event class indicates that a query is being executed that has no join predicate. (A join predicate is the ON search condition for a joined table in a FROM clause.) This could result in a long-running query. This event is produced only if both sides of the join return more than one row.

3. Sort Warnings – This event class indicates that a sort operation did not fit into memory. This does not include sort operations involved in creating indexes, only sort operations within a query (such as an ORDER BY clause used in a SELECT statement). The EventSubClass field shows whether this was a single pass or a multiple pass. A single pass (EventSubClass = 1) means the sort table was written to disk and only one additional pass over the data was required to obtain sorted output. A multiple pass (EventSubClass = 2) means the sort table was written to disk and multiple passes over the data were required to obtain sorted output. A multiple pass is an enemy of query performance.

4. Hash Warnings – This event class can be used to monitor when a hash recursion or cessation of hashing (hash bailout) has occurred during a hashing operation.  Hash recursion (EventSubClass = 0) occurs when the build input does not fit into available memory, resulting in the split of input into multiple partitions that are processed separately. Hash bailout (EventSubClass = 1) occurs when a hashing operation reaches its maximum recursion level and shifts to an alternate plan to process the remaining partitioned data. Hash bailout usually occurs because of skewed data. Another enemy of performance!

5. Server Memory Change – This event class occurs when Microsoft SQL Server memory usage has increased or decreased. It also tells you the new memory usage after the increase or decrease.

6. Log File Auto Grow – This event class indicates that the log file grew automatically. This event is not triggered if the log file is grown explicitly through ALTER DATABASE. Frequent log file growths are not good for performance.

7. Data File Auto Grow – This event class indicates that the data file grew automatically. This event is not triggered if the data file is grown explicitly by using the ALTER DATABASE statement.
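For reference, here is a minimal T-SQL sketch that reads these events straight out of the default trace. It assumes the default trace is enabled (it is by default) and simply maps the trace event IDs to the event names above; adjust the column list to taste.

DECLARE @TraceFile nvarchar(260);

SELECT @TraceFile = [path]
FROM sys.traces
WHERE is_default = 1;

SELECT te.name AS EventName,
       t.DatabaseName,
       t.EventSubClass,      -- single/multiple pass, hash recursion/bailout, etc.
       t.IntegerData,        -- growth size or new memory size, depending on the event
       t.TextData,
       t.StartTime,
       t.ApplicationName
FROM sys.fn_trace_gettable(@TraceFile, DEFAULT) AS t
INNER JOIN sys.trace_events AS te
        ON t.EventClass = te.trace_event_id
WHERE t.EventClass IN (55,   -- Hash Warning
                       69,   -- Sort Warnings
                       79,   -- Missing Column Statistics
                       80,   -- Missing Join Predicate
                       81,   -- Server Memory Change
                       92,   -- Data File Auto Grow
                       93)   -- Log File Auto Grow
ORDER BY t.StartTime;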

Since this information is already available in the default trace, I decided to use my Default Trace Statistics Power View Excel sheet to track it graphically. And this is what I got (see Screenshot 1)!

[Screenshot 1: DefaultTrace_PerfIssues]

So what is the above Excel sheet displaying?

1. The first column chart shows the Data File and Log File Auto Grow events per database.

2. The first matrix in the middle of the Excel sheet shows the number of Sort Warnings and Hash Warnings with drill-down capabilities for each database to see the EventSubClass fields.

3. The second matrix shows the Missing Column Statistics and the Missing Join Predicate events for each database. The drill-down capability gives the name of the column statistics that was missing.

4. The line graph shows the change in memory for the SQL Server database engine.

Happy monitoring!

Previous posts in this series:

Schema Changes History Report


PowerView and System Health Session–CPU health

Over the past few months, I have spent a lot of time using Excel 2013 and the Power View add-in to perform various kinds of analysis on the System Health Session data in SQL Server 2012. I thought it would be a good idea to create an Excel template that would let me do these activities at the click of a button. I had recently written a post on how to retrieve Report Server execution statistics using Power Pivot and Power View, and I will use the same approach to generate CPU usage statistics from the System Health Session.

Now, you may ask why I do not query the data directly into an Excel sheet. The reason is that pulling the data into PowerPivot first allows me to add calculated fields, and it also lets me write a query to fetch the data from the System Health Session that runs on SQL Server 2012 and above instances.

I had written an article on SSWUG on how to parse the scheduler_monitor_system_health_ring_buffer_recorded data and retrieve CPU usage statistics using an SSRS report. This time around I fetched the data into a PowerPivot table. I then added a new column that extracts the day of the month using the DAY function, and a calculated column that tracks the CPU usage of other processes, derived from the SQL process utilization and system idle values (in effect, 100 minus the two). The new column is named "CPU usage (other processes)".
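The scheduler monitor data that this event records is also surfaced through the RING_BUFFER_SCHEDULER_MONITOR ring buffer, so here is a minimal sketch of the kind of query that can feed the PowerPivot table. This is not the exact query from the SSWUG article; it assumes SQL Server 2008 or later, and the last column mirrors the "CPU usage (other processes)" calculation (which the post does as a calculated column in PowerPivot instead).

SELECT DATEADD(ms, -1 * (osi.ms_ticks - rb.[timestamp]), GETDATE()) AS EventTime,
       rb.record.value('(Record/SchedulerMonitorEvent/SystemHealth/ProcessUtilization)[1]', 'int') AS SQLProcessUtilization,
       rb.record.value('(Record/SchedulerMonitorEvent/SystemHealth/SystemIdle)[1]', 'int') AS SystemIdle,
       100 - rb.record.value('(Record/SchedulerMonitorEvent/SystemHealth/ProcessUtilization)[1]', 'int')
           - rb.record.value('(Record/SchedulerMonitorEvent/SystemHealth/SystemIdle)[1]', 'int') AS OtherProcessCPUUsage
FROM (SELECT [timestamp], CONVERT(xml, record) AS record
      FROM sys.dm_os_ring_buffers
      WHERE ring_buffer_type = N'RING_BUFFER_SCHEDULER_MONITOR'
        AND record LIKE N'%<SystemHealth>%') AS rb
CROSS JOIN sys.dm_os_sys_info AS osi
ORDER BY EventTime;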

Now that the PowerPivot table is ready, I used the created table as the source for a PowerView sheet in Excel 2013.

I created a clustered column chart to track the following information:

  • Max CPU utilization for the SQL Server instance
  • Max CPU usage of all other processes
  • Average CPU utilization for the SQL Server instance
  • Average CPU usage of all other processes
  • Minimum CPU utilization for the SQL Server instance
  • Minimum CPU usage of all other processes

The clustered column chart uses the calculated Day field to make the chart more reader friendly. The second chart in the sheet is a line graph which tracks the average CPU usage of the SQL Server instance and of other processes across a timeline. The slider at the bottom of the chart allows you to interactively zoom in and out of the time window you are interested in.

Here is a screenshot of the final output (Screenshot 1). The Excel file with this visualization is available on SkyDrive at http://sdrv.ms/10O0udO

[Screenshot 1: CPU usage statistics]

Stay tuned to TroubleshootingSQL.com for more such visualizations!

Report Server Analytics with PowerPivot

I recently needed to query report execution statistics for a SQL Server Reporting Services instance where the number of rows was quite large, so I decided to use Excel 2013 and PowerPivot to analyze the data. This lets me pull all of the data into an Excel sheet once and then perform the analysis without having to query the Report Server database repeatedly.

The first thing to do is to create a connection to your Report Server database. That can be done from the PowerPivot tab in Excel 2013: click on the Manage button (Screenshot 1).

[Screenshot 1]

Once you are in the PowerPivot window, you will need to establish a connection to your Report Server database. This can be done using the From Database drop-down list and choosing the From SQL Server option (Screenshot 2).

[Screenshot 2]

Create a connection to the SQL Server instance that hosts the Report Server database using the Table Import Wizard. In the Choose How to Import Data screen, select the second option, "Write a query that will specify the data to import". Once you click Next, provide a friendly name for the query and paste in the query shown below.

SELECT b.Name,
       a.UserName,
       CASE (a.RequestType)
            WHEN 0 THEN 'Interactive'
            WHEN 1 THEN 'Subscription'
            WHEN 2 THEN 'Refresh Cache'
            ELSE 'Unknown'
       END AS RequestType,
       a.Format,
       a.InstanceName,
       CASE (a.ReportAction)
            WHEN 1 THEN 'Render'
            WHEN 2 THEN 'BookmarkNavigation'
            WHEN 3 THEN 'DocumentMapNavigation'
            WHEN 4 THEN 'DrillThrough'
            WHEN 5 THEN 'FindString'
            WHEN 6 THEN 'GetDocumentMap'
            WHEN 7 THEN 'Toggle'
            WHEN 8 THEN 'Sort'
            WHEN 9 THEN 'Execute'
            WHEN 10 THEN 'RenderEdit'
            ELSE 'Unknown'
       END AS ItemAction,
       a.TimeStart,
       a.TimeEnd,
       a.TimeDataRetrieval,
       a.TimeProcessing,
       a.TimeRendering,
       a.[Status],
       a.ByteCount,
       a.[RowCount]
FROM dbo.ExecutionLogStorage a
INNER JOIN dbo.[Catalog] b
        ON a.ReportID = b.ItemID

The data returned by the query is now imported into the PowerPivot table. Add a calculated column that sums the TimeDataRetrieval, TimeProcessing and TimeRendering columns (for example, =[TimeDataRetrieval] + [TimeProcessing] + [TimeRendering]) so that the total report duration is available at a single glance.

Now save the PowerPivot data and add a new PowerView sheet called "Execution Stats". This allows you to create a view like the one shown in Screenshot 3.

[Screenshot 3]

Leverage PowerView’s interactive capabilities to view the Report execution statistics, the report processed status (failure or success) and the processing, rendering and data retrieval times for each report.

I have uploaded the file to SkyDrive which would allow you to connect to your own Reporting Services database and use the same report format for analysis. Stay tuned for additional PowerView sheets to give you various slicing and dicing options on report execution statistics.

Excel File Download Link: http://sdrv.ms/17ZmbK0 (Filename: SSRS_Exec_Stats.xlsx) Works with Excel 2013 only.

Another saga for corrupt images

I had blogged in the past on how to track corrupt images while rendering an SSRS report containing images. In this post, I shall talk about generating reports for images which have a corrupted JPEG stream.

If you have an image with a corrupt JPEG stream, the report will render without any issues and an export of the report to a PDF document will report no errors. However, when you open the PDF document, you will be greeted with the error seen in the screenshot below, reporting "An internal error occurred":

[Screenshot: "An internal error occurred" message when opening the exported PDF]

This can be a bit disconcerting when you are automating your report generation and are not made aware of any exceptions. Interestingly, when the report is rendered in MHTML format the file opens without issues, although you still see the corrupt image. The crux of the matter lies in how PDF documents handle a corrupt JPEG stream: the same problem occurs when creating a PDF document from such a corrupt image stream outside of SQL Server Reporting Services as well.

Now that we know that there are image encoding standards at play here and that not much can be done about the PDF behavior itself, what are the options?

For the curious mind, the image when viewed in a picture viewing application shows the corrupt stream as seen below:

[Screenshot: the corrupt image as displayed in a picture viewer]

The first option is to use an image processing executable to validate all the images used by your report before you render it to PDF.

ImageMagick DLLs and binaries are one such option. You could either use the binaries available on the site or build your own picture verification utility using their DLLs; their code is available on CodePlex as well. The binaries include an executable called "identify", which can be used to detect corrupt image streams (for example, identify.exe C:\Tempdb\1.jpg). The output of the tool when run against a corrupt image looks like this:

identify.exe: Corrupt JPEG data: bad Huffman code `C:\Tempdb\1.jpg' @ warning/jpeg.c/JPEGWarningHandler/346.
identify.exe: Corrupt JPEG data: 20 extraneous bytes before marker 0xd9 `C:\Tempdb\1.jpg' @ warning/jpeg.c/JPEGWarningHandler/346.

The other way to work around this issue is to use custom code within your report to re-save the image. This doesn't repair the corrupt JPEG stream (the corrupted portion you see in the screenshot above simply becomes part of the re-saved image), but it does allow the PDF documents to be generated correctly and opened for viewing later. If you want to correct the corrupt JPEG stream itself, you will need to store the correct image in your database.

Steps for using custom code to prevent PDF opening failure

1. Paste the VB code below in the Code section of the report.

' The database returns the image data as a byte array, so this function accepts
' a byte array as its parameter and returns the re-saved image as a byte array.
Public Shared Function ConvertImage(ByVal imgData As Byte()) As Byte()

    ' Stream to hold the source image (jpeg).
    Dim msSource As New System.IO.MemoryStream()

    ' Stream to hold the converted image (png).
    Dim msDest As New System.IO.MemoryStream()

    ' Write the image byte array into the source stream and rewind it
    ' so that it can be read from the beginning.
    msSource.Write(imgData, 0, imgData.Length)
    msSource.Flush()
    msSource.Position = 0

    ' Image object that will hold the image extracted from the source stream.
    Dim imgSource As System.Drawing.Image = System.Drawing.Image.FromStream(msSource)

    ' Save the new, converted image into the destination stream.
    imgSource.Save(msDest, System.Drawing.Imaging.ImageFormat.Png)

    ' Return the destination stream as a byte array, to be used inside the report's image control.
    ' ToArray() returns only the bytes actually written (GetBuffer can include unused buffer space).
    Return msDest.ToArray()

End Function

2. Now click on the References tab. Under Assembly name, click on the ellipsis button ("…").

3. Locate and select System.Drawing.dll, click on Add and then click OK.

4. Now go to the report, open the Image control properties and click on the Value expression, which will look something similar to this: =First(Fields!Photo.Value, "DataSetName")

5. Replace the above value with the expression =Code.ConvertImage(First(Fields!Photo.Value, "DataSetName")).

6. Export the report to PDF and you will now notice that the previous “internal error” is not reported.

Since you are adding custom code to your report to re-save your binary image stream, you will incur additional processing overhead while generating the reports. If your PDF contains a large number of images, you will have to factor in the additional resources and time consumed to perform this sanitization process!

IMHO, the first option of checking the correctness of the JPEG stream with a separate executable is much more feasible if you are processing 1000+ images in a single report.

A special shout-out to my colleagues Selva [Blog] and Arvind [Blog] for helping me out on this issue!

Disclaimer: The Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. Image Magick is a third-party utility which has been referenced in this post. Please refer documentation on the Image Magick website regarding licensing, warranty and usage before implementing the use of the same in your environment.

 

Oh Image: Art thou valid?

Last week I had written a post on how to determine corrupt binary streams of images stored in a database: The case of the corrupt images. The main goal of that post was to identify the corrupt binary streams in the database before the SSRS reports accessed the inconsistent data while rendering and turned green in the face! So if you don't have control over what is getting stuffed into the database (which is seriously a bad situation to be in), here is another option for you if you are ready to use custom assemblies or custom code. I have decided to use a custom assembly to illustrate what I mean in this blog post. Ideally, the image validation check should happen in the front-end application/website from where the image is uploaded into the database.
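As a cheap first-pass check (and definitely not a replacement for the validation described below), you can flag rows whose image column is NULL or zero-length directly in T-SQL before a report ever touches them. The table name here is hypothetical and the column names are borrowed from the sample report's dataset fields used later in this post; this check cannot spot an image whose bytes are present but whose JPEG stream is corrupt, which is what the custom assembly handles.

SELECT SNO,
       DATALENGTH(Document) AS ImageSizeInBytes
FROM dbo.ImageStore           -- hypothetical table holding the report images
WHERE Document IS NULL
   OR DATALENGTH(Document) = 0;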

I am going to use a simple report which pulls a set of images from a database and displays them. I have two versions of the same report: the first report, "Image_NoVerify", doesn't use the custom assembly, whereas the "Image" report uses the fnImageValid function defined in VerifyImage.dll to determine the validity of each image.

The source code for the custom assembly is shown below:

/*
This Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys' fees, that arise or result from the use or distribution of the Sample Code.
*/

using System;
using System.Drawing;
using System.IO;
using System.Security.Permissions;

namespace VerifyImage
{
    public static class Class1
    {
        public static String fnImageValid(byte[] vImgStream)
        {
            // The file name used to stage the image on disk.
            String vFilePath = @"C:\1.jpg";

            // File system permissions asserted to prevent security exceptions while saving the file.
            FileIOPermission filePerm = new FileIOPermission(System.Security.Permissions.PermissionState.Unrestricted);
            filePerm.Assert();

            // Check for NULL image streams.
            if (vImgStream == null)
                return "NULL image";

            // Store the image on the file system.
            FileStream fs = new FileStream(vFilePath, FileMode.Create, FileAccess.Write);
            try
            {
                // Zero KB image detection; write the full byte array
                // (Length rather than GetUpperBound, so the last byte is not dropped).
                if (vImgStream.Length > 0)
                {
                    fs.Write(vImgStream, 0, vImgStream.Length);
                }
                else
                {
                    fs.Close();
                    return "ZERO KB Image detected";
                }
            }
            catch (Exception)
            {
                fs.Close();
                return "Unable to write image to disk";
            }
            fs.Close();

            // Check the image saved on the file system for validity.
            try
            {
                using (FileStream fsRead = new FileStream(vFilePath, FileMode.Open, FileAccess.Read, FileShare.Read))
                {
                    try
                    {
                        try
                        {
                            // Try and check if the seek is possible. If it fails, then we have problems!
                            fsRead.Position = 0;
                            fsRead.Seek(0, SeekOrigin.Begin);
                        }
                        catch { }

                        using (Image img = Image.FromStream(fsRead))
                        {
                            // If the stream can be decoded as an image, report success.
                            return "Valid Image";
                        }
                    }
                    catch (Exception)
                    {
                        return "Corrupt Image Detected";
                    }
                }
            }
            catch (Exception Ex)
            {
                return "Exception Caught: " + Ex.Message.ToString() + ":" + Ex.StackTrace.ToString();
            }
        }
    }
}

You will also need to modify the rssrvpolicy.config file so that the assembly does not fail with permission errors when it is used while executing the report. Note that the values below are just enough to get this working; you might need to make the security permissions more granular to satisfy the security requirements of your environment.

<CodeGroup class="UnionCodeGroup"
           version="2"
           PermissionSetName="FullTrust"
           Name="MyCodeGroup"
           Description="Code group for VerifyImage extension">
   <IMembershipCondition class="UrlMembershipCondition"
                         version="2"
                         Url="C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\bin\VerifyImage.dll" />
</CodeGroup>

When the Image report is saved as a PDF file, you will see the following output for a valid image. The text box uses the following expression to display the output shown in the screenshot on the left:

="Is Image Valid: " + Fields!SNO.Value.ToString() + VerifyImage.Class1.fnImageValid(Fields!Document.Value).ToString()

If the image is inconsistent, you will see one of the outputs shown in the screenshots on the right. You can extend this further by displaying the image only when fnImageValid returns "Valid Image"; if it returns anything else, you can substitute a stock image to let the user know that the image is corrupt. This way you can prevent rendering failures.

As you can see from the data retrieved from the ExecutionLogStorage table in the ReportServer database, the image verification check can at times increase the execution time of the report. Since I am using only a few images to create the report, the execution times shown below are comparable. However, if the images are larger or there are more of them, the report execution time will increase because of the additional validation processing, and the increase will not necessarily be linear.
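If you want to pull the same comparison yourself, a query along these lines against the ReportServer catalog database will do; it is a sketch that reuses the ExecutionLogStorage/Catalog join from earlier in this series and assumes the two report names used above.

SELECT c.Name AS ReportName,
       e.TimeStart,
       e.TimeDataRetrieval,
       e.TimeProcessing,
       e.TimeRendering,
       e.TimeDataRetrieval + e.TimeProcessing + e.TimeRendering AS TotalDurationMs
FROM dbo.ExecutionLogStorage AS e
INNER JOIN dbo.[Catalog] AS c
        ON e.ReportID = c.ItemID
WHERE c.Name IN (N'Image', N'Image_NoVerify')
ORDER BY e.TimeStart DESC;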

Note that invalid image streams, when used in SQL Server 2008 Reporting Services and lower releases, result in exceptions being thrown by Reporting Services and cause mini-dumps to be generated.

The DLL was placed in the following locations for the above example to work:

C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\bin
C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies

I achieved this using a post-build event which automatically copies the latest version of the DLL to the two folders mentioned above (for example, an xcopy command using the $(TargetPath) macro). This beats having to manually copy the DLL every time I build it.


References:

Custom Code and Assembly References in Expressions in Report Designer (SSRS)
http://technet.microsoft.com/en-us/library/ms159238.aspx

How to use custom assemblies or embedded code in Reporting Services
http://support.microsoft.com/kb/920769