SQL Server 2016 Public Preview (CTP2) – Live Query Statistics

I have spent a lot of time at customer locations resolving SQL Server performance issues in business-critical environments. This also involves helping customers understand how to track down performance bottlenecks and the remediation steps needed to remove them. The troubleshooting comes in two kinds: post-mortem analysis and troubleshooting the issue as and when it is happening! Yes, I am talking about live troubleshooting, which is a scary thing to do on a production server.

So if you share my deep-rooted passion for working on SQL Server performance issues, the Live Query Statistics feature in SQL Server 2016 CTP2 is definitely worth knowing more about!

The Live Query Statistics can be enabled in one of the following ways:

1. Using Management Studio

The screenshot below shows the toolbar button which enables Live Query Statistics. This can be a very powerful tool for troubleshooting bad query performance while the query is executing, because you get insight into the plan and into which part of it is taking time as the query runs.

Live Query Stats button on toolbar

You get a host of information from Live Query Statistics, as seen in the screenshot below. You will be able to pinpoint the part of the plan which is the culprit, because you have completion percentage statistics for every operator in the plan. The completed operators show you the efficient parts of the plan. You also get the time spent in each operator, which helps you identify the most time-consuming part of the plan. And what’s more, you even know how much of the query has completed. This is one of the most common questions I used to receive from customers while troubleshooting long running queries: “How long will the query take to complete?” Well now, there is an answer!

Live Query Statistics view of the executing query plan
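
If you prefer to query this progress information directly, the per-operator counters behind this view are exposed through the sys.dm_exec_query_profiles DMV (available since SQL Server 2014). The query below is only a sketch; the session id 53 is a placeholder for the session running the slow query.

  -- Sketch: per-operator progress for a running query (session 53 is a placeholder)
  SELECT node_id,
         physical_operator_name,
         SUM(row_count) AS rows_so_far,
         SUM(estimate_row_count) AS estimated_rows,
         MAX(elapsed_time_ms) AS elapsed_time_ms,
         SUM(cpu_time_ms) AS cpu_time_ms
  FROM sys.dm_exec_query_profiles
  WHERE session_id = 53
  GROUP BY node_id, physical_operator_name
  ORDER BY node_id;

Note that the DMV returns one row per operator per thread, hence the aggregation.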

2. Using Activity Monitor

A new grid has been added in Activity Monitor named “Active Expensive Queries” which allows you to right-click a query and click on the “Show Live Execution Plan” option.

Live Query Stats button in Activity Monitor

However, the “Show Live Execution Plan” option will only be enabled if the application or user:

  • Executes SET STATISTICS XML ON; or SET STATISTICS PROFILE ON; in the target session, or

  • Has enabled the query_post_execution_showplan extended event. This is a server-wide setting that enables live query statistics on all sessions (see the sketch after this list).
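
For example, either of the following will make the option available. Treat this as a sketch: the first statement is run in the session you want to watch, and the event session name LiveQueryStats in the second is just an illustrative name.

  -- Option 1: enable the showplan statistics in the session whose plan you want to watch
  SET STATISTICS XML ON;

  -- Option 2 (server-wide sketch): an Extended Events session capturing
  -- query_post_execution_showplan; remember to stop and drop it once you are done
  CREATE EVENT SESSION [LiveQueryStats] ON SERVER
  ADD EVENT sqlserver.query_post_execution_showplan;
  ALTER EVENT SESSION [LiveQueryStats] ON SERVER STATE = START;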

And if you are a developer, you can use this feature in conjunction with the Transact-SQL debugger and pinpoint slow parts of the execution plan while the query is running. A truly powerful way to write and optimize queries! The debugging experience is now enhanced, as the live query plan can be used along with breakpoints! The screenshot below shows what the debugging experience looks like.

Transact-SQL debugging session with the live query plan

Do keep in mind that this feature is provided for troubleshooting/debugging slow running query scenarios and should not be kept enabled in production environments. There is a moderate performance impact from collecting and displaying the above-mentioned statistics.

Reference:

Live Query Statistics
https://msdn.microsoft.com/en-us/library/dn831878.aspx 

Disclaimer: Some information in this topic is preview and subject to change in future releases. Preview information describes new features or changes to existing features in Microsoft SQL Server 2016 Community Technology Preview 2 (CTP2).


SQL performance gotchas for Distributed Queries

I am not sure how many of us pay attention to BOL notes under certain topics. I must admit that I have overlooked quite a few useful notes once in a while. But while working on a Linked Server permissions issue, I came across this point under the “Security for Linked Servers” topic:

To create the best query plans when you are using a table on a linked server, the query processor must have data distribution statistics from the linked server. Users that have limited permissions on any columns of the table might not have sufficient permissions to obtain all the useful statistics, and might receive a less efficient query plan and experience poor performance. If the linked server is an instance of SQL Server, to obtain all available statistics, the user must own the table or be a member of the sysadmin fixed server role, the db_owner fixed database role, or the db_ddladmin fixed database role on the linked server.

The reason for this is that the query optimizer depends on the table/index statistics to decide on the optimal plan for the query. For remote queries that perform a large number of joins or have complex sub-queries or complex filter conditions, you are bound to run into permission issues if the linked server remote login doesn’t have the necessary permissions to extract the statistics information. If you are not willing to grant the linked server login the above rights on the remote table, then it would be better either to use an alternative means to fetch the remote data, or to maintain a local copy of the data using mirroring/log shipping/replication and operate on the local data, so that your business logic doesn’t get affected as the data grows on your server.
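
If the linked server is another SQL Server instance, the two usual remedies look roughly like the sketches below: either grant the remote database user mapped to the linked server login one of the roles from the note above (db_ddladmin is the least privileged of them), or push the work to the remote side with OPENQUERY so that the local optimizer has less remote work to cost. The linked server, database, table and user names are placeholders.

  -- Sketch 1: on the remote server, give the database user mapped to the
  -- linked server login enough rights to read the statistics
  USE RemoteDB;
  EXEC sp_addrolemember N'db_ddladmin', N'LinkedServerUser';

  -- Sketch 2: let the remote server do the filtering via OPENQUERY instead of a four-part name
  SELECT *
  FROM OPENQUERY(MyLinkedServer,
       'SELECT OrderID, CustomerID, OrderDate
        FROM RemoteDB.dbo.Orders
        WHERE OrderDate >= ''20090101''') AS RemoteOrders;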

IMHO, linked server queries should never be used for implementing complex business logic! Remote queries should fetch as little data as possible. But Utopia is not always a reality!

Why can I not produce the same plan with a copy of the production database?

This question has been asked multiple times for SQL Server 2005/2008 environments. Well, the answer is that SQL Server has become a bit smarter when it comes to the Query Optimizer: it now also takes the hardware environment and database session state into account. (Sometimes the optimizer may be a bit too smart for its own good 🙂 but that is a discussion not being addressed in this post.)

To determine an optimal query plan, SQL Server 2005 and 2008 use the following information:

  1. The database metadata – The table statistics should hold the same information, i.e. the same data distribution.
  2. The hardware environment – The physical memory (RAM) and the number of processors should be identical on the two machines.
  3. The database session state (a quick way to compare points 2 and 3 is sketched after this list).
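
A quick way to compare points 2 and 3 between two servers is to look at sys.dm_os_sys_info for the hardware that SQL Server sees, and at the SET options in effect for the session. The sketch below uses the SQL Server 2005/2008 column names.

  -- Sketch: CPU and memory visible to the instance (run on both servers and compare)
  SELECT cpu_count,
         hyperthread_ratio,
         physical_memory_in_bytes   -- exposed as physical_memory_kb in later versions
  FROM sys.dm_os_sys_info;

  -- Sketch: SET options in effect for the current session (part of the session state)
  DBCC USEROPTIONS;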

Typically, you must simulate all of this information if you want to reproduce the behavior of the query optimizer on a test system. If you are lucky, you might end up with the same plan without points 2 & 3 being satisfied. In scenarios where you don’t, points 2 & 3 would be good to simulate before running off to solve the question:

Why is Server A generating a better plan than Server B?

With the advent of virtualization, simulating the physical memory and CPU configuration is not as big a deal as it used to be. I just thought I would answer this question because many times I have been asked to explain why the same database backup doesn’t produce the same plan as Server A. The first option is always to create a statistics-only copy of your database and see if you can reproduce the plan that you see on your production server. If yes, then you can safely proceed to the next step of troubleshooting the bad plan, i.e. find the most expensive part of the plan and take the necessary steps to tune it, such as adding covering indexes, creating statistics, changing the join order, adding query hints, etc.

Very often the SQL CSS team will require a statistics clone of your database so that they can reproduce the issue in-house. This does not contain any data from the tables, only a clone of the database metadata. So, in case you need to troubleshoot a performance issue where you suspect the query plan to be the culprit, you can also generate a statistics clone and use it on a test server to check whether you can reproduce the so-called “bad” plan. The reason I mention a test environment is that sometimes it is not possible to troubleshoot a query performance issue on a production server. And generating a database statistics clone for a few tables is much faster than a backup/restore of the entire database.
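
Once the statistics-only copy is restored on the test server, a quick sanity check is to compare the header and histogram of a key statistic on both databases before comparing plans. The table and statistic names below are placeholders.

  -- Sketch: compare a statistic on production vs. the statistics-only clone
  DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_OrderDate') WITH STAT_HEADER, HISTOGRAM;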

You can use the information mentioned in the article below to create a statistics clone of your database:

How to generate a script of the necessary database metadata to create a statistics-only database in SQL Server 2005 and in SQL Server 2008
http://support.microsoft.com/?kbid=914288