SQL Saturday 511: Extended Events and more

Extended Events have evolved a lot since they were first introduced in SQL Server 2008. Today, there are multiple extended events available in SQL Server which allow you to debug common scenarios without having to collect memory dumps or diagnostic data that can be detrimental to your SQL Server instance’s performance.

On April 2nd, join me in Redmond to understand how to leverage the new extended events that are available in SQL Server 2012 Service Pack 3 and above in your environments. I will talk about common scenarios where the new extended events are applicable and provide canned scripts to help collect and analyze data for complex scenarios.

A host of SQL experts, as well as folks from the Microsoft SQL Server product group, will be at the event to present and answer any questions that you have! Looking forward to meeting the SQL community at the event!

WHEN: April 2nd, 2016, 2.15 – 3.15 PM PST
SESSION: Troubleshooting made easier using Extended events

Details about the session are available here. The full schedule of the event is available here.

The slide deck that I used during the session is available below.

GOTCHA: Executing PowerShell scripts using scheduled tasks

This is another gotcha for setting up scheduled tasks which execute PowerShell scripts. I have a SQL Server instance installed on an Azure virtual machine. I am using a D-Series machine, which allows me to store my tempdb files on the D: drive, which is an SSD. However, the D: drive on Azure virtual machines is not a persistent drive, so when the Azure virtual machine restarts, the D: drive is re-created and all my folder structure is lost. If you have changed the drive letters on your Azure VM, then you can use the PowerShell script in my earlier blog post to identify the temporary drive. I already have a scheduled task created on my Azure virtual machine which re-creates the folder structure on the D: drive. The blog post in the reference section has more details on how to achieve this.
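
For context, here is a minimal sketch of what such a folder re-creation script might look like; the folder path D:\SQLTEMP and the default instance service name MSSQLSERVER are assumptions for illustration, not the exact script from the referenced post.

# Minimal sketch: re-create the tempdb folder on the temporary drive and start SQL Server
# D:\SQLTEMP and the service name MSSQLSERVER are assumed values
$tempdbPath = 'D:\SQLTEMP'
if (-not (Test-Path $tempdbPath))
{
	New-Item -ItemType Directory -Path $tempdbPath | Out-Null
}
Start-Service -Name 'MSSQLSERVER'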

However, when the scheduled task executes, the following error is reported in the Task Scheduler logs.

[Screenshot: Task Scheduler showing the last run result for the task]

The last run result is reported as 0xFFFD0000 and the task history would show the following message:

Task Scheduler successfully completed task "\Tempdb Folder Creation" , instance "{35ec7a4f-6669-437f-b12f-40b95689896c}" , action "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.EXE" with return code 4294770688.

First, let us find out why this error message gets reported. This issue occurs if you have set up the scheduled task to execute the PowerShell script using single quotes, in the following manner:

powershell -file 'C:\Automation\TempdbFolder.ps1'

If you have not used an output file, this might not be an easy thing to troubleshoot. If you execute the above command from a command prompt window, you will find the following error reported:

Processing -File ''C:\Automation\TempdbFolder.ps1'' failed: The given path's format is not supported. Specify a valid path for the -File parameter.

Changing the command to use double quotes would make it execute without any issues.
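
For example, the following form of the command executes without the error:

powershell -file "C:\Automation\TempdbFolder.ps1"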

Some other things to keep in mind when creating the scheduled task would be (a sketch of registering such a task follows the list):

1. Use a fully qualified path to the script file rather than a relative path.

2. Ensure that the account running the script has the correct privileges.

3. Configure the task to run whether or not the user is logged in.
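
Here is a minimal sketch of registering such a task with the ScheduledTasks PowerShell module (available on Windows Server 2012 and above); the task name, script path and use of the SYSTEM account are assumptions for illustration.

# Register a startup task that runs the folder-creation script under the SYSTEM account
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Automation\TempdbFolder.ps1"'
$trigger = New-ScheduledTaskTrigger -AtStartup
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName 'Tempdb Folder Creation' -Action $action -Trigger $trigger -Principal $principal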

Reference:
Using SSDs in Azure VMs to store SQL Server TempDB and Buffer Pool Extensions
http://blogs.technet.com/b/dataplatforminsider/archive/2014/09/25/using-ssds-in-azure-vms-to-store-sql-server-tempdb-and-buffer-pool-extensions.aspx

Lessons learnt while using the Cloud Adapter

During the last week of August, I had blogged about how to get your on-premise database to your SQL Server instance running on an Azure virtual machine. I had run into a few issues while trying to run the wizard provided by Management Studio.

The First Stumble

This error is easy to circumvent and is mentioned in the online documentation. The error message reads:

Failed to locate a SQL Server of version 12.0.2000 or later installed on the remote machine. Please verify that a SQL Server of the same or higher version than the source SQL Server is installed on the remote machine.

The above error is self-explanatory. There is a requirement that the source database engine version be lower than or equal to the version of the SQL Server instance running on Azure. For example, you cannot deploy a database from a SQL Server 2014 instance to a SQL Server 2012 instance running on an Azure VM.
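
If you want to confirm the versions on both sides before running the wizard, a quick check along the lines below could help; it assumes Invoke-Sqlcmd (from the sqlps module) is available and uses placeholder server names.

# Compare the product versions of the source instance and the Azure VM instance (placeholder names)
Invoke-Sqlcmd -ServerInstance 'OnPremServer' -Query "SELECT SERVERPROPERTY('ProductVersion') AS SourceVersion"
Invoke-Sqlcmd -ServerInstance 'AzureVMName' -Query "SELECT SERVERPROPERTY('ProductVersion') AS TargetVersion"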

The Second Stumble

The second common error that you might run into is:

The Cloud Adapter port configuration is not valid. Verify the virtual machine endpoint configurations.

The above error will be encountered when an endpoint is not configured for the Azure virtual machine to accept connections from the outer realm! This can be easily rectified by adding a TCP endpoint to your Azure virtual machine for port 11435, which is the port that the SQL Server Cloud Adapter service listens on. This is also mentioned in the online documentation. Once you have created the endpoint on your Azure virtual machine so that your on-premise server can connect to the Cloud Adapter service, your endpoint configuration should look like the one in the screenshot below:

[Screenshot 1: Azure virtual machine endpoint configuration showing the TCP endpoint for port 11435]
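
If you prefer to script the endpoint instead of using the portal, something along the lines below should work with the classic (service management) Azure PowerShell module; the cloud service name, VM name and endpoint name are placeholders.

# Add a TCP endpoint for the SQL Server Cloud Adapter service on port 11435
Get-AzureVM -ServiceName 'MyCloudService' -Name 'MySqlVM' |
	Add-AzureEndpoint -Name 'SqlCloudAdapter' -Protocol tcp -LocalPort 11435 -PublicPort 11435 |
	Update-AzureVM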

The Third Stumble

The next issue could be with permissions or authentication, or it might not be as straightforward as it seems.

Cloud Adapter operation failed due to invalid authentication. Verify the virtual machine name, user name, and password.

So the first thing to check is whether you have the correct account name and password. If the failure is due to an authentication error, then the application event log of the Azure virtual machine will show the following error with the source listed as the SQL Server Cloud Adapter service, as shown in Screenshot 2. The text of the error message is mentioned below.

Access denied for user <user name>

[Screenshot 2: Application event log entry from the SQL Server Cloud Adapter service showing the access denied error]

The other error that you might encounter is when the SQL Server Cloud Adapter service tries to enumerate the database engines installed on the virtual machine. The Management Studio wizard still reports this as an authentication failure, but a little investigation into the application event log of the virtual machine will show the following error:

[Error] <ip address> Exception in GetSqlInstances(): SQL Server WMI provider is not available on <machine name>.. Stack trace:
   at Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer.TryConnect()
   at Microsoft.SqlServer.Management.Smo.Wmi.WmiSmoObject.get_Proxy()
   at Microsoft.SqlServer.Management.Smo.Wmi.WmiSmoObject.EnumChildren(String childTypeName, WmiCollectionBase coll)
   at Microsoft.SqlServer.Management.Smo.Wmi.ServerInstanceCollection.InitializeChildCollection()
   at Microsoft.SqlServer.Management.CloudAdapter.Tasks.GetSqlInstances()
   at Microsoft.SqlServer.Management.CloudAdapter.CloudAdapter.GetSqlInstances(String username, String password). Inner Exception: Invalid namespace .

The above error clearly states that the GetSqlInstances() method failed. The Microsoft.SqlServer.Management.Smo.Wmi namespace contains classes that provide programmatic access to Windows Management Instrumentation (WMI) from an SMO application. I had talked about needing the Shared Management Objects in an earlier post. The SQL Server 2014 WMI provider is also required, and it is available by installing the client connectivity components from any SQL Server 2014 setup, including SQL Server Express. The components that I had installed were (a command-line sketch follows the list):

a. Client Tools Connectivity

b. Client Tools Backwards Compatibility
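
A hedged example of installing just these two features from the SQL Server 2014 setup command line is shown below; CONN and BC are the feature names for the connectivity and backwards compatibility components, and running setup from the media root is assumed.

# Quiet install of only the client connectivity and backwards compatibility components
Setup.exe /Q /ACTION=Install /FEATURES=CONN,BC /IACCEPTSQLSERVERLICENSETERMS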

If you are not sure whether you have the WMI provider, then look for the file "C:\Program Files (x86)\Microsoft SQL Server\120\Shared\sqlmgmproviderxpsp2up.mof". This is the SQL Server 2014 MOF file. Another way to test whether the WMI provider is working, without running the wizard every time and having it fail, is to run the PowerShell commands below on your Azure virtual machine. This script will tell you whether the instance enumeration being performed by the deployment wizard will work or fail.

# Load the SMO and SQL WMI management assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement")
# Connect to the local machine through the SQL Server WMI provider
$m = New-Object ('Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer') '.'
# Enumerate the installed SQL Server instances; this is where a missing WMI provider causes a failure
foreach ($svi in $m.ServerInstances)
{
	$svi.Name;
}

This post was intended to document the common issues that you might run into while deploying a database from an on-premise SQL Server instance to a SQL Server instance running on an Azure virtual machine.

Installing the SQL Server Cloud Adapter Service

The Cloud Adapter is a stateless, synchronous service that receives messages from the on-premise instance of SQL Server. This service is required when you are deploying a database from your on-premise SQL Server instance to a SQL Server deployed on an Azure Virtual Machine.

Cloud Adapter is supported with SQL Server 2012 and higher. On SQL Server 2012, the Cloud Adapter for SQL Server requires SQL Management Objects.

For your SQL Server 2012 installation, you will need the SQL Server Cloud Adapter to be installed. This is available for download from the SQL Server 2014 Feature Pack. The filename that you need to download is SqlCloudAdapter.msi.

When you try to install this on your Azure VM, you might end up with the error message below:

Service cannot be started. System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.SqlServer.SqlEnum, Version=12.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.
File name: 'Microsoft.SqlServer.SqlEnum, Version=12.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'
   at Microsoft.SqlServer.Management.CloudAdapter.Service.CloudAdapterService.ReadConfigurationParameters()
   at Microsoft.SqlServer.Management.CloudAdapter.Service.CloudAdapterService.OnStart(String[] args)
   at System.ServiceProcess.ServiceBase.ServiceQueuedMainCallback(Object state)

The above error message clearly states that the assembly version the service is looking for is from SQL Server 2014, i.e. version 12.0.0.0. You can make this assembly available by installing Microsoft® SQL Server® 2014 Shared Management Objects (SharedManagementObjects.msi) from the SQL Server 2014 Feature Pack.
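
A silent installation of the downloaded package could look like the line below; the path to the MSI is a placeholder.

# Quiet install of SQL Server 2014 Shared Management Objects
msiexec /i "C:\Downloads\SharedManagementObjects.msi" /qn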

Reference:
Cloud Adapter for SQL Server
http://msdn.microsoft.com/en-us/library/dn169301.aspx

T-SQL Tuesday #49: Why was SQL waiting?

A lot of my customer engagements involve identifying waits experienced by SQL Server queries in a production environment. Since this month’s topic is WAITS… I see this as a good opportunity to talk about how I go about analyzing the collected data to identify blocking chains.

Before I start rambling off, a note of thanks to my friend Robert Davis [B|T] for hosting this month’s T-SQL Tuesday! He couldn’t have chosen a better topic! 🙂

In this post, I will be talking about how to slice and dice the diagnostic data collected by SQL Perf Stats script that is imported into a SQL Nexus database.

The tables of interest in the SQL Nexus database are:

1. tbl_Requests – This table holds the combined output of the snapshots collected by the Perf Stats script from the sys.dm_exec_requests, sys.dm_exec_sessions, sys.dm_exec_connections and sys.dm_os_tasks DMVs.

2. tbl_NotableActiveQueries – This table holds the output of the sys.dm_exec_sql_text and sys.dm_exec_query_stats snapshots collected by the Perf Stats script.

The file that needs to be present when SQL Nexus imports the collected diagnostic data is <server name>__SQL_2008_Perf_Stats_Startup.OUT. The plethora of information available in this file can be mind-boggling at times, so pushing all this information into a database makes a lot of sense.

The first order of the day is to determine whether blocking was actually experienced during the data collection. The output of the query below will give you the collected snapshots where blocking was detected.

select runtime, count(*) as blocked_processes
from tbl_requests
where blocking_session_id > 0 and session_id <> blocking_session_id
group by runtime
having count(*) > 1
order by runtime

The next order of the day is to find out which sessions were blocked. The SQL Nexus reports do a good job of drilling down into each blocking session observed. But if you want a collated view of all the blocking along with the list of head blockers, then the following T-SQL script could prove useful, especially when the .OUT file being imported is a few hundred MB in size! It helps you slice and dice the MBs or GBs of data that were imported into the SQL Nexus database tables and get a consolidated view of the blocking chains.

/*
This Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of the Sample Code.

Author: Amit Banerjee

Modified: December 10, 2013

Description:

This T-SQL script extracts information about the blocking chains from the SQL Nexus database

*/

set nocount on
print '================== Blocking Analysis ====================='
print ''
print 'Runtime Information'
select min(runtime) as [Start Time], max(runtime) as [End Time] from tbl_Requests

if ((select count(*) from tbl_requests where blocking_session_id <> 0) > 0)
begin
print 'Blocking found'
print ''
print '============= Blocking Chains (Runtimes considered where blocked processes > 1) ==================='
-- Show a summary of the # of blocked sessions for every snapshot collected by the PerfStats script
select runtime,count (*) as blocked_processes
from tbl_requests
where blocking_session_id > 0 and session_id <> blocking_session_id
group by runtime
having count(*) > 1 -- The threshold can be changed as appropriate
order by runtime

print ''
print ''
print '================= Head Blocker Information ==================='
if (object_id('tmp_runtimes') is not null)
begin
drop table tmp_runtimes
end

-- Create a table to store the runtimes
CREATE TABLE tmp_runtimes(
[row_num] [bigint] IDENTITY(1,1) NOT NULL,
[runtime] datetime
) ON [PRIMARY]

-- Create a list of the distinct runtimes
insert into tmp_runtimes
select distinct (runtime)
from tbl_requests
where blocking_session_id <> 0
order by runtime

-- Start a while loop to fetch the data for the head blocker and blocked sessions
declare @count int,@loop int,@runtime datetime
select @count = count(*) from tmp_runtimes
set @loop = 1
while (@loop <= @count)
begin
select @runtime = runtime from tmp_runtimes where row_num = @loop

print '~~~~~~~~~~~~~~~ Runtime: '
select @runtime
print ''
print '~~~ Head Blocker'

select a.session_id,a.ecid,a.blocking_ecid,wait_type,wait_duration_ms,command,b.dbname,b.procname,b.stmt_text
from tbl_requests a
left outer join tbl_notableactivequeries b
on a.runtime = b.runtime and a.session_id = b.session_id
where blocking_session_id = 0 and a.runtime = @runtime and a.session_id in
(select blocking_session_id
from tbl_requests where blocking_session_id <> 0 and runtime = @runtime)

print ''
print '~~~ Blocked Session Information'
select *
from tbl_requests a
left outer join tbl_notableactivequeries b
on a.runtime = b.runtime and a.session_id = b.session_id
where a.blocking_session_id <> 0
and a.runtime = @runtime
order by a.session_id,a.blocking_session_id

set @loop = @loop + 1

end

drop table tmp_runtimes

end

else
begin
print 'No Blocking found'
end

set nocount off

With the help of the above output, you can get a collated view of all the blocking chains encountered during the data collection period. CSS engineers at Microsoft use SQL Nexus extensively for analyzing diagnostic data collected for SQL Server performance issues. I hope you find this script useful in analyzing blocking chains, and do keep the feedback coming in. The above script is available for download on SkyDrive as well.

Please note that the PerfStats script collects data at 10-second intervals by default! If your blocking chains last for shorter durations, then you will need to change the waitfor delay interval. If your blocking duration is less than a second, then data collection using the DMVs would not be of much help and you would need to go down the path of Extended Events tracing or other more invasive forms of data collection (like the XPerf utility).