Initial Data Collection Script

Where would you be able to use it?

The cases where you can use this script effectively are:

1. Performance issues where a quick initial look at the general state of the server is very valuable.

2. Situations where the issue is actually happening right now. That said, it is always good to collect post-mortem data as well for further diagnostics. The Performance Dashboard is a good option here, but if you want a snapshot of the server that you can store somewhere, this script can help you.

3. Situations where the problem is happening and you need to quickly verify that you are not hitting some known issues.

How is this different from existing scripts?

There are already scripts out there which do the same thing. The advantage here is that the output is in HTML format with bookmarks, which makes it easy to see at a glance what information is available and to jump straight to the sections that are of interest in the current scenario.

Usage

  1. Download the script using the link given at the bottom of the page and save it to a file named INITIAL_DIAG.SQL. Open the script file in an SSMS query window.
  2. Press CTRL+SHIFT+F so that the query results are written to a file. Note that SSMS gives no visible confirmation when this mode is toggled.
  3. Execute the script and specify INITIAL_DIAG.html as the output file name so that the output is saved in the required HTML format.

Sample Output

********************************************************************************************
INITIAL MICROSOFT PSS DATA COLLECTION
********************************************************************************************

INDEX
1. General Server Configuration
a. Server Info
b. Non-default sp_configure settings
c. Server Settings
d. Active Trace Flags
e. Profiler Trace Information
2. Memory Diagnostics
a. DBCC MEMORYSTATUS output
b. Memory Clerks rollup output
3. Procedure Cache Statistics
4. Database Diagnostics
a. Replicated Databases
b. Database File Usage Stats
c. Index Statistics
d. sp_helpdb output
e. DBCC SQLPERF (LOGSPACE)
f. Hypothetical Indexes
5. Missing Indexes report
6. Blocking Information
7. Batch/Query Statistics

########################################

****************** SERVER INFO *********
########################################

---------------------------------------------------------------
Microsoft SQL Server 2005 - 9.00.3257.00 (X64)
Jun 12 2008 16:47:07
Copyright (c) 1988-2005 Microsoft Corporation
Developer Edition (64-bit) on Windows NT 6.0 (Build 6001: Service Pack 1)

---------------------------------------------------------------
SQL SERVER NAME: SERVER1

CLUSTERED: No

MACHINE NAME: SERVER1
NET BIOS NAME: SERVER1

SQL PID: 5076
FTS Installed: YES
Security Mode: MIXED
Current SPID: 51

Click here to download the script.

Modification (June 20, 2012): Added code to account for the following:

  • Identify read-write workload ratios
  • Identify non-Microsoft DLLs loaded in the SQL Server address space using sys.dm_os_loaded_modules (see the sketch after this list)
  • Collect output for SQL Server 2012 instances
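The DLL check is along these lines (a sketch rather than the script's exact code; filtering on the company column is my assumption about how "non-Microsoft" is detected):

-- List modules loaded in the SQL Server process whose company
-- field does not report Microsoft (a common way to spot third-party DLLs)
select name, company, description, file_version
from sys.dm_os_loaded_modules
where company not like 'Microsoft%'
   or company is null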

How to find out which object belongs to which filegroup

Recently I needed to write a query that would tell me which table belonged to which File Group or Partition Scheme in a SQL Server 2005 database. I found that a system SP exists called “sp_objectfilegroup” which returns the filegroup information for one table; it takes the object id of the table you are interested in as its parameter. However, using the sys.data_spaces catalog view and tying it back to the sys.indexes output, I can find out which table belongs to which filegroup across the entire database.

select object_name(a.object_id) as ObjectName,
       IndexName = case when a.name is null then 'Heap' else a.name end,
       (select name from sys.data_spaces where data_space_id = a.data_space_id) as FileGroupName
from sys.indexes a
where index_id < 2 -- heap (0) or clustered index (1): one row per table

I’m sure many people can find alternate ways to do this 🙂 but I thought of posting one in case I need the same script again in the future.
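Since sys.data_spaces also lists partition schemes, a small variation of the same query (a sketch along the same lines) shows whether each table sits on a filegroup or a partition scheme:

select object_name(i.object_id) as ObjectName,
       isnull(i.name, 'Heap') as IndexName,
       ds.name as DataSpaceName,
       ds.type_desc as DataSpaceType -- ROWS_FILEGROUP or PARTITION_SCHEME
from sys.indexes i
join sys.data_spaces ds on ds.data_space_id = i.data_space_id
where i.index_id < 2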

How to find out the missing indexes on a SQL Server 2008 or 2005 instance along with the CREATE INDEX commands

I had previously blogged about Missing Indexes and how useful this feature is for troubleshooting query performance issues. Here is a script for identifying the missing indexes currently reported on your instance. However, you should exercise prudence before implementing the indexes reported by these DMVs: test them thoroughly, and make sure that implementing the new indexes does not break anything else.

SELECT CONVERT (decimal(28,1), migs.avg_total_user_cost * migs.avg_user_impact * (migs.user_seeks + migs.user_scans)) AS improvement_measure,
       'CREATE INDEX missing_index_' + CONVERT (varchar, mig.index_group_handle) + '_' + CONVERT (varchar, mid.index_handle)
           + ' ON ' + mid.statement
           + ' (' + ISNULL (mid.equality_columns, '')
           + CASE WHEN mid.equality_columns IS NOT NULL AND mid.inequality_columns IS NOT NULL THEN ',' ELSE '' END
           + ISNULL (mid.inequality_columns, '') + ')'
           + ISNULL (' INCLUDE (' + mid.included_columns + ')', '') AS create_index_statement,
       migs.*,
       mid.database_id,
       OBJECT_NAME(mid.[object_id], mid.database_id) AS objectname
FROM sys.dm_db_missing_index_groups mig
INNER JOIN sys.dm_db_missing_index_group_stats migs ON migs.group_handle = mig.index_group_handle
INNER JOIN sys.dm_db_missing_index_details mid ON mig.index_handle = mid.index_handle
WHERE CONVERT (decimal(28,1), migs.avg_total_user_cost * migs.avg_user_impact * (migs.user_seeks + migs.user_scans)) > 10
ORDER BY migs.avg_total_user_cost * migs.avg_user_impact * (migs.user_seeks + migs.user_scans) DESC
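Also keep in mind that the contents of these DMVs are cleared when the instance restarts, so the recommendations are only as meaningful as the uptime behind them. A quick way to check how long the instance has been up (sqlserver_start_time exists on SQL Server 2008 but not on 2005; on 2005 you can use tempdb's create date, since tempdb is recreated at every startup):

-- SQL Server 2008:
select sqlserver_start_time from sys.dm_os_sys_info

-- SQL Server 2005:
select create_date as sqlserver_start_time from sys.databases where name = 'tempdb'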

How to purge MSDB history using T-SQL scripts

One of the Maintenance Plan Tasks available is the History Cleanup Task, which can help you keep the MSDB database size in check. This Maintenance Plan Task uses the MSDB stored procedures sp_delete_backuphistory and sp_purge_jobhistory to clean up the historical information in the MSDB system tables.
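If you just want to invoke the two procedures directly for a single cutoff date, the calls look like this (the date below is a placeholder):

EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = '2009-11-04'
EXEC msdb.dbo.sp_purge_jobhistory @oldest_date = '2009-11-04'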

I have seen multiple scenarios where trying to delete all the historical data in one shot has proved fatal, because the large number of logged operations makes the log file grow disproportionately, especially when the autogrow setting is set to percentage growth. In such scenarios, you are left with two choices:

1. Use the Maintenance Plan GUI to move the oldest date for the History Cleanup Task forward in small steps, and perform the cleanup by executing the job repeatedly over these shorter intervals.

OR   

2. Use the script below to perform what the Cleanup Task would do.   

The script below generates the list of commands to be executed. If you are concerned about the size of your MSDB transaction log file during the course of this purge operation, uncomment the CHECKPOINT piece in the batch below to keep the transaction log size in check. This applies when MSDB is using the SIMPLE recovery model, which is the default (I can’t think of a reason why someone would want it to be FULL); a CHECKPOINT operation under the SIMPLE recovery model truncates the transaction log file.
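To keep an eye on the msdb transaction log while the purge runs, and to confirm whether the log file is set to the risky percentage autogrow mentioned earlier, something along these lines works:

-- Log space used per database
DBCC SQLPERF(LOGSPACE)

-- msdb log file autogrow settings
select name, size, growth, is_percent_growth
from msdb.sys.database_files
where type_desc = 'LOG'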

The script can also be downloaded from TechNet in case you face any formatting issues with the script below. Download link: http://gallery.technet.microsoft.com/ScriptCenter/en-us/b3e2e276-b14a-406d-b67a-d8cbd9e1eabe

SCRIPT   


-- Declare local variables
declare @mindate datetime, @sql varchar(8000), @currdate datetime, @oldestdate datetime

-- Find out the oldest date from the backupset table
select @mindate = min(backup_start_date) from msdb..backupset

set @currdate = @mindate + 7
set @oldestdate = '2009-11-04 00:00:00.000' -- Modify this to the date till which you want your msdb history purged

-- Begin a while loop to generate the commands to purge the MSDB entries
while (@currdate <= @oldestdate)
begin

set @sql = 'EXEC msdb..sp_delete_backuphistory ''' + cast(@currdate as varchar(20)) + ''''
print @sql

set @sql = 'EXEC msdb..sp_purge_jobhistory @oldest_date = ''' + cast(@currdate as varchar(20)) + ''''
print @sql

print char(13)

-- Optional if you are running out of space in MSDB
--print 'use msdb' + char(13) + 'checkpoint'

-- Increment the date and move on to the next iteration
set @currdate = @currdate + 7 -- The time interval can be modified to suit your needs

end
-- End of while loop

-- Purge everything up to the final cutoff date
set @sql = 'EXEC msdb..sp_delete_backuphistory ''' + cast(@oldestdate as varchar(20)) + ''''
print @sql

set @sql = 'EXEC msdb..sp_purge_jobhistory @oldest_date = ''' + cast(@oldestdate as varchar(20)) + ''''
print @sql
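With @oldestdate set as above, the final pair of commands printed should look something like this (the earlier iterations step through the weekly dates between @mindate and the cutoff):

EXEC msdb..sp_delete_backuphistory 'Nov  4 2009 12:00AM'
EXEC msdb..sp_purge_jobhistory @oldest_date = 'Nov  4 2009 12:00AM'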


How to import data from an Excel File using T-SQL

If you have an Excel file named ExcelFile.xls, you can run SELECT statements against it using the Excel sheet name and the Jet provider.

SELECT *
FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0','Data Source=C:\ExcelFile.xls;Extended Properties="Excel 8.0"')...[Data_Load$];
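If this fails with an error saying that SQL Server blocked access to the 'Ad Hoc Distributed Queries' component, that feature is disabled by default on SQL Server 2005 and later and needs to be enabled first:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE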

If you use the same Excel file over and over again, you can instead use OPENQUERY against a Linked Server defined for your Excel file.
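One way to create such a linked server (the DataLoad name matches the query below) is with sp_addlinkedserver:

EXEC sp_addlinkedserver
    @server = 'DataLoad',
    @srvproduct = 'Excel',
    @provider = 'Microsoft.Jet.OLEDB.4.0',
    @datasrc = 'C:\ExcelFile.xls',
    @provstr = 'Excel 8.0'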
Using OPENQUERY:

SELECT *
FROM OPENQUERY (DataLoad, 'select * from [Data_Load$]')

If you find that the data being imported for non-text columns is NULL, then refer to the following blog entry: