Statistics Update

Updating statistics is an important database maintenance task. It keeps the statistics up to date, which in turn ensures that the query optimizer does not end up with sub-optimal plans. The two ways to monitor whether statistics need an update are:
1. STATS_DATE function

2. rowmodctr value in the sysindexes (SQL Server 2000) or sys.sysindexes (SQL Server 2005) output

The STATS_DATE function gives you the date when the statistics were last updated. The rowmodctr value tells you how many updates, inserts and deletes have taken place in the column(s) the index is associated with. However, this counter is not reliable on SQL Server 2005, because SQL Server 2005 and higher versions use column modification counters to track changes to tables; based on these counters, the statistics are refreshed if the AUTO UPDATE STATISTICS property of the database is set.
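
On SQL Server 2005, for example, the last update date of every statistics object on a table can be checked with a query along these lines (dbo.Customers is a placeholder table name):

-- When were the statistics on a table last updated? (dbo.Customers is a placeholder)
SELECT name AS stats_name,
       STATS_DATE(object_id, stats_id) AS last_updated
FROM sys.stats
WHERE object_id = OBJECT_ID('dbo.Customers');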

The frequency of statistics update depends on the following:
1. Do BULK operations happen on the database tables in question?
2. Is there an off-peak period where a statistics update with full scan can be done?
3. How often is the data modified in the database tables in question?

Statistics update can be a resource intensive operation depending on the size of the table. If the data in the tables in question changes very rarely, then a statistics update with a full scan can be done during a maintenance window. A statistics update is always an online operation and does not take the database offline.

If BULK operations happen on a table, then its statistics have a tendency to get skewed after the bulk operation. The best option is to perform a statistics update on such a table once the bulk operation completes, if the table is going to be used by an application afterwards.

If the database is updated very frequently and is very large, then you need to determine which tables are modified most often. Based on this, a statistics update can be performed for only those tables, and a statistics update for the entire database can be done with a lower sampling rate, such as 20-30%, depending on what is suitable. The right sampling rate can be determined by comparing historical data and finding out what works for your needs.

Another option is to enable the AUTO UPDATE STATISTICS option for the database, but you need to monitor whether this is a boon or a bane. AUTO UPDATE STATISTICS is good when the statistics go stale relatively slowly, as it makes sure that the statistics don't drift too far out of date. The downside is that when the statistics become outdated too frequently, AUTO UPDATE STATISTICS fires every time, which causes all cached plans associated with the rows whose statistics have been updated to be recompiled. This can become a serious bottleneck if too many auto update statistics events fire in a short time span.
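
Whether the option is currently enabled can be checked (and changed) along these lines on SQL Server 2005; the database name is a placeholder:

-- Is AUTO_UPDATE_STATISTICS enabled? (MyDatabase is a placeholder name)
SELECT name, is_auto_update_stats_on
FROM sys.databases
WHERE name = 'MyDatabase';

-- Enable it if required
ALTER DATABASE MyDatabase SET AUTO_UPDATE_STATISTICS ON;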

The following methods can be used to update the statistics:
1. UPDATE STATISTICS WITH FULLSCAN (or a lower sampling rate); a syntax sketch follows this list. Please refer to the following article for further details: http://msdn2.microsoft.com/en-us/library/aa260645(SQL.80).aspx

2. sp_updatestats

3. A maintenance plan or SQL Agent job to update the statistics
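
A minimal sketch of the first two options (dbo.Orders is a placeholder table name):

-- Full scan on a single table
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Lower sampling rate, e.g. 25 percent
UPDATE STATISTICS dbo.Orders WITH SAMPLE 25 PERCENT;

-- Update statistics for all tables in the current database
EXEC sp_updatestats;
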
A script that checks rowmodctr (a sketch follows) can be helpful on SQL Server 2005 in determining how badly skewed the statistics for an index are. The rowmodctr value counts the total number of rows inserted, deleted or updated in the table since the last time statistics were updated. Note that rowmodctr is maintained directly only on SQL Server 2000; on SQL Server 2005 the value exposed through sys.sysindexes is derived from the column modification counters.
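
A sketch of such a query against sys.sysindexes (illustrative; not necessarily the script originally referenced):

-- rowmodctr vs. last statistics update per index (skips heaps and system tables)
SELECT OBJECT_NAME(si.id)           AS table_name,
       si.name                      AS index_name,
       si.rowmodctr                 AS rows_modified,
       STATS_DATE(si.id, si.indid)  AS stats_last_updated
FROM sys.sysindexes AS si
WHERE si.indid > 0
  AND si.indid < 255
  AND OBJECTPROPERTY(si.id, 'IsUserTable') = 1
ORDER BY si.rowmodctr DESC;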

A DBCC DBREINDEX would update the statistics associated with the rebuilt indexes, but it would not update the auto-created statistics. The DBCC SHOW_STATISTICS command can also help you determine the state of the statistics for a particular index: if the "Rows Sampled" and "Rows" values are equal, the last sampling was effectively 100%.

Modification: May 13, 2011:

I had earlier stated that a DBCC DBREINDEX wouldn't update the auto-created statistics, but this is incorrect. The current behavior is that auto-created statistics and manually created column statistics are not updated only when ALTER INDEX..REBUILD ALL is used. When you use DBCC DBREINDEX, however, it updates both the manually created and auto-created statistics with the default sampling rate.
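
For reference, the sample output below comes from a command of this form (the dbo.Customers table name is an assumption; pk_customers is taken from the sample):

-- Show the statistics header, density vector and histogram for an index
DBCC SHOW_STATISTICS ('dbo.Customers', 'pk_customers');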


Statistics for INDEX 'pk_customers'.

Updated              Rows  Rows Sampled  Steps  Density       Average key length
-------------------  ----  ------------  -----  ------------  ------------------
Jun 23 2007  5:03PM  91    91            91     1.0989011E-2  10.0


The following articles could be helpful:
UPDATE STATISTICS
http://msdn2.microsoft.com/en-us/library/aa260645(SQL.80).aspx
Statistics used by Query Optimizer in SQL Server 2005
http://www.microsoft.com/technet/prodtechnol/sql/2005/qrystats.mspx

Database Mirroring with SQL Authentication

It's quite fascinating that a database can be mirrored across two sites and a front-end application will not have a clue about a failover, thanks to the failover partner specified in the application's connection string.
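
As a rough sketch, an ADO.NET-style connection string with a failover partner looks like this (the server, database, login and password values are placeholders):

Data Source=PRINCIPAL_SRV;Failover Partner=MIRROR_SRV;Initial Catalog=MirroredDB;User ID=app_login;Password=Str0ng!Passw0rd;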

Database mirroring was a new feature introduced in SQL Server 2005 and has seen a sizeable amount of implementation over the years. What people often do not notice is how the application connects to the database on two completely different servers with respect to authentication. Most organizations use Windows Authentication, and after a database failover users are able to connect seamlessly. When SQL Authentication is used to connect to the mirrored database after a failover, and that login only has rights on the mirrored database, it becomes a different ballgame altogether.

SQL Server stores the information about a login using a SID value. The SID of a Windows authenticated login (domain account) is the same throughout the environment, hence a problem never arises when the database is failed over. For a SQL login the story is different. The rule of thumb is that the login used to connect to a mirrored database should exist on both instances hosting the principal and mirror databases. So what happens when you use SQL Authentication after a failover? SQL Server relies on SIDs to map a database user to a SQL login; you can find this information in the sys.database_principals and sys.syslogins catalog views. Since SQL Authentication SIDs are instance specific and not derived from Active Directory as in the case of domain accounts, the SIDs are bound to mismatch after a failover, and you will start getting "login failed" or "login timeout period expired" errors for SQL authenticated logins.

To resolve this issue, make sure that the SID of the SQL authenticated login is the same on both the principal and mirror server instances.
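
The SIDs can be compared with queries along these lines (app_login and MirroredDB are placeholders; run the first query on both instances and the second on the principal):

-- SID of the server-level SQL login
SELECT name, sid FROM sys.server_principals WHERE name = 'app_login';

-- SID stored with the database user inside the mirrored database
SELECT name, sid FROM MirroredDB.sys.database_principals WHERE name = 'app_login';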

So, if you have run into an issue of having orphaned logins after failing over your mirrored database, then you can follow the steps below to resolve the issue:

1. Use the sp_help_revlogin script to script out the SQL Login

2. Drop the login on the mirror instance and recreate it using the script obtained above (a sketch of what such a script generates follows this list)

3. You would have to fail over and re-map the SQL login to the database user under user mappings, because if you drop the SQL login associated with a database user, the database user becomes orphaned.
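
A hedged sketch of what the generated script boils down to; the login name, password and SID value below are placeholders (sp_help_revlogin produces the real values):

-- Recreate the login on the mirror instance with the same SID as on the principal
CREATE LOGIN app_login
    WITH PASSWORD = 'Str0ng!Passw0rd',
    SID = 0x91A20C5E6B3D4F47B2C8D1E0A5B6C7D8;

-- After a failover, report database users that are still orphaned
USE MirroredDB;
EXEC sp_change_users_login 'Report';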

This is documented both in the Database Mirroring whitepaper on TechNet and in Books Online, which cover resolving the orphaned users issue after a database failover.

Related Article:

http://blogs.msdn.com/sqlserverfaq/archive/2009/04/13/orphaned-users-with-database-mirroring-and-log-shipping.aspx

Troubleshooting Log Shipping Issues

Log Shipping is a feature in SQL Server by which you can ship transaction log backups to a different server and restore the backups onto a standby database for disaster recovery or reporting purposes.

One of the fundamental differences between SQL Server 2000 and SQL Server 2005 log shipping is that SQL Server 2005 uses linked servers to communicate between the primary and the monitor and between the secondary and the monitor. The log shipping jobs are executed via linked server queries to update information about the backup, copy and restore jobs when you have a remote monitor server.

So, if you find that your log shipping reports are defunct, then two additional things that you can do apart from the basic log shipping troubleshooting steps are:

1. Check if remote connections are enabled on the primary, secondary and monitor (a quick check is sketched after this list). For SQL Server 2000, check connectivity between the instances.

2. Check if a linked server using the same authentication settings as your log shipping setup can be set up between the primary and monitor, or the secondary and monitor, depending on which part of your log shipping configuration is broken. Again, this is applicable to SQL Server 2005.
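
Related quick checks on SQL Server 2005:

-- run_value = 1 means remote access is enabled
EXEC sp_configure 'remote access';

-- Linked servers defined on this instance
SELECT name, data_source FROM sys.servers WHERE is_linked = 1;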

Basic Log Shipping Troubleshooting Steps

1. Look into the SQL Server ERRORLOGs and the Application/System event logs for any errors related to log shipping.

2. Check the Backup, Copy & Restore job history details for any errors.

3. If you are using SQL Server 2005, check what details are being displayed in the Log Shipping Report under Standard Reports in Management Studio.

4. If you want to take your troubleshooting to the next level, you can even look into the log shipping metadata by querying the log shipping tables on the primary/secondary/monitor (if configured).

Addendum: April 26th, 2011

The log shipping configuration information can be found using the following methods:

1. Standard Reports – Transaction Log Shipping Status (right click Server Name in Management Studio-> Reports -> Standard Reports -> Transaction Log Shipping Status)
2. Use the Stored Procedure which is called by the above report: EXEC master..sp_help_log_shipping_monitor

Query to check log shipping job errors using the MSDB log shipping system tables


--List of Log Shipping jobs (category_id 6 is the Log Shipping category in msdb.dbo.syscategories)
SELECT * FROM [msdb].[dbo].[sysjobs] WHERE category_id = 6

--Job history entries that report operating system errors
SELECT * FROM [msdb].[dbo].[sysjobhistory]
WHERE [message] LIKE '%Operating system error%'
ORDER BY [run_date], [run_time]

--Errors recorded by the log shipping monitor
SELECT * FROM [msdb].[dbo].[log_shipping_monitor_error_detail]
WHERE [message] LIKE '%Operating system error%'

--Restore history on the secondary
SELECT * FROM [msdb].[dbo].[restorehistory]

Known issues with Log Shipping

1. You might find that the last backed up/copied/restored files do not reflect correctly in the log shipping reports when you use a remote monitor server. In such a scenario, check if the following issue documented in the blog post below is applicable in your case:
http://blogs.msdn.com/b/sqlserverfaq/archive/2009/03/27/transaction-log-shipping-status-report-for-monitor-server-will-not-pull-up-information-if-alias-is-used-for-monitor-server.aspx

The last copied and restored file will show up as NULL if the monitor instance is not on the same box as the secondary instance. The last backed up file will show up as NULL if the monitor instance is not on the same box as the primary instance and the SELECT @@SERVERNAME value was not used as the monitor server name while configuring the log shipping monitor.

2. If "remote access" (sp_configure will show whether it is enabled) is not enabled, or the log shipping linked server (to the monitor server) is not working for the primary and secondary servers, then the last backup file/last copied file/last restored file information will not get populated when a remote monitor server instance is being used. The easiest way to identify this issue is to capture a Profiler trace (on the primary instance while the backup job is running and on the secondary instance while the copy/restore jobs are running). The Profiler trace will report errors if an update operation against the log shipping monitor tables fails, provided all "Errors and Warnings" Profiler events are captured.

3. Another issue that you could run into while using Log Shipping is Orphaned Users if you have Database Users on the Primary Database mapped to SQL Authenticated Logins. This happens because the SIDs of the SQL Authenticated Users on the Primary and Secondary instance would be different. I documented the workaround to this issue in the following blog post: http://blogs.msdn.com/b/sqlserverfaq/archive/2009/04/13/orphaned-users-with-database-mirroring-and-log-shipping.aspx

4. When you are scripting out an existing log shipping configuration, ensure that Cumulative Update Package 9 for SQL Server 2005 Service Pack 2 is applied for Management Studio. If that is already done, then use one of the options mentioned in the More Information section of the KB article below:
955693 FIX: In SQL Server 2005, the file information about the transaction log that was last copied and the file information about the transaction log that was last restored are missing
http://support.microsoft.com/default.aspx?scid=kb;EN-US;955693

5. If you have configured log shipping in STANDBY mode on SQL Server 2008 and the destination folder for the transaction log backups is on a remote server (on which the SQL Server/SQL Agent service account is not a local administrator), then the restore job will fail every time with the following error:

2008-12-12 14:44:58.53 *** Error: During startup of warm standby database ‘testdb’ (database ID 7), its standby file (‘<UNC path of the TUF file>’) was inaccessible to the RESTORE statement. The operating system error was ‘5(Access is denied.)’.

TUF = Transaction Undo File, which is required for applying the next transaction log backup. This issue is fixed in the cumulative update mentioned in the KB article below:

FIX: Error message when you use log shipping in SQL Server 2008: “During startup of warm standby database ‘<Database Name>’ (database ID <N>), its standby file (‘<File Name>’) was inaccessible to the RESTORE statement”
http://support.microsoft.com/kb/962008

6. Log shipping restore will fail if there is a snapshot or an active DBCC replica on the secondary database on which the restore is being done.
https://troubleshootingsql.com/2012/09/12/why-did-the-restore-fail-on-the-log-shipped-secondary-database/

Addition: September 12, 2012

Special cases
If you need to speed up the transaction log restore for your log shipping secondary database in standby mode, follow the steps mentioned in this post.

If you need to move your secondary log shipped database files to a new physical location, you can use the steps mentioned in this post.

How to connect to SQL Server 2005 Embedded Edition

From SQL Server Management Studio, you cannot connect to a SQL Server Embedded Edition instance by just providing the server name. You need to enter the following in the Server name field:

\\.\pipe\MSSQL$MICROSOFT##SSEE\sql\query
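
The same pipe name should also work from sqlcmd, for example (using Windows Authentication):

sqlcmd -S np:\\.\pipe\MSSQL$MICROSOFT##SSEE\sql\query -E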

The files for an embedded edition instance should be at:
C:\WINDOWS\SYSMSI\SSEE\MSSQL.2005\MSSQL\

I have seen multiple people baffled by this question. Hence, I thought I would put up a post about it.

Using WMI to Manage SQL Server 2000 Services

There are quite a few blog posts out there on managing SQL Server 2005 services using the \\.\root\Microsoft\SqlServer\ComputerManagement namespace. However, the WMI Admin Provider is not pre-installed for SQL Server 2000; it needs to be installed separately using the WMI Admin Provider setup available on the SQL Server 2000 setup CD under the x86\other\wmi folder.

Sample script to change SQL Server 2005 service startup account and password using WMI:
http://blogs.msdn.com/mwories/archive/2006/11/03/wmi_5F00_change_5F00_password.aspx
MSDN Documentation on Win32_Service class
http://msdn.microsoft.com/en-us/library/aa394418.aspx 

Sample Script to change a SQL Server 2000 instance startup account using root\MicrosoftSQLServer namespace: 

strComputer = "."

' Connect to the SQL Server 2000 WMI Admin Provider namespace
Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\MicrosoftSQLServer")

' Obtain an instance of the class using a key property value
' (MSSQL$SQL2000 is the service name of the named instance; change it to match yours)
Set objShare = objWMIService.Get("Win32_Service.Name='MSSQL$SQL2000'")

' Obtain an InParameters object specific to the Change method
Set objInParam = objShare.Methods_("Change").InParameters.SpawnInstance_()

' Add the input parameters: the new startup account and its password
objInParam.Properties_.Item("StartName") = "LocalSystem"
objInParam.Properties_.Item("StartPassword") = ""

' Execute the method and obtain the return status.
' The OutParameters object in objOutParams is created by the provider.
Set objOutParams = objWMIService.ExecMethod("Win32_Service.Name='MSSQL$SQL2000'", "Change", objInParam)

' List the out parameters
Wscript.Echo "Out Parameters:"
Wscript.Echo "ReturnValue: " & objOutParams.ReturnValue