Azure Storage and SQL Server – Part 2

In the first post of this series, I talked about the cheat sheet that documents the storage best practices for SQL Server workloads running on Azure Virtual Machines. In this post, I am going to show how to automate additional checks for your virtual machine and your storage configuration. Specifically, I will show how to determine whether you have more than 40 disks in a storage account, because beyond that point the disks cannot all sustain their maximum throughput within the 20,000 IOPS limit that applies to an Azure storage account.

The first part of the PowerShell script finds the number of page blobs in the storage account using the Get-AzureStorageContainer cmdlet, which fetches all the containers in the storage account specified as a parameter at the beginning of the script. The storage account context is created using the New-AzureStorageContext cmdlet. This context is required by all the subsequent commands to ensure that each cmdlet executes against the correct storage account, which matters especially if you have more than one storage account associated with your subscription.

The next part of the script uses the Get-AzureStorageBlob cmdlet to determine the number of blobs in a container, iterating over each container with a foreach loop. I use local variables so that the filtering for both types of blobs happens on data already retrieved, which minimizes the number of network calls to the Azure Storage service.

The last if-else block determines whether you have more than 40 disks in the storage account. If you do, the script flags an issue, because if all the disks are active they will not be able to reach their individual potential of 500 IOPS: the 20,000 IOPS per storage account limit is shared across all the disks in the account. For example, 50 highly active disks would average only 400 IOPS each (20,000 ÷ 50).
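One prerequisite before running the script: if you have more than one Azure subscription associated with your account, make sure the classic Azure module is signed in and pointed at the subscription that owns the storage account. A minimal sketch (the subscription name below is a placeholder):

# Sign in and select the subscription that owns the storage account
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "<subscription name>"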

The PowerShell script is shown below.

# Assign the storage account name
$StorageAccount = "<storage account name>"

# Get the storage account key as it is needed for creating the storage context
$StorageKey = Get-AzureStorageKey $StorageAccount | %{ $_.Primary }

# Create a new storage context for use in the next sections of the script
$StorageContext = New-AzureStorageContext -StorageAccountName $StorageAccount -StorageAccountKey $StorageKey

# Find all the containers in the storage account
$Containers = Get-AzureStorageContainer -Context $StorageContext

# Local variables for tracking the page and block blobs
$PageBlob = 0
$BlockBlob = 0

# Get the blobs in each container and count each type of blob present for the summary output
foreach ($name in $Containers)
{
    $temp = Get-AzureStorageBlob -Container $name.Name -Context $StorageContext
    $Blob = $temp | Where-Object {$_.BlobType -eq "PageBlob"}
    $PageBlob += $Blob.Count
    $Blob = $temp | Where-Object {$_.BlobType -eq "BlockBlob"}
    $BlockBlob += $Blob.Count
}

Write-Host "INFO: There are" $Containers.Count "containers in the storage account:" $StorageAccount "which have" $PageBlob "Page Blob(s) and" $BlockBlob "Block Blob(s)" -ForegroundColor Green

# Determine if the 20K IOPS limit will be crossed due to having 40+ disks
if ($PageBlob -le 40)
{
    Write-Host "INFO: This storage account has" $PageBlob "disk(s) which can support a total of" ($PageBlob*500) "IOPS" -ForegroundColor Green
}
else
{
    Write-Host "ISSUE: This storage account has" $PageBlob "disk(s) which can ONLY support a total of 20,000 IOPS" -ForegroundColor Red
}


Previous post in the series

Azure Storage and SQL Server – Part 1
https://troubleshootingsql.com/2014/11/10/azure-storage-for-sql-server/

 

References

Azure Storage
http://azure.microsoft.com/en-us/services/storage/

Azure Subscription and Service Limits, Quotas, and Constraints
http://azure.microsoft.com/en-us/documentation/articles/azure-subscription-service-limits/#storagelimits

* This blog post has been written based on the service details available on 9th November, 2014.


Azure Storage for SQL Server

To deploy a SQL Server instance on an Azure Virtual Machine, you will need the Azure Storage service for your storage requirements. There are a few nuances of Azure storage which require prior knowledge so that your SQL Server instance runs without any glitches.

Before I start babbling about IOPS, data disks and regions, let's first get a quick glimpse at what Azure storage offers: blobs, tables and queues. The data disks that your database files use are stored as page blobs in Azure. One or more data disks are grouped into a single Azure Storage container, and it is recommended that you store all of the data disks for a particular virtual machine in a single container. Each storage account has a limit of 500 TB!
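To illustrate the disk-to-container relationship, here is a hedged sketch (service, VM and storage account names are placeholders) that attaches a new empty data disk to a virtual machine; the -MediaLocation parameter decides which container holds the underlying page blob:

# Attach a new 100 GB empty data disk; its VHD is created as a page blob in the vhds container of the specified storage account
Get-AzureVM -ServiceName "<service name>" -Name "<VM name>" |
    Add-AzureDataDisk -CreateNew -DiskSizeInGB 100 -DiskLabel "DataDisk1" -LUN 0 -MediaLocation "https://<storage account name>.blob.core.windows.net/vhds/DataDisk1.vhd" |
    Update-AzureVM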

 

Here is the cheat sheet that we shared during our session at Microsoft India TechEd 2014:

• A maximum of 20,000 IOPS per storage account – Each storage account has a limit of 20,000 IOPS. It is recommended that mission- or business-critical virtual machines, or virtual machines with high storage throughput requirements, have their data disks residing in a single storage account. Additionally, this storage account should not be shared with another virtual machine or service. For the Standard tier, do not place more than 40 highly used VHDs in a storage account.

• A maximum of 500 IOPS for each data disk – Each persistent data disk supports a maximum of 500 IOPS for the Standard tier, so plan your disk layout with this per-disk limit in mind (see the Storage Spaces recommendation below when you need more throughput than a single disk can provide).

• 64-KB allocation unit size – Format the volumes hosting the data and log files, as well as tempdb, with a 64-KB allocation unit size (a quick check is sketched after this list).

• Use Storage Spaces on Windows Server 2012 and above – This allows you to group your data disks and achieve throughput greater than 500 IOPS for the storage pool. For Windows Server 2008 R2 or earlier, you can use dynamic disks (OS striped volumes); the stripe size is always 64 KB.

• Do not store data on the temporary drive – The temporary drive should only be used for tempdb and the buffer pool extension on SSD-backed drives (D-series VMs).

• Separate data disks for data and log files – It is very important to determine your storage throughput requirements. As a best practice, having separate storage pools or data disks for your data and log files goes a long way in ensuring that you maintain optimal performance.

• Caching policy = NONE – This needs to be set for all the data disks being used by the SQL Server instance.

• Backup to blob storage – When performing backups for SQL Server running in Azure virtual machines, you can use SQL Server Backup to URL. This feature is available starting with SQL Server 2012 SP1 CU2 and is recommended over backing up to the attached data disks. For versions prior to SQL Server 2012, you can use the SQL Server Backup to Azure Tool, which can also help increase backup throughput by using multiple backup stripe targets.

• Disable geo-replication on the storage account – It is advisable to use a locally redundant storage account for your SQL Server instance. The other benefit of locally redundant storage is that the ingress and egress limits (data moving in and out of the storage account) are higher.
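As promised in the 64-KB bullet above, here is a quick sketch to verify the allocation unit size of the existing volumes from inside the virtual machine. This is plain WMI, no Azure cmdlets involved; volumes hosting SQL Server files should report 64:

# List local volumes with their allocation unit (cluster) size in KB
Get-WmiObject -Class Win32_Volume -Filter "DriveType = 3" |
    Select-Object Name, Label, @{Name = "AllocationUnitKB"; Expression = { $_.BlockSize / 1KB }}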

In the next part of this blog post, I am going to use PowerShell magic to determine if the SQL Server instance that you have configured is following some of the best practices mentioned above.

Checking for geo-replication

The following PowerShell script uses the Get-AzureStorageAccount cmdlet to get details of the storage account which hosts the virtual disks for the virtual machine. The script is shown below.

$StorageAccount = Get-AzureStorageAccount -StorageAccountName "<storage account name>"
if ($StorageAccount.GeoReplicationEnabled -eq $false)
{
    Write-Host "PASSED: Geo-replication is DISABLED" -ForegroundColor Green
}
else
{
    Write-Host "ISSUE: Geo-replication is ENABLED!" -ForegroundColor Red
}
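If the check above flags an issue, the account can be switched to locally redundant storage from the same module. A minimal sketch, assuming the classic module version in use still exposes the -GeoReplicationEnabled parameter (later versions use the -Type parameter instead):

# Turn off geo-replication so that the storage account becomes locally redundant
Set-AzureStorageAccount -StorageAccountName "<storage account name>" -GeoReplicationEnabled $false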


Checking the caching policy

The PowerShell script below determines whether the caching policy is enabled for the data disks using the Get-AzureVM cmdlet. This cmdlet fetches information about the virtual machine, which is then used with the Get-AzureDataDisk cmdlet to get the list of data disks.

$DataDisks = Get-AzureVM -ServiceName "<service name>" -Name "<VM name>" | Get-AzureDataDisk
foreach ($disk in $DataDisks)
{
    if ($disk.HostCaching -eq "None")
    {
        Write-Host "PASSED: Disk caching is DISABLED for data disk: " $disk.DiskName "(" $disk.MediaLink ")" -ForegroundColor Green
    }
    else
    {
        Write-Host "ISSUE: Disk caching is ENABLED for data disk " $disk.DiskName "(" $disk.MediaLink ")" -ForegroundColor Red
    }
}
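If a data disk gets flagged, the caching policy can be changed without detaching the disk. A hedged sketch using the same classic cmdlets (LUN 0 below is a placeholder for the offending disk):

# Set host caching to None on the data disk at LUN 0 and push the change to the virtual machine
Get-AzureVM -ServiceName "<service name>" -Name "<VM name>" |
    Set-AzureDataDisk -LUN 0 -HostCaching None |
    Update-AzureVM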

 

I will upload a few more PowerShell scripts for automating best-practice checks for Azure VMs running SQL Server workloads.

References

Azure Storage
http://azure.microsoft.com/en-us/services/storage/

Azure Subscription and Service Limits, Quotas, and Constraints
http://azure.microsoft.com/en-us/documentation/articles/azure-subscription-service-limits/#storagelimits

* This blog post has been written based on the service details available on 9th November, 2014.

A little bit of POSH for document conversion

This is a completely non-SQL Server post. I had to recently convert a large number of word documents into Web Archive (.MHT) format. I would not mind doing that for one or two documents but when I have over 50 documents to perform this exercise on, it could get cumbersome and monotonous! This is where PowerShell came to the rescue.

The most common example available on the web is to convert Word documents to PDF. What I needed for the work that I was doing was a way to convert the Word documents to the Web Archive format. After a few Bing searches, I was able to determine that the Web Archive format enumeration number was 9. The following link has information about all the enumeration values: http://msdn.microsoft.com/en-us/library/office/bb238158(v=office.12).aspx

The PowerShell script below traverses a folder, picks up all the Word documents in the folders recursively, and then converts each of those Word documents into a .MHT file with the same name in the same location.

The script can be downloaded from OneDrive also.


<#

#################################################################################
    
Script Name: ConvertToMHT
    Author: Amit Banerjee
    Date: April 28, 2014

Description:
    This script takes a folder as an input and converts the docx files present in the folder to Web Archive (.MHT) documents
#################################################################################

This Sample Code is provided for the purpose of illustration only and is not 
intended to be used in a production environment. THIS SAMPLE CODE AND ANY 
RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER 
EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF 
MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a 
nonexclusive, royalty-free right to use and modify the Sample Code and to 
reproduce and distribute the object code form of the Sample Code, provided that 
You agree: (i) to not use Our name, logo, or trademarks to market Your software 
product in which the Sample Code is embedded; (ii) to include a valid copyright 
notice on Your software product in which the Sample Code is embedded; and (iii) 
to indemnify, hold harmless, and defend Us and Our suppliers from and against 
any claims or lawsuits, including attorneys fees, that arise or result from the 
use or distribution of the Sample Code.

    The enumeration for the various document types that you can save as from Microsoft Word when you use the SaveAs option
    Const wdFormatDocument                    =  0
    Const wdFormatDocument97                  =  0
    Const wdFormatDocumentDefault             = 16
    Const wdFormatDOSText                     =  4
    Const wdFormatDOSTextLineBreaks           =  5
    Const wdFormatEncodedText                 =  7
    Const wdFormatFilteredHTML                = 10
    Const wdFormatFlatXML                     = 19
    Const wdFormatFlatXMLMacroEnabled         = 20
    Const wdFormatFlatXMLTemplate             = 21
    Const wdFormatFlatXMLTemplateMacroEnabled = 22
    Const wdFormatHTML                        =  8
    Const wdFormatPDF                         = 17
    Const wdFormatRTF                         =  6
    Const wdFormatTemplate                    =  1
    Const wdFormatTemplate97                  =  1
    Const wdFormatText                        =  2
    Const wdFormatTextLineBreaks              =  3
    Const wdFormatUnicodeText                 =  7
    Const wdFormatWebArchive                  =  9
    Const wdFormatXML                         = 11
    Const wdFormatXMLDocument                 = 12
    Const wdFormatXMLDocumentMacroEnabled     = 13
    Const wdFormatXMLTemplate                 = 14
    Const wdFormatXMLTemplateMacroEnabled     = 15
    Const wdFormatXPS                         = 18

#>

# Function to convert a single Word document to a Web Archive (.MHT) file
function ConvertToMHT ($FileName)
{
    # Create a Word application object
    $Word = New-Object -ComObject Word.Application
    # Open the Word document
    $Doc = $Word.Documents.Open($FileName)

    # Save the document in the required format in the same location
    # Replace 9 with the appropriate document format type using the enumeration provided above in the comments
    $Doc.SaveAs([ref] (($FileName).replace("docx","mht")), [ref]9)

    # Quit Word after closing the document
    $Doc.Close()
    $Word.Quit()
}

# Replace with the correct folder path
# Remove the -Recurse option if you only want to convert the documents in the first level folders
# Retrieve the list of documents
$Files = Get-ChildItem "C:\Windows\*.docx" -Recurse

foreach ($File in $Files)
{
    # Create the name of the new document
    $Name = $File.FullName.replace("docx","mht")

    if (Test-Path $Name)
    {
        # If the file already exists, do not convert it again
        Write-Host "Skipping conversion for " $Name
    }
    else
    {
        # Save the file as a web archive if it does not exist
        Write-Host "Creating file " $Name
        ConvertToMHT $File.FullName
    }
}

Reference:
http://blogs.technet.com/b/heyscriptingguy/archive/2013/03/24/weekend-scripter-convert-word-documents-to-pdf-files-with-powershell.aspx

Powershell for the perfmon files

Recently, I was working on a requirement to convert a number of BLG files into CSV and then change the CSV files into Excel files.

So, the script below does the following:

1. Picks the BLG files from a folder and then creates a CSV file from the BLG file using RELOG.EXE

2. Then the same CSV file is converted to an XLS file

As always, this is a PowerShell script which can be adapted in any way possible. Happy perfmon analysis!

The script can be downloaded from OneDrive as well.


#################################################################################
#    Script Name: Relog                                                         #
#    Author: Amit Banerjee                                                      #
#    Date: May 10, 2014                                                         #
#    Description:                                                               #
#    The script uses relog to create CSV files from BLG files                   #
#    It then converts the CSV files to XLS files                                #
#################################################################################

# This Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys fees, that arise or result from the use or distribution of the Sample Code.

# Enumerates BLG files on disk and starts running relog on them
$path = "C:\PerfmonFiles\" # Replace with correct path
$files = Get-ChildItem $path -Filter *.blg
foreach ($file in $files)
{
    # Create the CSV filename
    $filename = $file.FullName.Split(".")[0] + ".csv"
    # Run relog with the correct arguments
    $AllArgs =  @($file.FullName,  '-f', 'csv', '-o',  $filename)
    & 'relog.exe' $AllArgs
}

# Enumerates CSV files on disk and starts converting them to Excel
$files = Get-ChildItem $path -Filter *.csv 
foreach ($file in $files)
{
    # Launch Excel
    $xls = new-object -comobject excel.application
    $xls.visible = $true
    # Open the CSV file
    $Workbook = $xls.workbooks.open($file.FullName)
    $Worksheets = $Workbook.Worksheets
    $filename = $file.FullName.Split(".")[0] + ".xls"
    # Save the file as Excel
    # Depending on your compatibility settings you might have to accept a prompt to save the file
    $Workbook.SaveAs($filename,1)
    $Workbook.Saved = $True
    # Close Excel
    $xls.Quit()
    # Delete the CSV file
    Remove-Item $file.FullName
}

T-SQL Tuesday #50: Database Role Membership

First of all, a Happy New Year to everyone!

This month’s T-SQL Tuesday topic on Automation provides a perfect opportunity for me to blog about an automation that can be achieved around database roles and their members.

A big thank you to Hemanth [T | B] for hosting this year’s first T-SQL Tuesday and to Adam Machanic [B|T] as his initiative just reached a milestone! 50 months filled with awesome SQL related blog posts on a Tuesday!

The ask was to create a database role matrix for each and every user database on the SQL Server instance. Now the catch here was to find out memberships in custom database roles as well.

So here is how to go about this.

The first task is to identify each and every fixed database role that a particular database user is part of. This is a fairly simple task, as there are system catalogs to fetch this information.
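For reference, here is a minimal sketch of one way to pull this from the catalog views, wrapped in Invoke-Sqlcmd to stay consistent with the rest of this post (instance and database names are placeholders):

# List fixed database role memberships for a single database using the catalog views
$vFixedRoleQuery = "SELECT rol.name AS RoleName, mem.name AS MemberName
FROM sys.database_role_members drm
INNER JOIN sys.database_principals rol ON rol.principal_id = drm.role_principal_id
INNER JOIN sys.database_principals mem ON mem.principal_id = drm.member_principal_id
WHERE rol.is_fixed_role = 1"
Invoke-Sqlcmd -ServerInstance "<SQL Server instance name>" -Database "<database name>" -Query $vFixedRoleQuery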

The next task is to find out each and every custom database role that exists in each user database. This can be achieved using the T-SQL script below.

/*
# This Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of the Sample Code.
# Author: Amit Banerjee
# Purpose: T-SQL script to fetch all the custom roles in a database
*/
DECLARE @ctr int = 1, @maxctr int = 1, @dbname sysname, @rsql nvarchar(4000)

CREATE TABLE #tblCustomRoles (rolename varchar(255))

SELECT identity(int,1,1) as databaseid, name
INTO #dbs
FROM sys.databases
WHERE database_id NOT IN (1,2,3,4)
AND state_desc = 'ONLINE'

SELECT @maxctr = COUNT(*) FROM #dbs
WHILE @ctr <= @maxctr
BEGIN
SELECT @dbname = name FROM #dbs WHERE databaseid = @ctr
SET @rsql = 'SELECT name FROM ' + QUOTENAME(@dbname) +'.sys.database_principals where type_desc = ''DATABASE_ROLE'' and is_fixed_role = 0 and name <> ''public'''
INSERT INTO #tblCustomRoles
EXEC sp_executesql @rsql
SET @ctr += 1
END

SELECT DISTINCT ', CASE IS_MEMBER ('''+rolename+''') WHEN 1 THEN ''YES'' ELSE ''NO'' END as ' + quotename(rolename) as sqlstring
FROM #tblCustomRoles

DROP TABLE #tblCustomRoles
DROP TABLE #dbs


Now that you have the list of custom database roles, you need to create a T-SQL query combining the custom database role list and the fixed database role list. This query, when executed against each user database, will give you the database role matrix that you are looking for.

And this is where an automation scripter’s best friend, PowerShell, steps in. The script below:

1. Fetches the custom database roles from each database

2. Creates the T-SQL query for fetching the database role members

3. Executes the query against each database except for master, msdb, model and tempdb

4. Appends the output from each database into a CSV file


# This Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of the Sample Code.
# Author: Amit Banerjee
# Purpose: Powershell script to fetch role membership for each database user in all databases in a single SQL Server instance
# Usage: .\GetPerms.ps1 -vFileName "<Output file name with fully qualified path>" -vServer "<SQL Server Instance Name>" -vScriptFile "<Location of GetCustomRoles.sql>"

Param ([string] $vFileName, [string] $vServer, [string] $vScriptFile)

# Declaring variables
$VCustomRoles = ""

# Fetch the names of the databases
$vDatabases = Invoke-Sqlcmd -ServerInstance $vServer -Database "master" -Query "select name from sys.databases where database_id not in (1,2,3,4) and state_desc = 'ONLINE'"

# Find non-system database roles
$vRoles = Invoke-Sqlcmd -ServerInstance $vServer -Database "master" -InputFile $vScriptFile

# Construct the SQL query to be executed
$vQuery = "SELECT
db_name() as DatabaseName
, prin.name as DBUserName
, srvprin.name as LoginName
, srvprin.create_date
, CASE IS_MEMBER ('db_owner') WHEN 1 THEN 'YES' ELSE 'NO' END as [db_owner]
, CASE IS_MEMBER ('db_accessadmin') WHEN 1 THEN 'YES' ELSE 'NO' END as [db_accessadmin]
, CASE IS_MEMBER ('db_securityadmin')  WHEN 1 THEN 'YES' ELSE 'NO' END as [db_securityadmin]
, CASE IS_MEMBER ('db_ddladmin')  WHEN 1 THEN 'YES' ELSE 'NO' END as [db_ddladmin]
, CASE IS_MEMBER ('db_backupoperator')  WHEN 1 THEN 'YES' ELSE 'NO' END as [db_backupoperator]
, CASE IS_MEMBER ('db_datareader')  WHEN 1 THEN 'YES' ELSE 'NO' END as [db_datareader]
, CASE IS_MEMBER ('db_datawriter')  WHEN 1 THEN 'YES' ELSE 'NO' END as [db_datawriter]
, CASE IS_MEMBER ('db_denydatareader')  WHEN 1 THEN 'YES' ELSE 'NO' END as [db_denydatareader]
, CASE IS_MEMBER ('db_denydatawriter')  WHEN 1 THEN 'YES' ELSE 'NO' END as [db_denydatawriter]
, CASE IS_MEMBER ('public')  WHEN 1 THEN 'YES' ELSE 'NO' END as [public]
"

# Foreach loop to identify each custom role in all the databases on the instance
foreach ($vRole in $vRoles)
{
    $vQuery = $vQuery + $vRole.sqlstring + "`n"
}

# Complete the T-SQL query to be executed
$vQuery = $vQuery + "FROM sys.database_role_members mem
INNER JOIN sys.database_principals prin on mem.member_principal_id = prin.principal_id
INNER JOIN sys.database_principals rol on rol.principal_id = mem.role_principal_id
LEFT OUTER JOIN sys.server_principals srvprin on srvprin.sid = prin.sid
WHERE prin.is_fixed_role = 0 and srvprin.type in ('U','S')"

# Foreach loop to fetch the role membership of each user and append to a CSV file
foreach ($db in $vDatabases)
{
    "Fetching database role matrix for database: " + $db.name
    # NoTypeInformation prevents the #TYPE System.Data.DataRow header from being the first row in the CSV file
    # Use Invoke-Sqlcmd to get the database role matrix for each database and append it to the output CSV file
    Invoke-Sqlcmd -ServerInstance $vServer -Database $db.name -Query $vQuery | Export-Csv -Path $vFilename -Append -NoClobber -Encoding UTF8 -ErrorAction Stop -NoTypeInformation
}

The CSV output file shows a neat matrix as seen in the screenshot below:


Example command:

.\GetPerms.ps1 -vFilename G:\Perms.CSV -vServer MYSERVER -vScriptFile "G:\GetCustomRoles.sql"

Happy Automation!