Addressing SQL Server and TDE with AKV errors

I recently wrote an Azure Data Studio notebook on how to set up TDE for SQL Server 2019 Standard Edition (yes, SQL Server 2019 Standard Edition has TDE) using Azure Key Vault. I ran into a few issues that I had to debug, which I outline below. Make sure that you follow the prerequisites when you are setting up TDE with Azure Key Vault.

The first one was a 404 error. When I looked at the application event log, I saw the following error:

Operation: getKeyByName
Key Name: ContosoRSAKey0
Message: [error:112, info:404, state:0] The server responded 404, because the key name was not found. Please make sure the key name exists in your vault.

The simple reason for the above error is that I was using an incorrect key name, or the key didn’t exist in my Azure Key Vault. The remediation is to check whether the key exists in your Azure Key Vault and, if it doesn’t, to create it.
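If you use the Azure CLI, one way to check is to query the vault directly. The vault name below is a hypothetical placeholder; the key name is the one from the error above:

```shell
# Does the key exist in the vault? (vault name is a placeholder)
az keyvault key show --vault-name ContosoKeyVault --name ContosoRSAKey0

# If not, create an RSA key with that name.
az keyvault key create --vault-name ContosoKeyVault --name ContosoRSAKey0 --kty RSA --size 2048
```

The same check can also be done from the Azure portal on the vault’s Keys blade.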

Another error I ran into was a 401 error. The following information was included with the event:

Operation: acquireToken
Key Name:
Message: [error:108, info:401, state:0] Server responded 401 for the request. Make sure the client Id and secret are correct, and the credential string is a concatenation of AAD client Id and secret without hyphens.

The CREATE CREDENTIAL command has the following syntax:

CREATE CREDENTIAL Azure_EKM_TDE_cred
    WITH IDENTITY = 'SQLStandardKeyVault',                           -- for global Azure
 -- WITH IDENTITY = 'ContosoDevKeyVault.vault.usgovcloudapi.net',    -- for Azure Government
 -- WITH IDENTITY = 'ContosoDevKeyVault.vault.azure.cn',             -- for Azure China 21Vianet
 -- WITH IDENTITY = 'ContosoDevKeyVault.vault.microsoftazure.de',    -- for Azure Germany
    SECRET = '<combination of AAD Client ID without hyphens and AAD Client Secret>'
    FOR CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM_Prov;

The IDENTITY here is the name of your Azure key vault.
The SECRET here is your AAD Client ID (with the hyphens removed) and your AAD Client Secret concatenated together. You will need to create a “New Client Secret” for your Azure AD app registration. See steps here.

Your AAD Client ID will be a GUID, and your Client Secret will be a random alphanumeric string. If you don’t have the client secret, create a new one and use that.
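To make the concatenation concrete, here is a small Python sketch; the client ID and secret below are made-up illustration values, not real credentials:

```python
def build_ekm_secret(client_id: str, client_secret: str) -> str:
    """Build the CREATE CREDENTIAL secret: the AAD client ID with its
    hyphens removed, followed immediately by the AAD client secret."""
    return client_id.replace("-", "") + client_secret

# Hypothetical values for illustration only -- use your app registration's values.
client_id = "9e5c8f36-7c85-4a4a-9b7d-7d4d34a1f2ab"
client_secret = "ExampleClientSecretValue123"

print(build_ekm_secret(client_id, client_secret))
# 9e5c8f367c854a4a9b7d7d4d34a1f2abExampleClientSecretValue123
```

The resulting string is what goes into the `SECRET = '...'` clause of `CREATE CREDENTIAL`.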

Upcoming sessions at Microsoft Ignite and PASS Summit

The first week of November is going to be an action-packed one for me, with two back-to-back conferences: Microsoft Ignite 2019 in Orlando, Florida, and PASS Summit 2019 in Seattle, Washington.

Below are the sessions that I will be delivering at Microsoft Ignite.

Mission critical performance with SQL Server 2019 – In this session, my colleague, Kevin Farlee, and I will talk about the various performance and scale improvements in SQL Server 2019 that deliver great price-performance and let you run your Tier-1 workloads with the best TCO.

Azure SQL Database Edge – Overview – In this session, my colleague, Sourabh Agarwal, and I will talk about the new innovations we are bringing to the edge for ARM64 and x64 with Azure SQL Database Edge. We will also talk about some of the scenarios where Azure SQL Database Edge helped make our customers successful in their IoT applications.

Azure Arc: Bring Azure Data Services to On-Premises, Multi-Cloud and Edge – In this session, James Rowland-Jones and I will walk you through the Azure Arc announcements and show you deployments of our data services on Azure Arc.

If you are going to Ignite and are interested in Data, then we hope to see you at our sessions.

At PASS Summit, I will deliver another session on Azure SQL Database Edge, talking more about how you can “Develop once, deploy anywhere” with our edge database offering.

Looking forward to seeing the #SQLFamily at PASS Summit and Microsoft Ignite.

SQL PASS Summit 2017

It is that time of the year when I get to meet the SQL Family. It is always wonderful to put a face to a Twitter handle that I have exchanged #sqlchats with or connected with on LinkedIn. SQL PASS Summit is probably one of the largest gatherings of data professionals under a single roof.

This year, I will be presenting a session on “Building One Million Predictions Per Second Using SQL-R”.

Date: Nov 3rd, 2017
Time: 11am
Room: Tahoma 5 (TCC Level 3)
Abstract:
Using the power of OLTP and data transformation in SQL Server 2016, plus the advanced analytics of Microsoft R Server, various industries are pushing the boundary of processing ever-higher transactions per second (tps) for different use cases. In this talk, we will walk through the use case of predicting the loan charge-off (loan default) rate, the architecture configuration that enables this use case, and a rich visual dashboard that lets customers do what-if analysis. Attend this session to find out how SQL + R allows you to build an “intelligent data warehouse”.

The Tiger team will deliver a number of sessions at PASS this year, and you will find many of the team at the SQL Server Clinic.

If you have a technical question or a troubleshooting challenge, want to give product feedback, or want to find out about best practices for running your SQL Server, then the SQL Clinic is the place to go. SQL Clinic is the hub of technical experts from the SQL Product Group (including the Tiger team), SQL CAT, SQL Customer Support Services (CSS), SQL Premier Field Engineering (PFE), and others.

SQL PASS Summit gives me a unique opportunity to meet the #SQLFamily at an annual event, gather feedback from customers, and see some old friends from across the globe!

Hope to see you at the event!

SQL Saturday 686–Upgrade your SQL Server like a Ninja

There is a plethora of Data Platform subject matter experts descending on Portland this weekend to mingle with data professionals. What’s more, a number of them will be surfacing a week later at the SQL PASS Summit as well. I am eagerly looking forward to the data-driven conversations. I have always been a big proponent of sharing knowledge with the community, and SQL Saturday is a great way to do that.

There are multiple tracks running on the same day, with a number of sessions from well-known community subject matter experts as well as from the Microsoft SQL Server Tiger team. I am really looking forward to this event! It will be an awesome prelude to PASS!

I will be presenting a session on upgrading your SQL Server and how we have helped enterprises upgrade their SQL Server instances through the automation available in recent tools that Microsoft has shipped.

Session name: Upgrade your SQL Server like a Ninja

Duration: 70 minutes

Date: 28th Oct, 2017

Time: 10:10am – 11:20am

Session Abstract

Are you thinking about upgrading your SQL Server environment to the latest and greatest that your data platform has to offer? If yes, then let us show you how you can perform risk-free and automated upgrades for SQL Server. In this session, you will see the new experiences Microsoft is building for performing Tier-1 SQL Server upgrades at scale through automated assessments, robust performance validation and using product features to minimize downtime.

Hope to see you at the event!

Managed PostgreSQL and MySQL for Azure

PostgreSQL and MySQL are ubiquitous names in the world of open source relational databases, and these database engines have been widely embraced by developers in recent years. Microsoft’s Azure cloud is a first-class platform for open source technologies, one that allows you to bring the tools you love and the skills you already have and deploy any application. We continued to build on our love for open source, and on our goal of providing developers with choices, by announcing Azure Database for PostgreSQL and MySQL at the Microsoft Build 2017 conference in Seattle earlier this year, where Scott Guthrie announced the preview of the managed database services for PostgreSQL and MySQL. We recently announced preview availability of Azure Database for PostgreSQL and MySQL in our Azure India region.

You can read our preview service launch announcements here:

·        Azure Database for PostgreSQL

·        Azure Database for MySQL

Create your own Azure Database for PostgreSQL or MySQL in our India region. One of our design principles was to provide an experience similar to what any customer using the PostgreSQL and MySQL community engines already knows. This makes it as easy as 1-2-3 for any developer to switch their PostgreSQL and MySQL databases to Azure.


What is Azure Database for PostgreSQL and MySQL?

We worked closely with our database service architects on building a database systems foundation that could scale across SQL Server, MySQL, and PostgreSQL. We built Azure Database for MySQL and PostgreSQL on the solid foundation of Azure SQL Database’s service-fabric-based architecture and worked together to extend it, giving all of the services more capabilities with regard to storage options and performance. This allows us to offer a trusted, flexible, and intelligent database hosting platform with a very elastic pricing model for our customers.

This architecture was a highlight of our Build presentation: Azure Database for PostgreSQL and MySQL is part of the overall Azure Database Services Platform, where the shared architecture yields benefits for each service that sits on top of it.

Azure Database for PostgreSQL is a managed database service that makes it easier to build apps without the management and administration burden. The service uses the community edition of PostgreSQL and works seamlessly with native tools, drivers, and libraries. At the start of the preview, we are offering support for PostgreSQL versions 9.5 and 9.6.

Azure Database for MySQL uses the same MySQL Community Edition that is available from mysql.com, which means you can use all the same tools you already use for development, management, performance profiling, and monitoring.

Are you a developer using MySQL and PostgreSQL?

We have sample applications available on GitHub that allow you to deploy our sample DayPlanner app, using Node.js or Ruby on Rails, on your own Azure subscription. DayPlanner is a sample application for managing your day-to-day engagements: it marks engagements, displays routes between them, and shows the distance and time required to reach the next engagement, letting you evaluate the capabilities of our service with a realistic app. If you have a sample application that you would like to host on our repo, or have suggestions or feedback about our sample applications, please feel free to submit a pull request and become a contributor. We love working with the community to provide ready-to-go applications for the community at large.

Developers can connect to our PostgreSQL and MySQL services using the native tools they are used to, and continue developing in Python, Node.js, Java, PHP, or any programming language of their choice. The service works seamlessly with your favorite open source frameworks, such as Django and Flask.
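As a minimal sketch of what connecting looks like from code, the following Python snippet assembles connection parameters for a hypothetical server (all names and credentials below are placeholders, and it assumes the `user@servername` login form the service uses):

```python
def pg_connection_params(server: str, database: str, user: str, password: str) -> dict:
    """Assemble connection parameters for an Azure Database for PostgreSQL server.

    The service expects the login in 'user@servername' form, where servername
    is the short name of your server. All values here are placeholders."""
    return {
        "host": f"{server}.postgres.database.azure.com",
        "dbname": database,
        "user": f"{user}@{server}",
        "password": password,
        "sslmode": "require",  # SSL is enabled by default on the service
    }

params = pg_connection_params("mydemoserver", "dayplanner", "myadmin", "<password>")
print(params["host"])  # mydemoserver.postgres.database.azure.com
print(params["user"])  # myadmin@mydemoserver
```

A dictionary like this can be passed straight to a PostgreSQL driver such as psycopg2 (`psycopg2.connect(**params)`).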


The benefit of the Azure ecosystem

We provide deep integration with other Azure offerings such as Azure Web Apps, monitoring, and security, which allows you to benefit from the larger Azure ecosystem while using our managed PostgreSQL and MySQL offerings. An example of this is Azure Database for MySQL’s integration with Azure Web Apps. Not only can you easily deploy an Azure Web App with Azure Database for MySQL as the database provider, but we’ve worked to provide complete solutions for common Content Management Systems (CMS) such as WordPress and Drupal. Deploying a website is now not only integrated and easy, with a single provisioning process, but you also get a scalable database provider to handle your website traffic.

Over the next few months, you will continue to see many more such connected experiences and integrations to help you focus on application development and let Microsoft make the integration and administration easier.

What does our managed database offering provide?

Many managed database service offerings from different cloud providers promise to reduce database management complexity, but each one follows a different approach that forces customers to make tradeoffs. We believe our approach to managed database services is unique in the industry, and one that addresses the core pain points of an app developer and much more.

No more administrative tasks

We address basic managed service capabilities such as automated patching, backup and restore, monitoring, alerting, and logging. The portal gives you a management experience for restoring the older database backups that we take automatically for you. The differentiating factor for our service is that the user does not need to separately manage storage for backups: the service provides up to 35 days of retention for automated backups, and these backups are geo-redundant as well. This means developers are free from having to monitor backup storage capacity, manage it, and configure alerts on it. You will never have to worry about managing backups again!

Built-in high availability

Azure Database for PostgreSQL and MySQL offer a built-in high availability feature out of the box. You got it right: no additional setup, configuration, or extra costs! This means that, as a developer, you do not have to set up additional VMs and configure replication to ensure high availability for your PostgreSQL database. All our databases will have a 99.99% SLA at General Availability.

Security at your service

The service has security baked in. All data, including backups, is encrypted on disk by default; no additional switch or planning is required to secure your database! Furthermore, the service has SSL enabled by default, which encrypts all data in transit. Optionally, if your client application does not support SSL connectivity, there is an option to disable it. We are committed to working broadly across the open source communities to ensure these security gaps are addressed over time.

Elasticity at will

Elasticity is one of the foundational attributes of the cloud. We understand that it is hard for any developer to figure out how much compute and storage an app needs at deployment time, since no one can predict how popular the app will become or pre-plan for its peaks and valleys. As an example, Azure Database for PostgreSQL allows you to scale compute on the fly, in a single step, without application downtime, using a simple Azure CLI command to scale database servers up or down.
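As a sketch of what that looks like, the resource group and server names below are placeholders, and this assumes the single-server `az postgres server update` syntax; check the CLI reference for the SKU names available to you:

```shell
# Scale the server's compute tier in one step (names are placeholders).
az postgres server update \
  --resource-group myresourcegroup \
  --name mydemoserver \
  --sku-name GP_Gen5_4
```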

 

Monitoring, alerting and server logs

The service provides monitoring of metrics, via integration with the Azure Monitor service, for up to 30 days; lets you define alerts on those metrics; and lets you configure a server log retention period of up to 7 days. In addition, you can customize the log verbosity parameter, so you can debug easily while developing and then tune it appropriately for production use. Furthermore, these metrics are integrated with many third-party tools. You get all of this at no additional cost!


Configurability and extensibility

Azure Database for PostgreSQL allows you to configure database server parameters, which provides the flexibility to customize the service based on your application requirements. At preview, the service supports over 18 popular PostgreSQL extensions (including PostGIS), with a roadmap to enable more based on user feedback.

The road to General Availability

In our pursuit of the fundamentals, such as reliability, security, elasticity, and worry-free database management, there are a few areas where we still have lots of opportunity to improve! We have identified several opportunities to improve performance, and this will be an area of active focus for us in the coming weeks and months. Besides performance, we will continue to increase the scaling limits and support larger compute and storage sizes. It is also our intent to create tighter integration with other Azure data and app services, to make developing rich, intelligent apps as easy as a few clicks. We will be keenly awaiting user feedback and suggestions on where we can improve and add new value.

This is really an exciting new chapter for us and represents just a beginning. We hope you will join us in this journey too! Stay tuned for more blog posts in the future.

Call to Action

We would ask that you try the service today. Please leave your feedback and questions below. You can also engage with us directly through User Voice if you have suggestions on how we can further improve the service. You can also find sample apps and ARM templates in our GitHub repo.

 

 

 

Data Platform Summit 2017

I will be speaking at the Data Platform Summit happening in Bangalore from Aug 17-19, 2017. I will deliver sessions on achieving a million predictions/sec using SQL Server and introduce our latest entrants into the Azure Database family: MySQL and PostgreSQL.

Session: CLOUD/IoT/NoSQL_C13 – Running Applications On Azure Using Postgres/MySQL
Date: 17th Aug, 2017
Time: 11:30am – 12 noon
Session Type: Chalk Talk
Room: Kilimanjaro
Abstract: A chalk talk on Azure’s managed database offering for PostgreSQL and MySQL and how we make it easy to run your existing apps using PostgreSQL and MySQL on Azure.

Session: BIA_B02 – Advanced – Building 1 Million Predictions Per Second Using SQL-R
Date: 17th Aug, 2017
Time: 1:45PM – 3PM
Session Type: Breakout Session
Room: Hydra
Abstract: Using the power of OLTP and data transformation in SQL Server 2016, plus the advanced analytics of Microsoft R Server, various industries are pushing the boundary of processing ever-higher transactions per second (tps) for different use cases. In this talk, we will walk through the use case of predicting the loan charge-off (loan default) rate, the architecture configuration that enables this use case, and a rich visual dashboard that lets customers do what-if analysis. Attend this session to find out how SQL + R allows you to build an “intelligent data warehouse”.

In this session, I will show you the actual demo where we hit 1 million transactions/sec.

Session: CLOUD/IoT/NoSQL_B40 – Basic – Azure Database for MySQL and PostgreSQL
Date: 18th Aug, 2017
Time: 11:45AM – 1PM
Session Type: Breakout Session
Room: Sphinx
Abstract: Azure Database for MySQL and Azure Database for PostgreSQL are managed database services built for developers, using the community editions of MySQL and PostgreSQL. Learn how you can leverage these new managed database services to build scalable, secure, and intelligent apps. Using insights from current customer scenarios and live demos, we walk through the service management capabilities and best practices for moving your databases to the service, and we focus on how the Microsoft Azure ecosystem of app and data services is unlocking the potential of MySQL and PostgreSQL in the Azure cloud platform.

There is a surprise element in this session as well! And if I let the cat out of the bag now, it wouldn’t remain a surprise would it? Come to the session to find out what it is.

We also have other members from the SQL Server Tiger, SQL CAT, Data Migration, and field teams delivering sessions on a variety of data-related topics. My colleague, Ajay Jagannathan, published a post on the upcoming Tiger sessions at the conference. There is a great list of accomplished speakers delivering sessions at the conference. Apart from the great lineup, it always feels wonderful to connect with customers and the community, to learn how they are using the products we are building, and to gather feedback straight from the horse’s mouth!

Follow  SQLServerGeeks and the #DPS2017 hashtag on Twitter for new and exciting updates about the conference. We hope to meet you at the conference.

SQL Saturday 613: Building 1 million predictions per second with R-services and SQL Server 2016


Last Saturday, I presented a session on how to use R Services with SQL Server to build an analytical workflow for banking solutions. I talked about how our customer, Jack Henry & Associates, an S&P 400 company that supports more than 11,300 financial institutions with core processing services, is leveraging the power of SQL Server and R to drive intelligent insights in their data warehousing software. Below you will find a link on how you can set up the complete solution, which you can deploy on our Data Science Virtual Machine on Azure.

Our Corporate Vice President, Joseph Sirosh, demonstrated this solution along with Jack Henry & Associates at Ignite. In this session, I talked about the nuts and bolts of building a scalable predictive engine with SQL Server, using the enhancements shipped in SQL Server 2016. After the session, you should be able to build your very own scalable predictive engine on SQL Server 2016!

As always, it was great to meet my friends and the community at a SQL Saturday event!

The slide deck used for my presentation can be found on SlideShare. The PowerBI dashboard and the demo scripts can be downloaded from the tigertoolbox repo on GitHub.