Troubleshooting SQL Server Hang using system_health XEvents

Recently a customer reported they were seeing intermittent server hangs, connection timeouts and slow performance on their SQL Server instance. The first question I had was: do we have PSSDIAG running on the SQL instance so that I can understand the cause of the hang? However, since we don't run PSSDIAG proactively on any of the servers, we didn't have any logs to look at. So we started looking at the system_health XEvents session data from the time of the slow performance.

If you are not familiar with the default system_health XEvents session, I would suggest you read the blog post from Bob Dorr here.

Reading from the system_health XEvents session is not the most convenient way of troubleshooting, so I used a solution developed by my fellow PFE colleague Denzil, which he has blogged about here.

We observed that the issue occurs intermittently: the application hangs and doesn't allow any further connections to the SQL Server.

Resource Utilization

We do not see CPU or memory pressure on the SQL Server instance. The system CPU utilization averages around 40%, with some spikes driving it to 100%; however, during these times the SQL CPU utilization stays consistent at 30-35% on average.

The available memory on the server is unchanged and consistent, and no low-memory alerts were observed on the SQL instance.











Query Processing & THREADPOOL Waits

We see very high waits on the THREADPOOL wait type, which suggests we are running out of worker threads, which in turn explains the slowness and hangs observed on the SQL instance. From the following Query Processing logs, we see a sudden surge in worker threads from 198 to 806 in the 10-minute interval from 7:10 AM to 7:20 AM (server time would be 2:10 AM to 2:20 AM).
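As a quick live check (separate from the system_health capture we were analyzing), the cumulative THREADPOOL waits can be read from the standard sys.dm_os_wait_stats DMV; this is a generic query, not part of the original data collection:

```sql
-- Cumulative THREADPOOL waits since the last instance restart
-- (or since the wait stats were last cleared).
SELECT wait_type,
       waiting_tasks_count,
       wait_time_ms,
       max_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type = N'THREADPOOL';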






When we check the sys.dm_exec_requests data captured during the same time, we see that the number of concurrent requests to the SQL instance increased from 80 to 500 in the same interval, which explains the surge in worker threads created to serve the 500 concurrent requests.

However, we need to understand here that the THREADPOOL wait type is not the cause but a victim, since the assigned worker threads are not completing their tasks quickly enough. The root cause of the issue in this case was blocking observed on the SQL instance, which prevented the worker threads from completing their tasks quickly and being released back to the worker pool.
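On a live system, the blocking chain (and its head) can be sketched with a standard query over sys.dm_exec_requests; this is a generic example of the technique, not the exact query we ran:

```sql
-- List blocked requests together with the session blocking them.
-- A session that blocks others but is itself not blocked is a head blocker.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       r.command,
       t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;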


In this case, we found that the head blocker was waiting on a non-IO buffer latch PAGELATCH_XX (PAGELATCH_EX, PAGELATCH_UP, PAGELATCH_SH) caused by sequential inserts into a large table (last-page contention). We suggested partitioning the table and building the indexes on an alternate column to avoid last-page contention (further information on resolving latch contention can be found in this white paper).
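As an illustration of the kind of change we suggested (the table and column names here are hypothetical, not the customer's schema), clustering on a non-sequential column spreads inserts across the B-tree instead of funneling them all onto the last page:

```sql
-- Hypothetical example: OrderId is ever-increasing (IDENTITY), so
-- clustering on it sends every insert to the last page of the index.
-- Clustering on a non-sequential column instead (here CustomerId,
-- a random GUID) spreads inserts out and relieves last-page contention.
CREATE TABLE dbo.OrdersDemo
(
    OrderId    INT IDENTITY(1, 1) NOT NULL,
    CustomerId UNIQUEIDENTIFIER   NOT NULL DEFAULT NEWID(),
    OrderDate  DATETIME2          NOT NULL DEFAULT SYSUTCDATETIME(),
    CONSTRAINT PK_OrdersDemo PRIMARY KEY NONCLUSTERED (OrderId)
);

CREATE CLUSTERED INDEX CIX_OrdersDemo_CustomerId
    ON dbo.OrdersDemo (CustomerId);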












Using the system_health XEvents session and the solution developed by Denzil, we were able to nail down the issue on the SQL instance without capturing any additional data.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Can number of worker threads in SQL Server exceed max worker threads setting?

Recently, while troubleshooting a performance issue on a SQL instance that was experiencing very high THREADPOOL waits, I observed something I was unaware of until I hit this issue.

While I was looking at the system_health .xel files captured at the time of the issue, specifically at the query_processing event trace, the following highlighted line caught my attention.

The server has 16 x64 cores, and as per the max worker threads calculation, the number of worker threads on the server should not exceed 512 + (16 - 4) * 16 = 704. In the query_processing event, we observed maxWorkers correctly reported as 704; however, I saw that workerCreated exceeded maxWorkers, which was unusual.
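The configured ceiling can also be confirmed directly from the standard sys.dm_os_sys_info DMV (the 704 figure below is specific to this 16-core x64 server with max worker threads left at the default of 0):

```sql
-- max_workers_count reflects the effective "max worker threads" value,
-- e.g. 512 + (16 - 4) * 16 = 704 on a 16-core x64 server at the default setting.
SELECT cpu_count,
       max_workers_count
FROM sys.dm_os_sys_info;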










I checked with Denzil R & Suresh Kandoth, and they helped me understand that this is expected behavior in the SQL Server code, since the max worker threads setting only controls user tasks (in other words, user batches) and doesn't control system or internal tasks. The same is also documented in the MSDN article, as shown below.

The max worker threads server configuration option does not take into account threads that are required for system tasks such as Availability Groups, Service Broker, the Lock Manager, and others. If the configured number of threads is being exceeded, the following query will provide information about the system tasks that have spawned the additional threads.

SELECT s.session_id, r.command, r.status,
    r.wait_type, r.scheduler_id, w.is_preemptive, w.state
FROM sys.dm_exec_sessions AS s
INNER JOIN sys.dm_exec_requests AS r
    ON s.session_id = r.session_id
INNER JOIN sys.dm_os_tasks AS t
    ON r.task_address = t.task_address
INNER JOIN sys.dm_os_workers AS w
    ON t.worker_address = w.worker_address
WHERE s.is_user_process = 0;

Don't be surprised if you see the number of worker threads in sys.dm_os_workers exceed max worker threads; you can use the above query to identify the system tasks that are using the additional worker threads.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Connection to SQL 2012 Integration Services using SSMS for non-admin gives “Access is denied”

Consider a scenario where you have installed SQL 2012 Integration Services, and when a non-admin user tries to connect to the Integration Services service using SSMS, they receive an "Access is denied" error as shown in the screen below.









This issue might be caused by behavioral changes to security in the SSIS 2012 setup. In previous versions of SQL Server, by default all users in the Users group had access to the Integration Services service when you installed SQL Server. When you install the current release of SQL Server, users do not have access to the Integration Services service; the service is secure by default. After SQL Server is installed, the administrator must grant access to the service.

To grant a non-admin user access to the Integration Services service:

  1. Run Dcomcnfg.exe. Dcomcnfg.exe provides a user interface for modifying certain settings in the registry.
  2. In the Component Services dialog, expand the Component Services > Computers > My Computer > DCOM Config node.
  3. Right-click Microsoft SQL Server Integration Services 11.0, and then click Properties.
  4. On the Security tab, click Edit in the Launch and Activation Permissions area.
  5. Add users and assign appropriate permissions, and then click OK.
  6. Repeat steps 4 – 5 for Access Permissions.
  7. Restart SQL Server Management Studio.
  8. Restart the Integration Services service.

If we still continue to see the "Access is denied" errors, we should follow the troubleshooting steps mentioned in the following KB article.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer




Oracle on Azure Presentation at the Windows Azure Conference 2014

Recently I presented at the Windows Azure Conference 2014 on Deploying & Configuring Oracle on Windows Azure as Infrastructure as a Service (IaaS).

As organizations look to cut down their IT infrastructure costs, they are considering two options: virtualization/private cloud, and public cloud. Microsoft's vision, strategy & commitment to its customers is to provide private & public cloud platforms that enable customers to adopt hybrid deployment strategies to deploy & configure their software.

Oracle's vision & strategy, meanwhile, is to give its customers greater choice and flexibility to deploy Oracle software on multiple platforms, thereby increasing the usage of Oracle software.

Both Microsoft & Oracle share a common vision of helping their customers move toward the cloud, thereby helping customers reduce their IT costs without any compromise on supportability & features. With this common vision, Microsoft & Oracle entered into a partnership back in June 2013, and the following is the official statement from Microsoft & Oracle on the partnership.

“This partnership will help customers embrace cloud computing by providing greater choice and flexibility in how to deploy Oracle software”

Following is the presentation I delivered at the Windows Azure Conference 2014, held in Bangalore, India on March 20-21, 2014.

Hope this Helps !!!
Parikshit Savjani
Premier Field Engineer

SSRS Report in Sharepoint Integrated Mode times out execution after 110 sec with unexpected error

Recently we faced a weird error while deploying an SSRS report to a document library in SharePoint. The SSRS report was meant to be a drill-through report, used as a detailed report in a PPS dashboard exposed via an SSAS reporting action.

Being a detailed report, it was expected to fetch a lot of data, and hence we paginated the tablix so that rendering in HTML is fast. However, for some parameters the query itself took more than 2 minutes to execute while fetching the data from the cube (we used an SSAS cube as the data source for our report).

We observed that the same report with the same set of parameters worked fine when previewed in SSDT (BIDS for earlier versions). However, after deploying the report to the SharePoint site, when we tried to browse it, the report loaded for some time but then we saw an unexpected error on the web page.

Our observation was that whenever the report fetched a smaller set of data, it rendered fast as expected, but whenever the dataset was larger, or in other words whenever query execution took more than 100 seconds, the report failed with an unexpected error in SharePoint. We timed the occurrence of the error and found it occurred exactly 110 seconds after report execution began. This behavior gave us a clue that some timeout setting was causing the report to time out after 110 seconds, and that this setting was specific to SSRS in SharePoint Integrated mode.

After some research, I discovered that in the SharePoint web.config the httpRuntime executionTimeout setting is not specified, so it defaults to 110 seconds.

The httpruntime executionTimeout setting specifies the maximum number of seconds that a request is allowed to execute before being automatically shut down by ASP.NET.


To resolve the issue, we need to modify the web.config for the SharePoint site with SSRS integrated (the web.config for the SharePoint site can be located from IIS Manager by exploring the site). In the web.config, search for httpRuntime and add the executionTimeout attribute as shown below.

<httpRuntime maxRequestLength="51200" executionTimeout="1800" />

After making the change in web.config, save it and restart IIS. After the IIS reset, the changes take effect, and we no longer saw the SSRS report time out.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

SAML Authentication support with SQL Reporting Services

A couple of days back I received the following set of questions from a customer, and in this blog post I would like to discuss the answers for the benefit of others.

How to get SSRS working with SAML Claims Authentication?

If SSRS in Native mode doesn't support SAML authentication, how does Microsoft Dynamics CRM support SAML token authentication using SSRS in Native mode?

Can we leverage Microsoft Dynamics CRM to allow SSRS to support SAML Authentication?

Let us first start with the basics before we answer the customer's questions.

SSRS in Native Mode supports

  • Windows Authentication
  • Basic Authentication
  • Custom Authentication

SSRS in SharePoint Integrated Mode supports

  • Classic Windows Authentication
  • Claims Authentication (starting with SharePoint 2010 & SQL 2012 SSRS, since SSRS is available as a shared service in SharePoint, and C2WTS can further convert an STS token to a Windows token which can be authenticated by external data sources)

So we are all very clear that SharePoint natively supports claims & SSRS, and hence SSRS in SharePoint Integrated mode is a favorite, tried-and-tested solution to get SSRS working with SAML claims. However, the customer was looking for options outside SharePoint to save some cost.

Although SSRS in Native mode doesn't support claims authentication out of the box, it does support custom authentication. Custom authentication gives a developer the flexibility to develop his own security extension to authenticate the user and further authorize the user's permissions. So if we have good programmers available, we can integrate any custom web application which supports SAML tokens with SSRS in Native mode by developing a security extension for custom authentication.


The Microsoft Dynamics CRM team has leveraged the custom authentication of SSRS in Native mode by developing their own SSRS security extension (this should explain why you need to install the SSRS extensions for Dynamics CRM) to authenticate & authorize users in SSRS. Further, the Microsoft CRM team leverages the APIs exposed by the Reporting Services web service to deploy, deliver, and subscribe to reports.

While using SSRS in Dynamics CRM, security is completely controlled from within CRM, as the users & security roles are defined in CRM.

While using SSRS in CRM mode, some of the functionality & features of SSRS may not be available; for example, you cannot have custom code within your SSRS reports, since CRM uses RDL sandboxing, which doesn't support custom code.

When it comes to building SSRS reports for Microsoft Dynamics CRM 2011 using Visual Studio (Business Intelligence Development Studio, a feature that can be installed as part of SQL Server), there are two options that give you the ability to query and format your CRM data into flexible, dynamic reports: SQL reports querying the CRM database filtered views, or Fetch, a proprietary query language commonly referred to as FetchXML, which utilizes the CRM Report Authoring Extension installed alongside Visual Studio's Business Intelligence Development Studio.

Although you can develop SSRS reports in Dynamics CRM, you have very limited functionality & features compared to SSRS in Native mode or SharePoint mode, which practically makes your SSRS deployment useless.

You can read the following blog from a fellow PFE on the challenges of custom report development in Dynamics CRM.

To answer the customer's questions:

  • Technically, we can use SSRS with Microsoft CRM to support SAML, but it comes with restricted functionality, which the customer should be ready to accept.
  • Approach 2 would be to develop a security extension for SSRS to support SAML, but this requires skilled resources and would involve a lot of effort in developing & testing the security extension.
  • The preferred approach would be SharePoint 2010-2013 with SSRS 2012, which seamlessly supports SAML with all the SSRS functionality; further, with SSRS 2012 you can set the execution context while using stored credentials, which can eliminate the pains of Kerberos authentication and make life easier. SSRS in SharePoint Integrated mode is also supported by the Standard edition of SharePoint.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Analyze SQL Database Table Space Usage using TreeMap Visualization in Excel 2013

One of the fascinating features of Excel 2013 is the "Apps for Office" option, which exposes some of the great apps from the Office store that you can download and install. One such useful app I came across is the TreeMap data visualization developed by Microsoft Research, which allows you to represent hierarchical data across dimensions in the form of nested rectangles.

If you have used WinDirStat to monitor the space on your drive, you know exactly what a TreeMap looks like and how useful it is for hierarchical data.

TreeMap allows you to plot two measures simultaneously via the Size and Color controls. Such a visualization can be useful in the BI world when you want to compare hierarchical dimension members across two measures simultaneously.

In my example below, I am using TreeMap to monitor table space usage for a database, viz. AdventureWorks2012, so that by looking at the TreeMap I can find the tables with a large number of rows and the highest unused free space, which can be released or reclaimed.

As we know, sp_spaceused in SQL Server helps find the number of rows and the unused space within a table. I use the following stored procedure, which uses sp_spaceused and a cursor to query the stats for each table and load them into a separate table called TableStats.

USE [AdventureWorks2012]
GO
/****** Object:  Table [dbo].[TableStats]    Script Date: 9/1/2013 5:53:58 PM ******/
CREATE TABLE [dbo].[TableStats](
	[tableName] [varchar](100) NULL,
	[numberofRows] [varchar](100) NULL,
	[reservedSize] [varchar](50) NULL,
	[dataSize] [varchar](50) NULL,
	[indexSize] [varchar](50) NULL,
	[unusedSize] [varchar](50) NULL
)
GO

CREATE PROCEDURE [dbo].[GetTableSpaceUsage]
AS
BEGIN
      DECLARE @TableName VARCHAR(100)
      DECLARE @SchemaName VARCHAR(100)
      DECLARE @FullName VARCHAR(100)
      TRUNCATE TABLE TableStats

      DECLARE tableCursor CURSOR
      FOR  select [name] from sys.objects
      where  OBJECTPROPERTY(object_id, N'IsUserTable') = 1 FOR READ ONLY

      DECLARE SchemaCursor CURSOR
      FOR  select SCHEMA_NAME(schema_id) from sys.objects
      where  OBJECTPROPERTY(object_id, N'IsUserTable') = 1 FOR READ ONLY

      OPEN tableCursor
      OPEN SchemaCursor
      FETCH NEXT FROM tableCursor INTO @TableName
      FETCH NEXT FROM SchemaCursor INTO @SchemaName

      WHILE (@@Fetch_Status >= 0)
      BEGIN
           SET @FullName = @SchemaName + '.' + @TableName
           INSERT  TableStats
           EXEC sp_spaceused @FullName     -- Get the stats for the current table
           FETCH NEXT FROM tableCursor INTO @TableName
           FETCH NEXT FROM SchemaCursor INTO @SchemaName
      END

      CLOSE tableCursor
      DEALLOCATE tableCursor
      CLOSE SchemaCursor
      DEALLOCATE SchemaCursor
END
GO
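As an aside, similar per-table numbers can be computed without a cursor from the sys.dm_db_partition_stats DMV; this set-based sketch is my own alternative (the output column names are chosen to mirror TableStats), not part of the original solution:

```sql
-- Cursor-free alternative: approximate sp_spaceused per table from
-- allocation metadata (sizes converted from 8 KB pages to KB).
SELECT
    SCHEMA_NAME(o.schema_id) + '.' + o.name AS tableName,
    SUM(CASE WHEN p.index_id IN (0, 1)
             THEN p.row_count ELSE 0 END)  AS numberofRows,
    SUM(p.reserved_page_count) * 8         AS reservedKB,
    SUM(p.used_page_count) * 8             AS usedKB,
    (SUM(p.reserved_page_count)
       - SUM(p.used_page_count)) * 8       AS unusedKB
FROM sys.dm_db_partition_stats AS p
INNER JOIN sys.objects AS o
    ON p.object_id = o.object_id
WHERE o.type = 'U'
GROUP BY o.schema_id, o.name
ORDER BY reservedKB DESC;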


Once you run the above stored procedure, it loads the TableStats table. Next, let's go to our reporting tool, viz. Excel 2013. In Excel 2013, click on the Data tab, and if you don't already have an existing connection to the SQL instance and database, create a new connection as shown below.


You can use the following query to fetch the number of rows and the unused space from TableStats. The unused space is returned from the above stored procedure in string format, as it has "KB" appended to the end of the number, so we use the following query to strip the "KB" while importing.

SELECT tablename, numberofrows, LEFT(unusedspace,LEN(unusedspace)-2) AS unused 
From TableStats

Once the data is loaded in Excel, you can go to the INSERT tab in Excel 2013, click the Apps for Office button, and add TreeMap to your apps. Once you do that, you can view the TreeMap as shown below.


In the Name list, select the tableName column from the imported data; for Size, select the numberofRows column; and for Color, select the unused column.

You can add a title to the TreeMap and hide the table and gridlines, following which you will see a great TreeMap as shown below.



From the above TreeMap, it becomes very easy to visualize and identify that SalesOrderDetail has the most rows at 121,317, while the Person table has the highest unused space at 2,128 KB, which makes it a good candidate to shrink.

There is another great post on TreeMap by Chris Webb, who has shown the use of TreeMap with Power Query here.

If you would like to download the Excel Workbook, you can download it from here.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer


Ebook Review: Windows Azure SQL Reporting Succinctly by Stacia Misner

I recently came across a very useful set of free ebooks and great developer resources from Syncfusion, which you can see for yourself here. Being an MS BI developer, Windows Azure SQL Reporting Succinctly by Stacia Misner grabbed my attention, and I immediately downloaded the ebook and started reading through it.

I started my learning of SQL Reporting Services with Stacia Misner's SQL 2008 Reporting Services book, and since then I have been a big fan of her books, with simple, easy-to-understand language coupled with good examples to explain the concepts.

The best part of this book is that it is a small book of about 100 pages. I tend to refrain from reading long ebooks, since we live in a busy world and it is difficult to take time out from our daily routine. Although it is a short book, it is precise and covers most functional aspects of SQL Reporting on Windows Azure.

The ebook starts well by comparing SQL Reporting Services (on-premises) with SQL Reporting on Windows Azure, and further covers a little bit of SQL Azure Database and database migration to SQL Azure, which is required to set up an environment for Windows Azure SQL Reporting.

The book is pretty comprehensive, covering topics from report design, development, management, and security, and completely covers all aspects of the report development life cycle. It is useful for getting Level 200-300 knowledge of Windows Azure SQL Reporting if you are planning to use it as a platform to host your reports in the cloud.

There are some other good developer resources available for free download, which you can find at the following link.

Download Developer Resource (Ebooks) from SyncFusion


Happy Learning !!!

Hope this Helps !!!

Parikshit Savjani
Premier Field Engineer

How to provision Power BI App to your Office 365 Enterprise portal

With an excellent, passionate & energetic presentation on Power BI by Amir Netz at WPC, there has been a lot of excitement and buzz for Power BI within the BI community (in case you haven't watched the video yet, you can find it here). The community is now eagerly waiting to lay their hands on the Power BI preview.

With the release of the Power BI app in the Windows 8 app store, a mobile app for Windows 8 devices to view Power BI reports hosted on a Power BI site in O365, there was a lot of confusion in the community, which was clarified by AJ Mee in his blog and a video.

After installing the Power BI app on my Windows 8 device, I was eager to upload some of the Power View reports I had designed in Excel 2013 to the Power BI site, which is currently available only with the O365 Enterprise edition; you can opt for the trial version, as mentioned by AJ Mee in his video.

So I decided to try out the Enterprise edition of O365 and provision a Power BI site in my O365 portal. However, I found that it was not easy to configure the Power BI site by yourself, and I encountered the same issue as a couple of community members were hitting, as reported in the MSDN forum here.

Luckily, I was able to find an official document which was hidden away and not so easily accessible; this Word document is a great help for configuring and provisioning the Power BI app for your Office 365 SharePoint portal.

After following the steps mentioned in this document, Power BI – Provisioning Guide, I was able to configure the Power BI site for my O365 SharePoint and upload my Power View Excel files, which could then be viewed with the Power BI app on my Windows 8 device.

If you are keen on learning more about Power BI, you can find a great collection of resources put together by Vivek on TechNet Wikis here.

Hope this helps other community members !!!

Parikshit Savjani
Premier Field Engineer

SSIS Data Flow Task with Export to Excel 2007 xlsx Destination doesn’t retain the datatype from Source

In this blog post, I would like to address a typical limitation of the SSIS Data Flow task, which many of us may have experienced when exporting data to an Excel 2007 (xlsx) destination, and which has also been reported in the following Connect item.

It is actually a limitation of the ACE provider in the way it detects data types for the Excel destination, thereby not preserving the data types from the source.

To explain the issue, let me walk through a simple repro, as shown below.

  • Consider a simple Data Flow task as shown below               


  • For this repro, I have used an OLE DB source pointing to the SQL Server master database with the following query

select name,crdate
from sysdatabases

  • I have used the Multicast transformation since I wanted to broadcast the output to two Excel destinations, viz. an Excel 2003 destination and an Excel 2007 destination.
  • I have created the two Excel destinations to compare the output of the Excel 2003 (xls) format against the Excel 2007 (xlsx) format.

When I open the Excel 2003 File, the output is as shown below (which is the expected and desired outcome).


When I open the Excel 2007 File, the output is as shown below (which is unexpected and undesired)


This can be annoying, since one has to reformat the output when there are a large number of columns of type date, number, currency, etc. From a logical perspective, an obvious question would be: how does it work for the Excel 2003 format, and why doesn't it work for the Excel 2007 format, which is supposed to be the more advanced version? What has changed?

The change is that the former uses the Jet engine provider to export the data to Excel, which has the ability to detect and preserve the data types from the source to the Excel destination, whereas the latter uses the ACE provider, which uses different logic to detect the data type suitable for the Excel destination; hence the issue.

If the Excel 2007 destination file already has at least one row, the ACE provider detects the data types of that header row and uses the same data types to populate the rest of the rows in the file. We can use this behavior of the ACE provider to work around the issue.

So what options do we have to work around the issue?

  1. If suitable, use the Excel 2003 destination to produce an xls file.
  2. Pre-create the Excel destination file with a header row containing dummy data of the correct data types, which serves as a template for the ACE provider for the rest of the rows. You can hide the first row.


  You can right-click on the dummy row and click Hide to hide the row from the end user.

If hiding the row doesn't appeal to you, you can add an Execute SQL Task after the Data Flow Task that loads the Excel destination, to update the header row with null values after the data has been populated into the Excel destination, as shown below.
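As a sketch of what that Execute SQL Task could run over the Excel connection (the sheet name Sheet1 and the 'dummy' marker value are assumptions for this illustration; an Excel connection exposes the sheet as a table named [Sheet1$], and the columns match the repro's query):

```sql
-- Executed over the Excel connection: blank out the dummy header row
-- that was pre-created to pin the column data types.
UPDATE [Sheet1$]
SET name = NULL, crdate = NULL
WHERE name = 'dummy';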


One obvious thought would be: rather than updating it to null values, shouldn't we delete the header row using a DELETE statement in the Execute SQL Task? However, the DELETE SQL statement is not supported for the Excel connection type, as mentioned in the following KB.

However, if updating the header row to null values still doesn't convince you, you can use the following Visual C# code in an SSIS Script Task to delete the header row.

Note that the following code references the Excel interop DLLs, which are available by installing the Primary Interop Assemblies for Excel, but that in turn requires you to install Office on the server (which may or may not be feasible for everyone).

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using Excel = Microsoft.Office.Interop.Excel;

public void Main()
{
    Excel.Application xlApp = new Excel.Application();
    Excel.Workbook xlWorkBook;
    Excel.Worksheet xlWorkSheet;

    // Open the workbook read-write so the header row can be removed.
    xlWorkBook = xlApp.Workbooks.Open("C:/Users/pariks/Documents/ExcelExportIssue.xlsx", 0, false, 5, "", "", true, Microsoft.Office.Interop.Excel.XlPlatform.xlWindows, "\t", false, false, 0, true, 1, 0);
    xlWorkSheet = (Excel.Worksheet)xlWorkBook.Worksheets[1];

    // Delete the dummy header row (row 1) and shift the remaining rows up.
    Excel.Range firstRow = (Excel.Range)xlWorkSheet.Rows[1];
    firstRow.Delete(Excel.XlDeleteShiftDirection.xlShiftUp);

    xlWorkBook.SaveAs("C:/Powerpivot/ExportExcelIssueResolved.xlsx", Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing,
        Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing);

    // Release Excel so the file is not left locked on the server.
    xlWorkBook.Close(false, Type.Missing, Type.Missing);
    xlApp.Quit();

    Dts.TaskResult = (int)ScriptResults.Success;
}

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer