Impact of Pre-grown Data & Log Files on the Restore Time of the Database

Today, I was asked the following question by one of my colleagues, who specializes in SharePoint:

If the actual data is 50 GB in size, and we create a pre-grown data file of 200 GB to avoid autogrow and fragmentation, shouldn't the restore time for the pre-grown database be much larger than the restore time of the same database without pre-growth, with a data file size of 50 GB?

My initial guess was that the restore time shouldn't vary much: data file creation should be fast thanks to instant file initialization, while the restore operation only restores the active pages with data, which are the same in both scenarios. However, this was purely based on my theoretical knowledge, and it is always important to validate and test it.

So I used a sample database with an initial primary data file size of 4 GB and a log file size of 1 GB, with around 3 GB of active data in the database.

To see the impact of pre-grown data files, I kept the initial log file size constant at 1 GB while increasing the initial data file size in increments of 5 GB, keeping the active data in the database constant at 3 GB.

[Table: backup size and restore time for each initial data file size]

For my testing I used SQL Server 2012 with SP1, and I used backup compression when taking the backup after each data file pre-growth. As noted in the table above, irrespective of the data file size, the backup size was consistent at 266 MB, since the data in the database was consistent and static, which was as per expectation.
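
As a sketch of the test loop (the database name and path are illustrative, not from the original test), each iteration backed up and restored the database along these lines:

```sql
-- Back up the sample database with compression. The backup contains only
-- the allocated (active) pages, so its size is independent of how far the
-- data file has been pre-grown.
BACKUP DATABASE SampleDB
TO DISK = N'C:\Backups\SampleDB.bak'
WITH COMPRESSION, INIT, STATS = 10;

-- Restore it, overwriting the previous copy, and time the operation.
RESTORE DATABASE SampleDB
FROM DISK = N'C:\Backups\SampleDB.bak'
WITH REPLACE, STATS = 10;
```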

However, the restore time increased proportionally as the initial data file size increased, which didn't make much sense.

So I turned on trace flags 3004 and 3605 to trace the restore events and identify why the restore time varied between restores.
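
The trace flags can be enabled server-wide for the duration of the test:

```sql
-- 3004 prints backup/restore internals (including file zeroing messages);
-- 3605 redirects that output to the SQL Server error log.
-- The -1 switch applies the flags globally.
DBCC TRACEON (3004, 3605, -1);

-- ...run the restores, then disable the flags:
DBCC TRACEOFF (3004, 3605, -1);
```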

After turning on the trace flags, I found the following events in the error log, which explained the time difference between the restores.

Errorlog for Restore with 5GB data file size

12:57:46 PM spid56 Zeroing completed on C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\SampleDB_log.ldf 0:00:05
12:57:54 PM spid56 Zeroing completed on C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\SampleDB.mdf 0:00:08
12:58:04 PM spid56 Restore: Waiting for log zero on SampleDB 0:00:10

Errorlog for Restore with 20GB data file size

12:51:35 PM spid56 Zeroing completed on C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\SampleDB_log.ldf 0:00:06
12:52:38 PM spid56 Zeroing completed on C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\SampleDB.mdf 0:01:02
12:52:49 PM spid56 Restore: Waiting for log zero on SampleDB 0:00:11

As seen in the data above, the additional time spent during each restore was due to zeroing of the data files, which means instant file initialization was not working for me. I realized my SQL Server was running under the managed service account (NT Service\MSSQLServer), which does not possess the Perform Volume Maintenance Tasks permission by default. I assigned the permission to NT Service\MSSQLServer, restarted SQL Server, and performed the above tests again, with the following results.
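
On the SQL Server 2012 build used here there is no DMV that reports whether instant file initialization is in effect, but on SQL Server 2016 SP1 and later you can verify it directly (shown as a sketch for those newer versions):

```sql
-- instant_file_initialization_enabled ('Y'/'N') was added to this DMV
-- in SQL Server 2016 SP1.
SELECT servicename,
       service_account,
       instant_file_initialization_enabled
FROM sys.dm_server_services
WHERE servicename LIKE 'SQL Server (%';
```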

[Table: restore times with instant file initialization enabled]

The above results make sense and meet our expectation: with instant file initialization, zeroing of the data file is avoided, which keeps the restore time more or less the same.

However, instant file initialization doesn't apply to log files, and we see the following results when we test the restore keeping the initial size of the data file constant while increasing the initial size of the log file.

[Table: restore times for increasing initial log file sizes]

So the restore time does increase with the log file size. When I again turned on the tracing to identify where the majority of the time was spent, I found that most of it went to zeroing the log files.

However, in the case of log files the performance advantage of pre-growing is still significant: every autogrow event adds virtual log files (VLFs), and a high VLF count can slow database recovery proportionally.
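
To see how many VLFs a log has accumulated, DBCC LOGINFO returns one row per VLF; a quick sketch for counting them on SQL Server 2012 (the table variable's columns match that version's output):

```sql
-- One row per virtual log file in the current database's log.
DECLARE @vlf TABLE (
    RecoveryUnitId INT, FileId INT, FileSize BIGINT, StartOffset BIGINT,
    FSeqNo INT, [Status] INT, Parity TINYINT, CreateLSN NUMERIC(38));

INSERT INTO @vlf EXEC ('DBCC LOGINFO');

SELECT COUNT(*) AS vlf_count FROM @vlf;
```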

Hence, with instant file initialization for data files and some compromise on restore time for log files, it is still recommended to pre-grow the data and log files of a database to avoid autogrow and fragmentation on disk.

Autogrow should be treated as insurance: something you want in place but hope never to need.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Troubleshooting SQL Server Hang using system_health Xevents

Recently a customer reported intermittent server hangs, connection timeouts, and slow performance on their SQL Server instance. My first question was whether PSSDIAG was running on the SQL instance, so that I could understand the cause of the hang. However, since we don't run PSSDIAG proactively on any of the servers, we didn't have any logs to look at. So we started looking at the system_health Xevents session data from the time of the slow performance.

If you are not familiar with the default system_health Xevents session, I would suggest reading the blog post from Bob Dorr here.

Reading from the system_health Xevents session directly is not the most convenient way of troubleshooting, so I used a solution developed by my fellow PFE colleague Denzil, which he has blogged about here.

The issue occurred intermittently: the application would hang and no further connections to SQL Server were possible.

Resource Utilization

We did not see CPU or memory pressure on the SQL Server instance. The system CPU utilization averaged around 40%, with occasional spikes to 100%, but during those times SQL Server's own CPU utilization stayed consistent at 30-35% on average.

The available memory on the server was unchanged and consistent, and no low-memory alerts were observed on the SQL instance.

[Chart: CPU utilization over time]

[Chart: available memory over time]

Query Processing & THREADPOOL Waits

We saw very high waits on the THREADPOOL wait type, which suggests the instance was running out of worker threads, causing the observed slowness and hangs. From the following query-processing logs, we see a sudden surge in worker threads from 198 to 806 in the 10-minute interval from 7:10 AM to 7:20 AM (server time 2:10 AM to 2:20 AM).

[Screenshot: query_processing event output]

When we checked the sys.dm_exec_requests data captured during the same window, we saw concurrent requests to the SQL instance increase from 80 to 500 in the same interval, which explains the surge in worker threads created to serve those 500 concurrent requests.
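
To confirm this kind of thread starvation yourself, the wait statistics and the current request count can be checked with a couple of DMV queries (a sketch, to be run on the affected instance):

```sql
-- Cumulative THREADPOOL waits since the last restart or stats clear.
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type = 'THREADPOOL';

-- Number of user requests currently competing for worker threads.
SELECT COUNT(*) AS concurrent_requests
FROM sys.dm_exec_requests AS r
INNER JOIN sys.dm_exec_sessions AS s
    ON r.session_id = s.session_id
WHERE s.is_user_process = 1;
```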

However, we need to understand that the THREADPOOL wait type here is not the cause but a victim: the assigned worker threads were not completing their tasks promptly. The root cause in this case was blocking on the SQL instance, which prevented worker threads from finishing their tasks quickly and returning to the worker pool.

Blocking

In this case, we found that the head blocker was waiting on a non-IO buffer latch, PAGELATCH_XX (PAGELATCH_EX, PAGELATCH_UP, PAGELATCH_SH), caused by sequential inserts into a large table (last-page contention). We suggested partitioning the table and building the index on an alternate column to avoid last-page contention (further information on resolving latch contention can be found in this white paper).
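
A blocking chain like this can be surfaced live from sys.dm_exec_requests; the session whose blocking_session_id appears for other requests, but which is not itself blocked, is the head blocker (a sketch):

```sql
-- Blocked requests and who is blocking them; for PAGELATCH waits,
-- wait_resource shows the contended page as database:file:page.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_resource,
       r.wait_time,
       r.command
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0
ORDER BY r.wait_time DESC;
```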

[Screenshot: blocking chain showing PAGELATCH waits]

Using the system_health Xevents session and the solution developed by Denzil, we were able to nail down the issue on the SQL instance without capturing any additional data.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Can number of worker threads in SQL Server exceed max worker threads setting?

Recently, while troubleshooting a performance issue on a SQL instance experiencing very high THREADPOOL waits, I observed something I was unaware of until I hit this issue.

While I was looking at the system health xel files captured at the time of the issue, specifically at the query_processing event trace, the following highlighted line caught my attention.

The server has 16 x64 cores, and as per the max worker threads calculation, the worker threads on the server should not exceed 512 + (16 - 4) * 16 = 704. In the query_processing event, maxWorkers was correctly reported as 704; however, I saw that workerCreated exceeded maxWorkers, which was unusual.
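
The effective ceiling can be read straight from the OS-level DMV, which is an easy way to cross-check the formula:

```sql
-- max_workers_count reflects the effective 'max worker threads' value
-- (704 on a 16-core x64 server with the default setting of 0).
SELECT cpu_count, max_workers_count
FROM sys.dm_os_sys_info;
```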

[Screenshot: query_processing event showing workerCreated exceeding maxWorkers]

I checked with Denzil R and Suresh Kandoth, and they helped me understand that this is expected behavior in the SQL Server code: the max worker threads setting only governs user tasks (in other words, user batches) and does not control system or internal tasks. The same is documented in the MSDN article below.

http://msdn.microsoft.com/en-us/library/ms190219.aspx

The max worker threads server configuration option does not take into account threads that are required for system tasks such as Availability Groups, Service Broker, the Lock Manager, and others. If the configured number of threads is being exceeded, the following query will provide information about the system tasks that have spawned the additional threads.

SELECT
s.session_id,
r.command,
r.status,
r.wait_type,
r.scheduler_id,
w.worker_address,
w.is_preemptive,
w.state,
t.task_state,
t.session_id,
t.exec_context_id,
t.request_id
FROM sys.dm_exec_sessions AS s
INNER JOIN sys.dm_exec_requests AS r
    ON s.session_id = r.session_id
INNER JOIN sys.dm_os_tasks AS t
    ON r.task_address = t.task_address
INNER JOIN sys.dm_os_workers AS w
    ON t.worker_address = w.worker_address
WHERE s.is_user_process = 0;

Don't be surprised if you see the worker thread count in sys.dm_os_workers exceed max worker threads; you can use the above query to identify the system tasks that are using the additional worker threads.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Connection to SQL 2012 Integration Services using SSMS for non-admin gives “Access is denied”

Consider a scenario where you have installed SQL 2012 Integration Services, and when a non-admin user tries to connect to the Integration Services service using SSMS, they receive an "Access is denied" error as shown in the screen below.

[Screenshot: "Access is denied" error when connecting to Integration Services]

This issue is caused by a behavioral change to security in the SSIS 2012 setup. In previous versions of SQL Server, all users in the Users group had access to the Integration Services service by default. When you install the current release of SQL Server, users do not have access to the service: it is secure by default, and after installation the administrator must grant access to it.

To grant a non-admin user access to the Integration Services service:

  1. Run Dcomcnfg.exe. Dcomcnfg.exe provides a user interface for modifying certain settings in the registry.
  2. In the Component Services dialog, expand the Component Services > Computers > My Computer > DCOM Config node.
  3. Right-click Microsoft SQL Server Integration Services 11.0, and then click Properties.
  4. On the Security tab, click Edit in the Launch and Activation Permissions area.
  5. Add users and assign appropriate permissions, and then click OK.
  6. Repeat steps 4-5 for Access Permissions.
  7. Restart SQL Server Management Studio.
  8. Restart the Integration Services service.

If you still see "Access is denied" errors, follow the troubleshooting steps in the following KB article.

http://support.microsoft.com/kb/940232

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer


Oracle on Azure Presentation at the Windows Azure Conference 2014

Recently I presented at the Windows Azure Conference 2014 on Deploying & Configuring Oracle on Windows Azure as Infrastructure as a Service (IaaS).

As organizations look to cut their IT infrastructure costs, they are considering two options: virtualization/private cloud, and public cloud. Microsoft's vision, strategy, and commitment to its customers is to provide private and public cloud platforms that enable hybrid strategies for deploying and configuring their software.

Oracle's vision and strategy, meanwhile, is to give its customers greater choice and flexibility to deploy Oracle software on multiple platforms, thereby increasing the usage of Oracle software.

Both Microsoft and Oracle share a common vision of helping their customers move toward the cloud, reducing IT costs without compromising supportability or features. With this common vision, Microsoft and Oracle entered into a partnership in June 2013; the following is the official statement on the partnership.

“This partnership will help customers embrace cloud computing by providing greater choice and flexibility in how to deploy Oracle software”

Following is the presentation I delivered at the Windows Azure Conference 2014, held in Bangalore, India on March 20-21, 2014.

Hope this Helps !!!
Parikshit Savjani
Premier Field Engineer

SSRS Report in Sharepoint Integrated Mode times out execution after 110 sec with unexpected error

Recently we faced a weird error while deploying an SSRS report to a document library in SharePoint. The report was meant to be a drill-through report, used as the detailed report in a PPS dashboard and exposed via an SSAS reporting action.

Being a detailed report, it was expected to fetch a lot of data, so we paginated the tablix to keep HTML rendering fast. However, for some parameters the query itself took more than 2 minutes to execute while fetching the data from the cube (we used an SSAS cube as the data source for the report).

We observed that the same report with the same set of parameters worked fine when previewed in SSDT (BIDS in earlier versions). However, after deploying the report to the SharePoint site and browsing it there, it would load for some time and then fail with an unexpected error on the web page.

Our observation was that whenever the report fetched a smaller set of data, it rendered quickly as expected; but whenever the dataset was larger, or in other words whenever query execution took more than 100 seconds, the report failed with the unexpected error in SharePoint. We timed the occurrence of the error and found it happened exactly 110 seconds into report execution. This gave us a clue that some timeout setting was killing the report after 110 seconds, and that the setting was specific to SSRS in SharePoint integrated mode.

After some research, I discovered that in SharePoint's Web.Config the httpRuntime executionTimeout setting is not specified, so it defaults to 110 seconds.

The httpRuntime executionTimeout setting specifies the maximum number of seconds that a request is allowed to execute before being automatically shut down by ASP.NET.

Reference: http://msdn.microsoft.com/en-us/library/e1f13641(v=vs.85).aspx

To resolve the issue, we need to modify the web.config of the SharePoint site where SSRS is integrated (the Web.Config for the SharePoint site can be located from IIS Manager by exploring the site). In the Web.Config, search for httpRuntime and add the executionTimeout attribute as shown below.

<httpRuntime maxRequestLength="51200" executionTimeout="1800" />

After making the change, save Web.Config and restart IIS. Once IIS was reset, the change took effect and we no longer saw the SSRS report time out.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

SAML Authentication support with SQL Reporting Services

A couple of days back I received the following set of questions from a customer, and in this blog post I would like to discuss the answers for the benefit of others.

How to get SSRS working with SAML Claims Authentication?

If SSRS in Native Mode doesn’t support SAML Authentication how does Microsoft Dynamics CRM support SAML Token Authentication using SSRS in native mode?

Can we leverage Microsoft Dynamics CRM to allow SSRS to support SAML Authentication?

Let us first start with the basics before answering the customer's questions.

SSRS in native mode supports:

  • Windows Authentication
  • Basic Authentication
  • Custom Authentication

SSRS in SharePoint integrated mode supports:

  • Classic Windows Authentication
  • Claims authentication (starting with SharePoint 2010 and SQL 2012 SSRS, since SSRS is available as a shared service in SharePoint, and C2WTS can convert an STS token into a Windows token that external data sources can authenticate)

So it is clear that SharePoint natively supports claims, and hence SSRS in SharePoint integrated mode is the favorite, tried-and-tested solution for getting SSRS working with SAML claims. This customer, however, was looking for options outside SharePoint to save some cost.

Although SSRS in native mode doesn't support claims authentication out of the box, it does support custom authentication. Custom authentication gives a developer the flexibility to write his own security extension to authenticate users and authorize their permissions. So, given good programmers, any custom web application that supports SAML tokens can be integrated with SSRS in native mode by developing a custom-authentication security extension.

References

http://msdn.microsoft.com/en-us/library/ms155029.aspx

http://msftrsprodsamples.codeplex.com/wikipage?title=SS2008!Security%20Extension%20Sample

The Microsoft Dynamics CRM team has leveraged the custom authentication of SSRS in native mode by developing their own SSRS security extension (which explains why you need to install the SSRS extensions for Dynamics CRM) to authenticate and authorize users in SSRS. Further, the CRM team leverages the APIs exposed by the Reporting Services web service to deploy, deliver, and subscribe to reports.

While using SSRS with Dynamics CRM, security is controlled completely from within CRM, where the users and security roles are defined.

While using SSRS in CRM mode, some SSRS functionality and features may not be available; for example, you cannot have custom code within your SSRS reports, since CRM uses RDL sandboxing, which doesn't support custom code.

When it comes to building SSRS reports for Microsoft Dynamics CRM 2011 using Visual Studio (Business Intelligence Development Studio, a feature that can be installed as part of SQL Server), there are two options for querying and formatting your CRM data into flexible, dynamic reports: SQL reports querying the CRM database's filtered views, or Fetch, a proprietary query language commonly referred to as FetchXML, which requires the CRM Report Authoring Extension to be installed alongside Business Intelligence Development Studio.

Although you can develop SSRS reports in Dynamics CRM, you have very limited functionality and features compared to SSRS in native mode or SharePoint mode, which practically makes your SSRS deployment useless.

You can read the following blog from a fellow PFE on the challenges of custom report development in Dynamics CRM.

http://blogs.msdn.com/b/crminthefield/archive/2012/11/27/custom-reporting-in-microsoft-dynamics-crm-fetch-vs-filtered-views.aspx

To answer the customer's questions:

  • Approach 1: Technically, we can use SSRS with Microsoft CRM to support SAML, but with restricted functionality that the customer must be ready to accept.
  • Approach 2: Develop a security extension for SSRS to support SAML; this requires skilled resources and considerable effort to develop and test the extension.
  • Preferred approach: SharePoint 2010-2013 with SSRS 2012, which seamlessly supports SAML with all SSRS functionality. Further, with SSRS 2012 you can set the execution context while using stored credentials, which can eliminate the pains of Kerberos authentication and make life easier. SSRS in SharePoint integrated mode is also supported by the Standard edition of SharePoint.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Analyze SQL Database Table Space Usage using TreeMap Visualization in Excel 2013

One of the fascinating features of Excel 2013 is the "Apps for Office" option, which exposes some great apps from the Office Store that you can download and install. One such useful app I came across is the TreeMap data visualization developed by Microsoft Research, which lets you represent hierarchical dimension data as nested rectangles.

If you have used WinDirStat to monitor the space on your drive, you know exactly what a TreeMap looks like and how useful it is for hierarchical data.

TreeMap lets you plot two measures simultaneously via the Size and Color controls. Such a visualization is useful in the BI world when you want to compare hierarchical dimension members across two measures at once.

In my example below, I am using TreeMap to monitor table space usage for a database (AdventureWorks2012), so that by looking at the TreeMap I can find the tables with a large number of rows and the highest unused free space that can be released or reclaimed.

As we know, sp_spaceused in SQL Server reports the number of rows and the unused space within a table. I use the following stored procedure, which drives sp_spaceused with a cursor to query the stats from each table and load them into a separate table called TableStats.

USE [AdventureWorks2012]
GO
/****** Object:  Table [dbo].[TableStats]    Script Date: 9/1/2013 5:53:58 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[TableStats](
	[tableName] [varchar](100) NULL,
	[numberofRows] [varchar](100) NULL,
	[reservedSize] [varchar](50) NULL,
	[dataSize] [varchar](50) NULL,
	[indexSize] [varchar](50) NULL,
	[unusedSize] [varchar](50) NULL
) ON [PRIMARY]
GO

CREATE PROCEDURE [dbo].[GetTableSpaceUsage] 
AS 
BEGIN

      DECLARE @FullName VARCHAR(200)
      TRUNCATE TABLE TableStats   

      -- A single cursor returns the schema-qualified name in one pass;
      -- two parallel cursors over sys.objects are not guaranteed to
      -- return their rows in the same order.
      DECLARE tableCursor CURSOR 
      FOR  select SCHEMA_NAME(schema_id) + '.' + [name] from sys.objects  
      where  OBJECTPROPERTY(object_id, N'IsUserTable') = 1 FOR READ ONLY 

      OPEN tableCursor 
      FETCH NEXT FROM tableCursor INTO @FullName 

      WHILE (@@Fetch_Status = 0) 
      BEGIN
           INSERT  TableStats         
           EXEC sp_spaceused @FullName     --Get the space stats for this table     
           FETCH NEXT FROM tableCursor INTO @FullName 
      END 

      CLOSE tableCursor 
      DEALLOCATE tableCursor 

END

Once you run the above stored procedure, it loads the TableStats table. Next, let's go to our reporting tool, Excel 2013. In Excel 2013, click on the Data tab and, if you don't already have an existing connection to the SQL instance and database, create a new connection as shown below.

[Screenshot: creating a data connection in Excel]

You can use the following query to fetch the number of rows and the unused space from TableStats. The unused space returned by the stored procedure is a string, since sp_spaceused appends "KB" to the number, so the query strips the "KB" suffix while importing.

SELECT tablename, numberofrows, LEFT(unusedspace, LEN(unusedspace) - 2) AS unused
FROM TableStats

Once the data is loaded in Excel, go to the INSERT tab in Excel 2013, click the Apps for Office button, and add TreeMap to your apps. Once you do, you can view the TreeMap as shown below.

[Screenshot: Apps for Office dialog with TreeMap]

In the Name list, select the tablename column from the imported data; for Size, select the numberofRows column; and for Color, select the unused column.

You can add a title to the TreeMap and hide the table and gridlines, after which you will see a great TreeMap as shown below.

[TreeMap visualization of table space usage]

From the above TreeMap, it becomes very easy to see that SalesOrderDetail has the largest number of rows (121,317), while the Person table has the highest unused space (2,128 KB), which makes it a good candidate to shrink.

There is another great post on TreeMap by Chris Webb, who has shown the use of TreeMap with Power Query here.

If you would like to download the Excel Workbook, you can download it from here.

Hope this helps !!!

Parikshit Savjani
Premier Field Engineer

Ebook Review: Windows Azure SQL Reporting Succinctly by Stacia Misner

I recently came across a very useful set of free ebooks and great developer resources from Syncfusion, which you can see for yourself here. Being an MS BI developer, Windows Azure SQL Reporting Succinctly by Stacia Misner grabbed my attention, and I immediately downloaded the ebook and started reading through it.

I started learning SQL Reporting Services with Stacia Misner's SQL 2008 Reporting Services book, and since then I have been a big fan of her books, with their simple, easy-to-understand language coupled with good examples to explain the concepts.

The best part of this book is that it is small, at around 100 pages; I tend to refrain from long ebooks, since we live in a busy world and it is difficult to take time out of our daily routine. Although it is short, it is precise and covers most functional aspects of SQL Reporting on Windows Azure.

The ebook starts well by comparing SQL Reporting Services (on-premises) with SQL Reporting on Windows Azure, and then covers a bit of SQL Azure Database and database migration to SQL Azure, which is required to set up an environment for Windows Azure SQL Reporting.

The book is pretty comprehensive, covering report design, development, management, and security; it covers all aspects of the report development life cycle and is useful for getting Level 200-300 knowledge of Windows Azure SQL Reporting if you are planning to use it as a platform to host your reports in the cloud.

There are some other good developer resources available for free download, which you can find at the following link.

Download Developer Resource (Ebooks) from SyncFusion

Happy Learning !!!

Hope this Helps !!!

Parikshit Savjani
Premier Field Engineer

How to provision Power BI App to your Office 365 Enterprise portal

With an excellent, passionate, and energetic presentation on Power BI by Amir Netz at WPC, there has been a lot of excitement and buzz for Power BI within the BI community (in case you haven't watched the video yet, you can find it here). The community is now eagerly waiting to lay their hands on the Power BI preview.

With the release of the Power BI app in the Windows 8 app store, a mobile app for Windows 8 devices to view Power BI reports hosted on a Power BI site in O365, there was a lot of confusion in the community, which was clarified by AJ Mee in his blog and a video.

After installing the Power BI app on my Windows 8 device, I was eager to upload some of the Power View reports I had designed in Excel 2013 to the Power BI site, which is currently available only with the O365 Enterprise edition; you can opt for the trial version, as mentioned by AJ Mee in his video.

So I decided to try out the Enterprise edition of O365 and provision a Power BI site in my O365 portal. However, I found that it was not easy to configure the Power BI site by yourself, and I encountered the same issue that a couple of community members were hitting, as reported in the MSDN forum here.

Luckily, I was able to find official documentation that was hidden somewhere on office.microsoft.com and not easily accessible; this Word document is a great help in configuring and provisioning the Power BI app for your Office 365 SharePoint portal.

After following the steps in this document, Power BI – Provisioning Guide, I was able to configure the Power BI site for my O365 SharePoint and upload my Power View Excel files, which could then be viewed by the Power BI app on my Windows 8 device.

If you are keen on learning more about Power BI, you can find a great collection of resources put together by Vivek on TechNet Wikis here.

Hope this helps other community members !!!

Parikshit Savjani
Premier Field Engineer