
TSQL Tuesday #80 – AKA Chris Yates’s Birthday

It's the 80th T-SQL Tuesday, and for some reason I decided to write a blog post.  This month's host is a good friend, Chris Yates, and since it's his birthday he decided to give us a gift by opening the topic to "anything".

My topic of choice is simple and relates to Chris's present: SQL Server's birthday.

I've worked with SQL Server in some form for longer than I care to admit, so I thought it would be interesting to write down my memories of each release.  Why? I don't know; I guess because I can, and perhaps, just perhaps, to compare notes with a few other "old-timers".

 

SQL Releases over the years  

SQL 6.5

  • a GUI on a DB? This was sheer madness
  • SQL Enterprise Manager was revolutionary & light years ahead of the competition
  • Corruption, Corruption, Corruption
  • I still miss ISQL_w sometimes (not really)
  • Shipped with actual printed books

SQL 7.0

  • Side-by-side, barely usable installs of SQL 6.5 and 7.0 during upgrades
  • I do miss the standalone Query Analyzer tool
  • DTS… oh the nightmares of DTS, they all start here

SQL 2000

  • Someone put XML in a database… AKA: Patient zero
  • You get a bigint and you get a bigint and you, bigints for everyone!
  • There was a disturbance in the force and SSRS appeared (some say it was SSAS that caused the disturbance, but I disagree)
  • Put multiple instances of SQL on a server they said, it’ll be fun they said…… NOT

SQL 2005

  • SSIS appears, the lights suddenly dim and a clown jumps out of a box and says “Surprise”
  • Upgrade pain like no other; "80" compatibility still haunts many a DBA
  • CLR is banished by DBAs everywhere for being the devil's magic and clearly evil
  • DTA appears and is quickly crowned with a dunce cap
  • SSNS came and went so quickly no one ever saw it
  • TSQL Window functions appear and developers rejoice!
  • BIDS shows up and BI developers' lives are never the same

SQL 2008

  • PBM appears and quickly goes stale
  • Compression of all the things arrives but only if you spend mega bucks
  • DMVs finally became useful
  • Intellisense? What is this dark art that saves me from typing SLECT one more time?
  • AlwaysOn, Alwayson, Always On, who knows what it will be called today, but the moniker appeared and Allan Hirt suddenly had more grey hair

SQL 2008R2

  • The SQL team joined the “R2” Parade and quickly got lost
  • The "what edition of SQL do I need to do X" game gets into high gear
  • PowerName All the things starts with PowerPivot
  • MasterDataServices appears and quickly goes stale

SQL 2012

  • Core licensing appears and the value proposition gets instantly more complicated
  • SSIS was disemboweled (Thankfully)
  • Column Store indexes were created to give Niko something to write about
  • AlwaysOn Availability Groups make DBAs look like superheroes
  • Extended event shaming becomes a thing
  • BIDS? SSDT? SSDT-BI? Data tools? The confusion is maddening

SQL 2014

  • The "what edition do I need" game takes another twist
  • SSMS is still treated like a stepchild
  • Hekaton changes the game (If you meet this teeny tiny list of caveats)
  • Azure is melted into the box product

SQL 2016

  • Someone put JSON in a database… AKA: Patient zero
  • Dynamic data masking and row-level security bring all the auditors to the yard
  • Always Encrypted continues the naming pattern: "Always" or "Power" all the things!
  • Query Store appears and DBAs the world over are immediately confused by this black magic
  • Temporal tables appear and SQL Server catches up with Postgres circa 2012
  • SSMS starts getting major love, and everyone wonders if it's a trick
  • SSDT is… wait…for…it… a SINGLE tool finally!

Future SQL Server

  • Can we get indexes on function expressions? *pretty please* If a mainframe can do it, surely SQL can too
  • Banish the message “string or binary data may be truncated” to the history books
  • How about a tiny touch of love for PowerShell (those guys yell loudly enough that PowerShell can cure all my ills; how about giving them a proper hammer)
  • An edition based on the size of the server: pay for the T-SQL features you use based on server/database size *things like Hekaton on small DBs* in a new "AlwaysPower" edition
  • *basic* Scalar UDF optimization to cache deterministic values

What did I miss?  Wait, I know I missed a lot, so don't answer that.

Extending automation of SQL recoveries using Ola Hallengren’s backup solution

"Automation is inherently good."  This might be one of the only phrases you can get all DBAs to agree on without that dreaded "it depends" that we DBAs like to throw around so much.

This month's T-SQL Tuesday is about automation, and I thought I'd write about extending existing automation.  One of my favorite scripts for automation is Ola Hallengren's Backup & Maintenance solution.  Ola's scripts are a fantastic way to automate highly configurable backups and maintenance on your SQL Server instances.  If you're not using them, you should seriously look into why not.

This solution serves as an outstanding base, but like anything else it can be useful to tweak things a bit.  Extending the initial automation provided by his scripts is what this post is all about.

In particular, I've modified Ola's scripts to generate the files needed to restore all of the databases that have been backed up with his solution, the main goal being the ability to easily restore the whole server in case of a disaster (though you could easily pull out a single database and restore only it).  This script is currently written only for LiteSpeed, since that's what I use for backups; however, it could easily be changed to support native backups or any of the other backup products that Ola's scripts can be configured for.  Perhaps I'll work on those in the future if it would be useful.

The idea is that every time you take a backup, the backup job will create a .sql file in the backup directory on the server's filesystem that can be used to restore to the point of the backups that were just taken.

This solution includes three pieces: an additional stored procedure, an additional step in both of the backup jobs to execute that stored procedure, and lastly a step in the cleanup job to remove restore scripts that have aged off the filesystem.

A couple of notes of caution:

As with anything you find on the internet, please use at your own risk, try it first in a development/test system, and proceed with caution.

This script makes several assumptions, including:

  • That you’ve installed Ola’s commands into the master database
  • That you're using LiteSpeed
  • That logging to the CommandLog table is enabled

The stored procedure is relatively simple and accepts a single parameter, @type.  "LOG" will generate the script as of the last log backup taken; for any other value (I happen to use "FULL"), it generates the script based on the last full backup.

CREATE PROCEDURE [dbo].[GenerateRestoreScript] (@type NCHAR(30) = 'LOG')
AS 
DECLARE @ID INT
DECLARE @DB NVARCHAR(128)

SET NOCOUNT ON

SELECT  @ID = MAX(database_id)
FROM    sys.databases

IF @type = 'LOG' SET @type = 'xp_backup_log' ELSE SET @type = ''
--These lines are intentionally left uncommented in the generated script as a precaution (so the script raises an error if run unmodified)
        SELECT 'ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-'
        SELECT 'PLEASE BACKUP THE TAIL OF THE LOG'
        SELECT 'OTHERWISE THIS COULD BECOME AN RGE (GOOGLE THE ACRONYM!)'
        SELECT 'IF YOU ARE OK REPLACING THE DB AND LOSING DATA IN THE TAIL LOG RUN THIS SCRIPT WITHOUT THESE COMMENTS'
        SELECT 'RAISERROR(N''ARE YOU SURE YOU WANT TO DO THIS?'', 25,1) WITH Log;'      
        SELECT '--------'

WHILE @ID > 2
    BEGIN

        SELECT  @DB = NAME
        FROM    sys.databases
        WHERE   database_id = @ID

        SELECT  @ID = @ID - 1

        SELECT '----' + @DB + '-----------------------------------------------------'  

        SELECT 'EXECUTE ' 
        + REPLACE(Command, '_backup_', '_restore_')
        + ', @filenumber = 1, @with = N''' 
        + CASE WHEN rn <> 1 THEN 'NO' ELSE '' END 
        +  'RECOVERY'''
        + CASE WHEN CommandType = 'xp_backup_database' THEN ', @with = N''REPLACE'';' ELSE ';' END

         FROM 
        (
        SELECT 
            SUBSTRING(LEFT (Command, CHARINDEX(''', @with =',Command)),CHARINDEX('[master]',Command),LEN(Command)) AS Command
            , ROW_NUMBER() OVER (ORDER BY cl.ID DESC) AS rn
            , CommandType
        FROM    [master].[dbo].[CommandLog] cl
        WHERE   cl.DatabaseName = @DB
                AND (cl.CommandType = 'xp_backup_database' OR cl.CommandType = @type)

                AND cl.ID >= ( SELECT   MAX(ID)
                               FROM     CommandLog c
                               WHERE    CommandType IN ( 'xp_backup_database' )
                                        AND cl.DatabaseName = c.DatabaseName
                             )
        ) AS rntab

        ORDER BY rn DESC                     

    END
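To make the output concrete, here is roughly what a generated restore script might look like for a single database.  This is a sketch only: the database name and file paths are made up, and the exact @with options come from whatever LiteSpeed commands were logged in the CommandLog table on your server.

```sql
-- Hypothetical output of EXEC dbo.GenerateRestoreScript @type = 'LOG'
-- (database name and file paths are illustrative only)
----MyDb-----------------------------------------------------
EXECUTE [master].dbo.xp_restore_database @database = N'MyDb',
    @filename = N'H:\SERVERNAME\MyDb\FULL\MyDb_FULL.bak',
    @filenumber = 1, @with = N'NORECOVERY', @with = N'REPLACE';
EXECUTE [master].dbo.xp_restore_log @database = N'MyDb',
    @filename = N'H:\SERVERNAME\MyDb\LOG\MyDb_LOG.trn',
    @filenumber = 1, @with = N'RECOVERY';
```

Note how the last backup in the chain gets RECOVERY while everything before it gets NORECOVERY, and only the full restore carries the REPLACE option, matching the CASE logic in the procedure above.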

To execute the stored procedure, add the following as an additional CmdExec job step to the full backup job (make sure to change the directory where you want the .sql files stored; H:\SERVERNAME below):

sqlcmd -l 30 -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -y 0 -b -Q "EXEC [dbo].[GenerateRestoreScript] 'FULL'" -o"H:\SERVERNAME\DRFULL_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM))_RESTORE.sql" -w50000

Likewise, add the following as an additional CmdExec job step to the transaction log backup job (again, change the directory where you want the .sql files stored; H:\SERVERNAME below):

sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -y 0 -b -Q "EXEC [dbo].[GenerateRestoreScript]" -o"H:\SERVERNAME\DRLOG_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM))_RESTORE.sql" -w50000

This CmdExec job step needs to be added to the output file cleanup job to remove old .sql files (make sure to change the directory where the .sql files are stored; H:\SERVERNAME below).

Note: as configured, this keeps the files from the past 3 days, but exactly which files are kept depends on when the cleanup job is scheduled.

cmd /q /c "For /F "tokens=1 delims=" %v In ('ForFiles /P "H:\SERVERNAME" /m *RESTORE.sql /d -3 2^>^&1') do if EXIST "H:\SERVERNAME"\%v echo del "H:\SERVERNAME"\%v& del "H:\SERVERNAME"\%v"

I have these steps scripted into Ola's original solution .sql so the folder names are set properly and job creation is completely automated.  I'll leave that part of extending automation to you, dear reader, as homework.

Finding impersonation info in SQL Server

Since SQL Server 2005, we've been able to grant one login the ability to impersonate another in order to get a different set of effective permissions.  More than a few times I've found this paradigm useful for solving problems while adhering to the principle of least privilege, and likewise more than a few times I've tried to find information in SSMS about who has impersonation privileges, only to be severely disappointed at how deeply it is buried.  This extremely simple script solves that problem and returns all of the information about who has impersonation rights on an instance.

 

If you want to read more about all the fantastic things you can do with EXECUTE AS and impersonation, MSDN is a good place to start.
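As a quick refresher before the script, here is a minimal sketch of granting and using login-level impersonation.  The login names and passwords are entirely made up for illustration; try this somewhere safe first.

```sql
-- Hypothetical logins for illustration only
CREATE LOGIN [LowPrivLogin]  WITH PASSWORD = N'SomeStrongPassword!1';
CREATE LOGIN [HighPrivLogin] WITH PASSWORD = N'SomeStrongPassword!2';

-- Allow LowPrivLogin to impersonate HighPrivLogin
GRANT IMPERSONATE ON LOGIN::[HighPrivLogin] TO [LowPrivLogin];

-- Later, while connected as LowPrivLogin, switch execution context...
EXECUTE AS LOGIN = N'HighPrivLogin';
SELECT SUSER_SNAME();   -- now reports HighPrivLogin
-- ...and switch back when done
REVERT;
```

It's exactly this kind of GRANT IMPERSONATE permission that the script below digs out of the server catalog.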

SELECT  grantee_principal.name AS WhoCanImpersonate ,
        grantee_principal.type_desc AS ImpersonatorType ,
        sp.name AS WhoCanTheyImpersonate ,
        sp.type_desc AS ImpersonateeLoginType

FROM    sys.server_permissions AS prmssn
        INNER JOIN sys.server_principals AS sp
            ON sp.principal_id = prmssn.major_id
               AND prmssn.class = 101
        INNER JOIN sys.server_principals AS grantee_principal
            ON grantee_principal.principal_id = prmssn.grantee_principal_id
WHERE   prmssn.state = 'G'

Adventures in SSRS

I came across a strange permission issue today that I thought I'd blog about, since judging by the number of unanswered forum posts it is apparently somewhat common.

Adding, editing, or testing SSRS data sources in Report Manager generates a permission error

A developer was trying to add a data source and use "Test Connection" in the Report Manager web interface instead of BIDS (which, incidentally, worked).  When they tried to save or test the data source, an error occurred: "The permissions granted to user "Domain\User" are insufficient for performing this operation. (rsAccessDenied)".  After much web searching and digging through verbose SSRS logs, SQL traces, Windows event logs, Process Monitor traces, etc., I had just about given up hope and turned to Twitter to vent about the error.  Luckily, Chris Testa-O'Neill, Doug Lane, and Tamera Clark came to my rescue and headed me down the right path: this was an SSRS catalog permission issue (thanks again!).

Environment Details

SSRS 2008 R2

Folder inheritance is broken, and the developer account has Browser permissions on the root folder as well as Browser, Publisher, and Report Builder permissions on the sub-folder.  There are no system-level permissions for the account.

The Problem

In the SSRS error log I found the following messages, which are only slightly more helpful:

System.Web.Services.Protocols.SoapException: System.Web.Services.Protocols.SoapException: The permissions granted to user ‘Domain\User’ are insufficient for performing this operation. —> Microsoft.ReportingServices.Diagnostics.Utilities.AccessDeniedException: The permissions granted to user ‘Domain\User’ are insufficient for performing this operation.
at Microsoft.ReportingServices.WebServer.ReportingService2010Impl.TestConnectForDataSourceDefinition(DataSourceDefinition DataSourceDefinition, String UserName, String Password, String& ConnectError)…

The command is failing on the “TestConnectForDataSourceDefinition” method.  The permissions needed are those required by this method.

The Fix

Near the bottom of the documentation for the TestConnectForDataSourceDefinition method, the required permission is listed as "Execute report definition".

Granting this permission is accomplished by connecting to the SSRS server with SSMS and creating a new system role with just this permission.

Then grant the new role the "Execute Report Definitions" permission.

Finally, add the user account to the new system role; this is done in Report Manager under Site Settings -> Security -> New Role Assignment.

It was 2 good years

The voters have spoken, and I will not be returning to the PASS board as an elected director in 2013.  A week ago I wrote a post about campaigning being hard, and in it I wrote something that I believe and keep coming back to:

I’ve spent many thousands of hours of my life serving the PASS organization and want the opportunity to spend a few hundred thousand more in the same role over the next two years. However, if by chance the community wants to see someone else in my seat on the board I can live with that, there aren’t any bad choices.

As much time and energy as I've put into PASS over the years, this outcome stung a bit personally, but I've come to accept the reality of it.  Thanks to everyone who supported me in this election cycle; I was truly humbled to receive your support.

Congrats to JRJ, Sri, and Wendy.  I'm sure y'all will do a great job keeping the organization moving forward on the right track.
