Home page: http://allenkinsel.com
Posts by Allen Kinsel
It's the 80th T-SQL Tuesday and for some reason I decided to write a blog post. This month's host is a good friend, Chris Yates, and since it's his birthday he decided to give us a gift by opening the topic to "anything".
My topic of choice is simple and relates to Chris's present: SQL Server's birthday.
I've worked with SQL Server in some form for longer than I care to admit, so I thought it would be interesting to write down my memories of each release. Why? I don't know; I guess because I can, and perhaps, just perhaps, to compare notes with a few other "old-timers".
SQL Releases over the years
- a GUI on a DB? This was sheer madness
- SQL Enterprise manager was revolutionary & light years ahead of the competition
- Corruption, Corruption, Corruption
- I still miss ISQL_w sometimes (not really)
- Shipped with actual printed books
- Side by side unusable installs of SQL 6.5 and 7.0 during upgrades
- I do miss the standalone Query Analyzer tool
- DTS… oh the nightmares of DTS, they all start here
- Someone put XML in a database… AKA: Patient zero
- You get a bigint and you get a bigint and you, bigints for everyone!
- There was a disturbance in the force and SSRS appeared (Some say it was SSAS that caused the disturbance, but I disagree)
- Put multiple instances of SQL on a server they said, it’ll be fun they said…… NOT
- SSIS appears, the lights suddenly dim and a clown jumps out of a box and says “Surprise”
- Upgrade pain like no other “80” compatibility still haunts many-a-DBA
- CLR is banished by DBAs everywhere for being the devil's magic and clearly evil
- DTA appears and is quickly crowned with a dunce cap
- SSNS came and went so quickly no one ever saw it
- TSQL Window functions appear and developers rejoice!
- BIDS shows up and BI developers' lives are never the same
- PBM appears and quickly goes stale
- Compression of all the things arrives but only if you spend mega bucks
- DMV’s finally became useful
- intellisense? what is this dark art that saves me from typing SLECT 1 more time
- AlwaysOn, Alwayson, Always on, Who knows what it will be called today but the moniker appeared and Allan Hirt suddenly had more grey hair
- The SQL team joined the “R2” Parade and quickly got lost
- The "what edition of SQL do I need to do X" game gets into high gear
- PowerName All the things starts with PowerPivot
- MasterDataServices appears and quickly goes stale
- Core licensing appears and the value proposition gets instantly more complicated
- SSIS was disemboweled (Thankfully)
- Column Store indexes were created to give Niko something to write about
- AlwaysOn Availability Groups make DBAs look like superheroes
- Extended event shaming becomes a thing
- BIDS? SSDT? SSDT-BI? Data tools? The confusion is maddening
- The "what edition do I need" game takes another twist
- SSMS is still treated like a stepchild
- Hekaton changes the game (If you meet this teeny tiny list of caveats)
- Azure is melted into the box product
- Someone put JSON in a database… AKA: Patient zero
- Dynamic data masking and Row-level security bring all the auditors to the yard
- AlwaysEncrypted continues the pattern of “Always or Power” all the names!
- QueryStore appears and DBAs the world over are immediately confused by this black magic
- Temporal tables appear and SQL Server catches up with postgres circa 2012
- SSMS starts getting major love, everyone wonders if it's a trick
- SSDT is… wait…for…it… a SINGLE tool finally!
Future SQL Server
- Can we get indexes on functions/expressions? *pretty please* If a mainframe can do it, surely SQL can too
- Banish the message “string or binary data may be truncated” to the history books
- How about a tiny touch of love for PowerShell (those folks yell loudly enough that PowerShell can cure all my ills; how about giving them a proper hammer)
- An edition based on size of server: pay for the T-SQL features you use based on server/database size *things like Hekaton on small DBs* in a new "AlwaysPower" edition
- *basic* Scalar UDF optimization to cache deterministic values
What did I miss? Wait, I know I missed a lot, so don't answer that.
This month's T-SQL Tuesday #61 is hosted by Wayne Sheffield on Giving Back to the SQL Server Community. An interesting topic given the season, and something that is near and dear. Here's his original post http://blog.waynesheffield.com/wayne/archive/2014/12/t-sql-tuesday-61-giving-back/
It seems like I have a knack for doing things ~~backwards~~ differently, and my time giving to the SQL Community has largely followed that path! In 2015 I'll be continuing my "different" trend of community impact through PASS, from international/national to local!
As some of you might know, I've spent a good portion of the last nine (or is it ten?) years involved with PASS in a number of different roles. From working on the precursor to our current Virtual Chapters, to volunteering with the Program Committee for many years, to serving on the NomCom multiple times, running a SQL Saturday, and even being a Board Member, I have experience in almost every area of PASS. I have grown both personally and professionally during the time that I've spent in those roles. The overall community's growth in all these years always inspires me to do more, and I don't see this year being any different.
Even though I don't hold an official role with PASS (at this moment in time), I will still be giving back to the SQL Server community in 2015. I plan on starting a local user group in South Houston. I admit that I should have started this at least 6 months ago, but I've put it off. 2015 will be the year the Houston area gets a second user group, and if you want to attend or help, let me know! In addition, I will be helping to host a SQL Saturday in Houston early next summer. I will also commit now to making actual use of this SQL blog in 2015 instead of it being a landing page for what has to be the world's largest amount of SPAM (thanks, Akismet!). Whenever there is a discussion on PASS and its future, you can be sure that I will be involved in it on some level. I hope to encourage others to volunteer as I have, and foster a new group of leaders in the SQL Server community.
Thanks again to everyone who participates in the SQL Community, you all are really what it's all about!
I’m running for another PASS elected position
This is a simple post to say that yes, you read the NomCom ballot correctly and YES, I've decided to run for the NomCom this year instead of the PASS Board of Directors. After last year, with only 3 candidates for the NomCom, it was quite refreshing to see 11 for this year's NomCom! I don't intend to spill much digital ink about why you should vote for me, since I spilled so much last year over the Board of Directors election.
None of this is to say that I'm uninterested or don't want to get elected, but I don't agree with the concept of an extremely active campaign to win a seat on the NomCom.
This month's T-SQL Tuesday is about automation, and I thought I'd write about extending existing automation. One of my favorite scripts for automation is Ola Hallengren's Backup & Maintenance solution. Ola's scripts are a fantastic way to automate highly configurable backups and maintenance on your SQL Server instances. If you're not using them, you should seriously consider looking into why.
This solution serves as an outstanding base, but like anything else, it can be useful to tweak things a bit. Extending the initial automation provided by his scripts is what this post is all about.
In particular, I've modified Ola's scripts to generate the files needed to restore all of the databases that have been backed up with his solution, giving you the ability to easily restore the whole server in the case of a disaster (though you could just as easily pull out a single database's restore). This script is currently written only for LiteSpeed, since that's what I use for backups. However, it could easily be changed to support native backups or any of the other backup products that Ola's scripts can be configured for. Perhaps I'll work on those in the future if it would be useful.
The idea is that every time you take a backup the backup job will create a .sql file on the server filesystem in the backup directory that can be used to restore to the point of the backups that were just taken.
This solution includes three pieces: an additional stored procedure, an additional step in both of the backup jobs to execute that stored procedure, and lastly a step in the cleanup job to remove restore scripts that have aged off the filesystem.
A couple of notes of caution:
As with anything you find on the internet, please use at your own risk in a development/test system and proceed with caution.
This script makes several assumptions, including:
- That you’ve installed Ola’s commands into the master database
- That you're using LiteSpeed
- That logging to the commandlog table is enabled
The stored procedure is relatively simple and accepts a single parameter, @type. Passing 'LOG' generates the script as of the last log backup taken; any other value (I happen to use 'FULL') generates the script based on the last full backup.
CREATE PROCEDURE [dbo].[GenerateRestoreScript] (@type NCHAR(30) = 'LOG')
AS
DECLARE @ID INT
DECLARE @DB NVARCHAR(128)

SET NOCOUNT ON

SELECT @ID = MAX(database_id) FROM sys.databases

IF @type = 'LOG'
    SET @type = 'xp_backup_log'
ELSE
    SET @type = ''

--These are intentionally not commented out in the generated script as a precaution (to generate an error)
SELECT 'ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-'
SELECT 'PLEASE BACKUP THE TAIL OF THE LOG'
SELECT 'OTHERWISE THIS COULD BECOME A RGE (GOOGLE THE ACRONYM!)'
SELECT 'IF YOU ARE OK REPLACING THE DB AND LOSING DATA IN THE TAIL LOG RUN THIS SCRIPT WITHOUT THESE COMMENTS'
SELECT 'RAISERROR(N''ARE YOU SURE YOU WANT TO DO THIS?'', 25,1) WITH Log;'
SELECT '--------'

WHILE @ID > 2
BEGIN
    SELECT @DB = name FROM sys.databases WHERE database_id = @ID
    SELECT @ID = @ID - 1

    SELECT '----' + @DB + '-----------------------------------------------------'

    SELECT 'EXECUTE ' + REPLACE(Command, '_backup_', '_restore_')
        + ', @filenumber = 1, @with = N'''
        + CASE WHEN rn <> 1 THEN 'NO' ELSE '' END + 'RECOVERY'''
        + CASE WHEN CommandType = 'xp_backup_database' THEN ', @with = N''REPLACE'';' ELSE ';' END
    FROM (
        SELECT SUBSTRING(LEFT(Command, CHARINDEX(''', @with =', Command)), CHARINDEX('[master]', Command), LEN(Command)) AS Command
            , ROW_NUMBER() OVER (ORDER BY cl.ID DESC) AS rn
            , CommandType
        FROM [master].[dbo].[CommandLog] cl
        WHERE cl.DatabaseName = @DB
            AND (cl.CommandType = 'xp_backup_database' OR cl.CommandType = @type)
            AND cl.ID >= (
                SELECT MAX(ID)
                FROM CommandLog c
                WHERE CommandType IN ('xp_backup_database')
                    AND cl.DatabaseName = c.DatabaseName
            )
    ) AS rntab
    ORDER BY rn DESC
END
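To make the output concrete, here's a sketch of what a generated restore script might look like for a single database. The database name and file paths here are invented for illustration; the actual commands come straight out of your CommandLog table, with `_backup_` swapped for `_restore_`:

```sql
-- Hypothetical output for a database named [SalesDB] (names/paths are examples only)
----SalesDB-----------------------------------------------------
-- Full backup restores first, left in NORECOVERY so logs can be applied
EXECUTE [master].dbo.xp_restore_database @database = N'SalesDB',
    @filename = N'H:\SERVERNAME\SalesDB\FULL\SalesDB_FULL.bak',
    @filenumber = 1, @with = N'NORECOVERY', @with = N'REPLACE';
-- The most recent log backup (rn = 1) is restored last, WITH RECOVERY
EXECUTE [master].dbo.xp_restore_log @database = N'SalesDB',
    @filename = N'H:\SERVERNAME\SalesDB\LOG\SalesDB_LOG_1.trn',
    @filenumber = 1, @with = N'RECOVERY';
```

Note how the ORDER BY rn DESC in the procedure puts the full restore first and the latest log last, so the database only comes online after the final log is applied.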
To execute the stored procedure, this needs to be added as an additional cmdexec job step to the Full backup job (make sure to change the directory where you want the .sql files stored (H:\SERVERNAME below))
sqlcmd -l 30 -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -y 0 -b -Q "EXEC [dbo].[GenerateRestoreScript] 'FULL'" -o"H:\SERVERNAME\DRFULL_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM))_RESTORE.sql" -w50000
To execute the stored procedure, this needs to be added as an additional cmdexec job step to the Transaction log backup job (make sure to change the directory where you want the .sql files stored (H:\SERVERNAME below))
sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -y 0 -b -Q "EXEC [dbo].[GenerateRestoreScript]" -o"H:\SERVERNAME\DRLOG_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM))_RESTORE.sql" -w50000
This cmdexec job step needs to be added to the output file cleanup job to clean up old .sql files (make sure to change the directory where the .sql files are stored (H:\SERVERNAME below)).
Note: currently this configuration keeps the files from the past 3 days, but the actual files kept depend on when the cleanup job is scheduled.
cmd /q /c "For /F "tokens=1 delims=" %v In ('ForFiles /P "H:\SERVERNAME" /m *RESTORE.sql /d -3 2^>^&1') do if EXIST "H:\SERVERNAME"\%v echo del "H:\SERVERNAME"\%v& del "H:\SERVERNAME"\%v"
I have these steps scripted into Ola's original solution .sql so the folder names are set properly and job creation is completely automated. I'll leave that part of extending automation to you, dear reader, as homework.
Since SQL Server 2005 we've been able to grant one login the ability to impersonate another in order to have a different set of effective permissions. More than a few times I've found this paradigm useful for solving problems while adhering to the principle of least privilege. Likewise, more than a few times I've tried to find information about who has impersonation privileges in SSMS, and have been severely disappointed to find that it is buried. This extremely simple script solves that problem and gives all of the information about who has impersonation rights on an instance.
If you want to read more about all the fantastic things you can do with EXECUTE AS and impersonation, MSDN is a good place to start.
SELECT grantee_principal.name AS WhoCanImpersonate
    , grantee_principal.type_desc AS ImpersonatorType
    , sp.name AS WhoCanTheyImpersonate
    , sp.type_desc AS ImpersonateeLoginType
FROM sys.server_permissions AS prmssn
INNER JOIN sys.server_principals AS sp
    ON sp.principal_id = prmssn.major_id
    AND prmssn.class = 101
INNER JOIN sys.server_principals AS grantee_principal
    ON grantee_principal.principal_id = prmssn.grantee_principal_id
WHERE prmssn.state = 'G'
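For completeness, here's a minimal sketch of setting up the kind of impersonation the query above reports on. The login names are invented for illustration; only run something like this on a test instance:

```sql
-- Two hypothetical logins (example names and passwords only)
CREATE LOGIN LowPrivLogin WITH PASSWORD = 'SomeStr0ngP@ssword!';
CREATE LOGIN HighPrivLogin WITH PASSWORD = 'An0therStr0ngP@ssword!';

-- Allow LowPrivLogin to impersonate HighPrivLogin;
-- this grant is what shows up with state = 'G' in sys.server_permissions
GRANT IMPERSONATE ON LOGIN::HighPrivLogin TO LowPrivLogin;

-- As LowPrivLogin, temporarily take on HighPrivLogin's effective permissions
EXECUTE AS LOGIN = 'HighPrivLogin';
SELECT SUSER_SNAME();   -- reports the impersonated login
REVERT;                 -- return to the original security context
```

After running the GRANT, the audit query would list LowPrivLogin under WhoCanImpersonate and HighPrivLogin under WhoCanTheyImpersonate.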