Archive for year 2014
This month's T-SQL Tuesday #61 is hosted by Wayne Sheffield on Giving Back to the SQL Server Community. It's an interesting topic given the season, and something that is near and dear to me. Here's his original post: http://blog.waynesheffield.com/wayne/archive/2014/12/t-sql-tuesday-61-giving-back/
It seems like I have a knack for doing things differently, and my time giving back to the SQL community has largely followed that path! In 2015 I'll be continuing my "different" trend of community impact through PASS, moving from the international/national level to the local one!
As some of you might know, I've spent a good portion of the last nine (or is it ten?) years involved with PASS in a number of different roles: working on the precursor to our current Virtual Chapters, volunteering with the Program Committee for many years, serving on the NomCom multiple times, running a SQL Saturday, and even being a Board Member. I have experience in almost every area of PASS, and I have grown both personally and professionally in those roles. The community's growth over all these years always inspires me to do more, and I don't see this year being any different.
Even though I don't hold an official role with PASS (at this moment in time), I will still be giving back to the SQL Server community in 2015. I plan on starting a local user group in South Houston. I admit that I should have started this at least six months ago, but I've put it off. 2015 will be the year the Houston area gets a second user group, and if you want to attend or help, let me know! In addition, I will be helping to host a SQL Saturday in Houston early next summer. I will also commit now to making actual use of this SQL blog in 2015 instead of it being a landing page for what has to be the world's largest pile of spam (thanks, Akismet!). Whenever there is a discussion on PASS and its future, you can be sure that I will be involved on some level. I hope to encourage others to volunteer as I have and to foster a new group of leaders in the SQL Server community.
Thanks again to everyone who participates in the SQL community; you all are really what it's all about!
I’m running for another PASS elected position
This is a simple post to say that yes, you read the NomCom ballot correctly, and yes, I've decided to run for the NomCom this year instead of the PASS Board of Directors. After last year, when there were only three candidates for the NomCom, it was quite refreshing to see eleven for this year's slate! I don't intend to spill much digital ink on why you should vote for me, since I spilled so much last year over the Board of Directors election.
None of this is to say that I'm uninterested or don't want to be elected, but I don't agree with the concept of an extremely active campaign to win a seat on the NomCom.
This month's T-SQL Tuesday is about automation, and I thought I'd write about extending existing automation. One of my favorite tools for automation is Ola Hallengren's backup and maintenance solution. Ola's scripts are a fantastic way to automate highly configurable backups and maintenance on your SQL Server instances. If you're not using them, you should seriously consider why not.
This solution serves as an outstanding base, but like anything else it can be useful to tweak things a bit. Extending the initial automation provided by his scripts is what this post is all about.
Specifically, I've modified Ola's scripts to generate the files needed to restore all of the databases that his solution has backed up, with the goal of being able to easily restore the whole server in the case of a disaster (though you could just as easily pull out a single database and restore only that). The script is currently written only for LiteSpeed, since that's what I use for backups, but it could be changed to support native backups or any of the other backup products Ola's scripts can be configured for. Perhaps I'll work on those in the future if it would be useful.
The idea is that every time you take a backup the backup job will create a .sql file on the server filesystem in the backup directory that can be used to restore to the point of the backups that were just taken.
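To make that idea concrete, here is a sketch of what one of the generated restore files might look like. Everything here is illustrative: the database name, file paths, and timestamps are made up, and the exact LiteSpeed parameters will match whatever Ola's jobs logged on your server. Note that the warning lines at the top are deliberately not valid T-SQL, so running the file blindly fails with errors:

```sql
-- Hypothetical contents of a generated DRLOG_*.sql file (names invented for illustration).
-- The uncommented ALERT lines below are emitted on purpose so the script
-- errors out unless you consciously review and remove them first.
ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-
PLEASE BACKUP THE TAIL OF THE LOG
--------
----MyDatabase-----------------------------------------------------
EXECUTE [master].dbo.xp_restore_database @database = N'MyDatabase',
    @filename = N'H:\SERVERNAME\MyDatabase\FULL\MyDatabase_FULL_20141201.bak',
    @filenumber = 1, @with = N'NORECOVERY', @with = N'REPLACE';
EXECUTE [master].dbo.xp_restore_log @database = N'MyDatabase',
    @filename = N'H:\SERVERNAME\MyDatabase\LOG\MyDatabase_LOG_20141201_0100.trn',
    @filenumber = 1, @with = N'RECOVERY';
```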
This solution includes three pieces: an additional stored procedure, an additional step in each of the two backup jobs to execute that stored procedure, and lastly a step in the cleanup job to remove restore scripts that have aged off the filesystem.
A couple of notes of caution:
As with anything you find on the internet, please use at your own risk in a development/test system and proceed with caution.
This script makes several assumptions, including:
- That you’ve installed Ola’s commands into the master database
- That you're using LiteSpeed
- That logging to the CommandLog table is enabled
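If you want to sanity-check those assumptions before installing the procedure, a quick query against the logging table (assuming the default master.dbo.CommandLog location from Ola's solution) is enough:

```sql
-- Confirm that Ola's CommandLog table exists in master and that LiteSpeed
-- backup commands are actually being logged to it.
SELECT TOP (10) ID, DatabaseName, CommandType, StartTime
FROM [master].[dbo].[CommandLog]
WHERE CommandType IN ('xp_backup_database', 'xp_backup_log')
ORDER BY ID DESC;
```

If this returns no rows, either logging is disabled or the jobs are configured for a different backup product, and the restore-script generation below will produce empty output.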
The stored procedure is relatively simple and accepts a single parameter, @type. Passing 'LOG' generates the script as of the last log backup taken; any other value (I happen to use 'FULL') generates the script based on the last full backup.
CREATE PROCEDURE [dbo].[GenerateRestoreScript] (@type NCHAR(30) = 'LOG')
AS
DECLARE @ID INT
DECLARE @DB NVARCHAR(128)

SET NOCOUNT ON

SELECT @ID = MAX(database_id) FROM sys.databases

IF @type = 'LOG'
    SET @type = 'xp_backup_log'
ELSE
    SET @type = ''

-- These lines are intentionally NOT commented out in the generated script,
-- as a precaution (so the script raises errors unless they are removed)
SELECT 'ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-ALERT-'
SELECT 'PLEASE BACKUP THE TAIL OF THE LOG'
SELECT 'OTHERWISE THIS COULD BECOME AN RGE (GOOGLE THE ACRONYM!)'
SELECT 'IF YOU ARE OK REPLACING THE DB AND LOSING DATA IN THE TAIL LOG, RUN THIS SCRIPT WITHOUT THESE COMMENTS'
SELECT 'RAISERROR(N''ARE YOU SURE YOU WANT TO DO THIS?'', 25, 1) WITH LOG;'
SELECT '--------'

WHILE @ID > 2
BEGIN
    SELECT @DB = name FROM sys.databases WHERE database_id = @ID
    SELECT @ID = @ID - 1

    SELECT '----' + @DB + '-----------------------------------------------------'

    SELECT 'EXECUTE ' + REPLACE(Command, '_backup_', '_restore_')
         + ', @filenumber = 1, @with = N'''
         + CASE WHEN rn <> 1 THEN 'NO' ELSE '' END + 'RECOVERY'''
         + CASE WHEN CommandType = 'xp_backup_database'
                THEN ', @with = N''REPLACE'';'
                ELSE ';'
           END
    FROM (
        SELECT SUBSTRING(LEFT(Command, CHARINDEX(''', @with =', Command)),
                         CHARINDEX('[master]', Command), LEN(Command)) AS Command,
               ROW_NUMBER() OVER (ORDER BY cl.ID DESC) AS rn,
               CommandType
        FROM [master].[dbo].[CommandLog] cl
        WHERE cl.DatabaseName = @DB
          AND (cl.CommandType = 'xp_backup_database' OR cl.CommandType = @type)
          AND cl.ID >= ( SELECT MAX(ID)
                         FROM CommandLog c
                         WHERE CommandType IN ('xp_backup_database')
                           AND cl.DatabaseName = c.DatabaseName )
    ) AS rntab
    ORDER BY rn DESC
END
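For a quick manual test outside the jobs, you can run the procedure directly in a query window and eyeball the output; the results come back as rows, which is why the job steps that follow use sqlcmd to write them out to a file:

```sql
-- Generate restore commands up to the last FULL backup only
EXEC [dbo].[GenerateRestoreScript] @type = 'FULL';

-- Generate restore commands up to the most recent log backup (the default)
EXEC [dbo].[GenerateRestoreScript] @type = 'LOG';
```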
To execute the stored procedure, add this as an additional CmdExec job step to the full backup job (make sure to change the directory where you want the .sql files stored; H:\SERVERNAME below):
sqlcmd -l 30 -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -y 0 -b -Q "EXEC [dbo].[GenerateRestoreScript] 'FULL'" -o"H:\SERVERNAME\DRFULL_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM))_RESTORE.sql" -w50000
To execute the stored procedure, add this as an additional CmdExec job step to the transaction log backup job (make sure to change the directory where you want the .sql files stored; H:\SERVERNAME below):
sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -y 0 -b -Q "EXEC [dbo].[GenerateRestoreScript]" -o"H:\SERVERNAME\DRLOG_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM))_RESTORE.sql" -w50000
This CmdExec job step needs to be added to the output file cleanup job to clean up old .sql files (make sure to change the directory where the .sql files are stored; H:\SERVERNAME below).
Note: as configured, this keeps the files from the past three days, but exactly which files are kept depends on when the cleanup job is scheduled.
cmd /q /c "For /F "tokens=1 delims=" %v In ('ForFiles /P "H:\SERVERNAME" /m *RESTORE.sql /d -3 2^>^&1') do if EXIST "H:\SERVERNAME"\%v echo del "H:\SERVERNAME"\%v& del "H:\SERVERNAME"\%v"
I have these steps scripted into Ola's original solution .sql file so the folder names are set properly and job creation is completely automated. I'll leave that part of extending the automation to you, dear reader, as homework.