Previously, I posted a query that can be used to determine the filename, date/time and LSN information for the last (or any) applied log. The following is a stored procedure that can be used as an alternative to the DTS method we have used, whereby a single log is applied at any one time. This method allows successive log backups to be taken even when an earlier one has not yet been applied.
In this method we create two jobs, the first on the principal server and the second on the standby. Their purpose is to back up and restore logs independently of each other.
Job #1: Principal - Every 15 minutes, create a log backup on a network share
Job #2: Standby - Every 5 minutes, execute spLogShipRestorer
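Job #1 can be a single BACKUP LOG statement. The sketch below is an example, not the exact job we use: it assumes a share path of \\standby\logs\ and a filename convention of LOG_<dbname>_<timestamp>.bak, which matches the pattern spLogShipRestorer searches for. The timestamp must sort chronologically as text, because the procedure applies files in alphabetical order.

-- Job #1 (principal): back up the log to the network share.
-- Filename format: LOG_MyDB_yyyymmddhhmmss.bak, so that an
-- alphabetical sort of filenames is also a chronological sort.
DECLARE @fname SYSNAME
SELECT @fname = '\\standby\logs\LOG_MyDB_'
    + CONVERT(CHAR(8), GETDATE(), 112)                          -- yyyymmdd
    + REPLACE(CONVERT(CHAR(8), GETDATE(), 108), ':', '')        -- hhmmss
    + '.bak'
BACKUP LOG MyDB TO DISK = @fname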
Depending on the size and activity of your database you may want to change the interval of each job; on our live database, which generates 200MB logs every 15 minutes, we have decreased the backup interval to 5 minutes.
The spLogShipRestorer definition follows:
CREATE PROCEDURE spLogShipRestorer @dbname SYSNAME, @root SYSNAME
AS
BEGIN
CREATE TABLE #logs
(
logname SYSNAME NULL
)
SET NOCOUNT ON
DECLARE @file SYSNAME, @cmd NVARCHAR(4000), @sbyroot SYSNAME,
@backupfile SYSNAME, @globalerror INT
DECLARE @date DATETIME, @lastfile SYSNAME, @last_lsn NUMERIC(25,0)
-- set the default STANDBY file location
SELECT @sbyroot = 'D:\DATA\MSSQL\Data\'+ @dbname + 'UNDO'
-- set the previously applied logs details from
-- the system tables for the specified database
SELECT TOP 1
@date = restore_date,
@lastfile = (SELECT TOP 1 physical_device_name FROM msdb..backupmediafamily mf WHERE mf.media_set_id = bs.media_set_id),
@last_lsn = last_lsn
FROM msdb..restorehistory rh WITH(NOLOCK)
JOIN msdb..backupset bs WITH(NOLOCK)
ON bs.backup_set_id = rh.backup_set_id
WHERE rh.destination_database_name = @dbname
ORDER BY restore_date DESC
-- parse out the filename and display it
SELECT @lastfile = RIGHT(@lastfile, LEN(@lastfile) - LEN(@root))
SELECT @lastfile AS LastFileRestored
-- create a console command to list backup files only
-- DIR :Order By Name: :No Header Information: :Archive Attribute:
SELECT @cmd = 'dir ' + @root + 'LOG_' + @dbname + '_*.bak /ON /B /AA'
-- execute the command and store results in a table
INSERT INTO #logs
EXEC master..xp_cmdshell @cmd
-- clean up the crap from the table created
-- by the DIR command, leaving only valid files
DELETE FROM #logs
WHERE logname IS NULL
OR logname <= @lastfile
OR logname LIKE '%file not found%'
-- if there are no logs to apply then quit
IF (SELECT COUNT(*) FROM #logs) = 0
BEGIN
SELECT 'Nothing To Do: No Logs To Restore'
RETURN 0
END
-- otherwise loop thru the #logs table
-- and apply each log in order
DECLARE C CURSOR LOCAL FAST_FORWARD FOR
SELECT logname FROM #logs ORDER BY logname
OPEN C
FETCH NEXT FROM C INTO @file
WHILE @@FETCH_STATUS = 0
BEGIN
-- build the full filename
SELECT @backupfile = @root + @file
-- display some information
PRINT 'Processing: ' + @file
-- now restore the log
RESTORE LOG @dbname
FROM DISK = @backupfile
WITH RESTART, NORECOVERY
-- buffer the error code
SET @globalerror = @@ERROR
-- do some error checking/handling
IF @globalerror <> 0
BEGIN
SELECT 'Error: ' + STR(@globalerror) as Error
BREAK
END
FETCH NEXT FROM C INTO @file
END
-- clean up the cursor
CLOSE C
DEALLOCATE C
-- set the database into standby
RESTORE DATABASE @dbname WITH STANDBY = @sbyroot
END
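Job #2 then simply calls the procedure with the database name and the share root. The path and database name below are examples; note the trailing backslash on @root, since the procedure builds each full filename as @root + @file:

-- Job #2 (standby): apply any outstanding log backups.
EXEC spLogShipRestorer @dbname = 'MyDB', @root = '\\standby\logs\'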
The only downside of this method is that the first log must be applied manually after a full or differential backup has been restored, or when the log chain is broken. But because all the log backups are retained, you do not need to perform a full database restore to fix it.
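The manual step looks something like the following sketch; it assumes the full backup was restored WITH NORECOVERY (or WITH STANDBY), and the filename shown is only an example:

-- Manually apply the first log after restoring a full backup,
-- leaving the database ready for subsequent automated restores.
RESTORE LOG MyDB
FROM DISK = '\\standby\logs\LOG_MyDB_20060101120000.bak'
WITH NORECOVERY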
The next post will demonstrate how to determine which log needs to be applied next.