Create SQL Snapshot Backup to Document Change Process

Let’s get the disclaimer out of the way right up front.  This article is not intended to be used in place of a production-grade backup solution.  It can be used in conjunction with your regular backup rotation, but if you truly care about the database and its contents, please refer to Books Online or read this Introduction to Backup and Restore Strategies in SQL Server.

If you already know all that and want a quick and dirty way to test database modifications or statements in a development environment…proceed. 

While not a developer by trade…there are circumstances which call for me to wear a different hat.  When that happens…I typically have to develop, and document, an entire change process.  No different than “a real developer”.  The steps are then submitted to a change review board, approved, scheduled, and applied to our production environment.

Going in, I have an idea how it will go.  As the process unfolds, undoubtedly something will occur, or reveal itself, that I hadn’t thought of and the documentation changes a bit…with each iteration.

Back in the day…when we had to walk both ways, uphill, in the snow, to school…we had to do a backup to disk, run through the changes, document the process, restore the database…and then do it all over again.  The backup and restore process, with a database of any size, can take a fair amount of time.

Enter “SQL Snapshot Backups” – While the dress rehearsals, today, haven’t changed much…the backup and restore process has.  Instead of it taking minutes or hours…taking a snapshot backup or reverting to a previous one takes seconds.  Since each run-through is that much quicker…you can focus on getting the process and documentation right.

The syntax below can be used to create a Snapshot of “Adventureworks”.  A couple of catches…the NAME is the logical name of the data file within the database, and the path to the FILENAME you specify must exist.

-- Create a database snapshot
CREATE DATABASE Adventureworks_snapshot
ON
(
    NAME = AdventureWorks_Data,
    FILENAME = 'D:\Backup\Adventureworks_Snapshot'
)
AS SNAPSHOT OF Adventureworks;

The Snapshot should then be visible in the Database Snapshots folder in SSMS.
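If you would rather confirm it with a query, database snapshots also show up in sys.databases with source_database_id pointing back at the source database.  A quick sketch, assuming the snapshot above was just created:

```sql
-- Snapshots are the only rows in sys.databases with a
-- non-NULL source_database_id
SELECT name, create_date
FROM sys.databases
WHERE source_database_id = DB_ID('Adventureworks');
```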

 

 

Once you have confirmed the Snapshot has been taken, you can run through your modifications, document, and verify your process.  Should you need to, use the syntax below to revert to the state the database was in when the snapshot was initially taken…and run through it again.

-- Reverting Adventureworks to Adventureworks_snapshot
USE master;
RESTORE DATABASE Adventureworks
FROM DATABASE_SNAPSHOT = 'Adventureworks_snapshot';
GO

When you are comfortable with your results and have the change process fully documented you can use the following to delete the snapshot:

 

-- To Delete AdventureWorks Snapshot
DROP DATABASE Adventureworks_snapshot;

The following is a link to an article that contains some Best Practices for Creating Database Snapshots if you would like more information.

SQL Queries That Identify Unused Indexes – Finders Keepers!

I’ve always found indexing, and performance tuning in general, extremely interesting.  Trying to understand how it all works…and works so well…has been a great part of the journey.  That whole “Is it an art or is it a science” thing…  

Well…I think I may have had some kind of spiritual awakening or something here recently because part of the mystery of tuning and maintaining indexes has been revealed to me.

Okay…maybe it wasn’t that kind of revelation…but the two queries below and the paths they led me down were very enlightening.  I am re-publishing them here, not to take credit for them, but because they, or some variation of them, will remain part of my toolkit.

—————————————————————-
This first query was posted by Jason Massie - a.k.a. StatisticsIO
—————————————————————-

SELECT objectname = OBJECT_NAME(s.OBJECT_ID)
    , indexname = i.name
    , i.index_id
    , reads = user_seeks + user_scans + user_lookups
    , writes = user_updates
    , p.rows
FROM sys.dm_db_index_usage_stats s
JOIN sys.indexes i
    ON i.index_id = s.index_id AND s.OBJECT_ID = i.OBJECT_ID
JOIN sys.partitions p
    ON p.index_id = s.index_id AND s.OBJECT_ID = p.OBJECT_ID
WHERE OBJECTPROPERTY(s.OBJECT_ID, 'IsUserTable') = 1
    AND s.database_id = DB_ID()
    AND i.type_desc = 'nonclustered'
    AND i.is_primary_key = 0
    AND i.is_unique_constraint = 0
    AND p.rows > 10000
ORDER BY reads, p.rows DESC;

 

It sums all the reads that have occurred on an index and then limits the result set to nonclustered indexes on tables with row counts greater than 10,000.  You can quickly see which tables are heavily written to…but not read from.

 

—————————————————————-

The query below was posted by Michelle Ufford - a.k.a. SQLFool
—————————————————————-

DECLARE @dbid INT
    , @dbName VARCHAR(100);

SELECT @dbid = DB_ID()
    , @dbName = DB_NAME();

WITH partitionCTE (OBJECT_ID, index_id, row_count, partition_count)
AS
(
    SELECT [OBJECT_ID]
        , index_id
        , SUM([ROWS]) AS 'row_count'
        , COUNT(partition_id) AS 'partition_count'
    FROM sys.partitions
    GROUP BY [OBJECT_ID]
        , index_id
)

SELECT OBJECT_NAME(i.[OBJECT_ID]) AS objectName
        , i.name
        , CASE
            WHEN i.is_unique = 1
                THEN 'UNIQUE '
            ELSE ''
          END + i.type_desc AS 'indexType'
        , ddius.user_seeks
        , ddius.user_scans
        , ddius.user_lookups
        , ddius.user_updates
        , cte.row_count
        , CASE WHEN cte.partition_count > 1 THEN 'yes'
            ELSE 'no' END AS 'partitioned?'
        , CASE
            WHEN i.type = 2 AND i.is_unique = 0
                THEN 'Drop Index ' + i.name
                    + ' On ' + @dbName
                    + '.dbo.' + OBJECT_NAME(ddius.[OBJECT_ID]) + ';'
            WHEN i.type = 2 AND i.is_unique = 1
                THEN 'Alter Table ' + @dbName
                    + '.dbo.' + OBJECT_NAME(ddius.[OBJECT_ID])
                    + ' Drop Constraint ' + i.name + ';'
            ELSE ''
          END AS 'SQL_DropStatement'
FROM sys.indexes AS i
INNER JOIN sys.dm_db_index_usage_stats ddius
    ON i.OBJECT_ID = ddius.OBJECT_ID
        AND i.index_id = ddius.index_id
INNER JOIN partitionCTE AS cte
    ON i.OBJECT_ID = cte.OBJECT_ID
        AND i.index_id = cte.index_id
WHERE ddius.database_id = @dbid
ORDER BY
    (ddius.user_seeks + ddius.user_scans + ddius.user_lookups) ASC
    , ddius.user_updates DESC;

 

Provides a more granular view.  Not only do you see if an index is being used…you see how it is being used.  Another cool feature is that the Drop Index and Drop Constraint statements are generated for you.

 

Both of the queries reference the sys.dm_db_index_usage_stats DMV, return similar results, and work great! 

 

Why use both?  You may or may not…but it’s not a bad idea to get a second opinion.  In addition, understanding the variations of how the DMV, and its results, can be used is what unlocks some of its mysteries. 
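If you want to form a third opinion of your own, it only takes a few lines to look at the DMV directly.  A bare-bones sketch…and keep in mind the counters reset whenever the instance restarts:

```sql
-- Raw per-index usage counters for the current database
SELECT OBJECT_NAME([OBJECT_ID]) AS objectName
    , index_id
    , user_seeks
    , user_scans
    , user_lookups
    , user_updates
    , last_user_seek
    , last_user_update
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID();
```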

 

Jason’s query cuts to the chase.  Are the indexes, on tables of substantial size, being utilized?  You can adjust the thresholds, or add additional ones, to suit your environment.  This would be a great query to put into a Reporting Services report to keep an eye on some of your heavy hitters.

 

Michelle’s query gives you that extra detail for a more in-depth look.  The fact that the DDL statements are generated is great for documenting any changes you make…in this day and age that is critical.  I suppose you could take it a step further and generate the Create and Alter statements that would reverse those changes too.  Management likes peace of mind…
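Taking it that step further is mostly a matter of reading the key columns back out of the catalog views.  The sketch below is only a rough starting point, not a complete scripting solution…it handles key columns only and ignores included columns, filegroups, and index options:

```sql
-- Rebuild a bare CREATE INDEX statement for each nonclustered index
SELECT 'CREATE ' + CASE WHEN i.is_unique = 1 THEN 'UNIQUE ' ELSE '' END
     + 'NONCLUSTERED INDEX ' + i.name
     + ' ON ' + OBJECT_NAME(i.object_id) + ' ('
     + STUFF((SELECT ', ' + c.name
              FROM sys.index_columns ic
              JOIN sys.columns c
                  ON c.object_id = ic.object_id
                  AND c.column_id = ic.column_id
              WHERE ic.object_id = i.object_id
                  AND ic.index_id = i.index_id
                  AND ic.is_included_column = 0
              ORDER BY ic.key_ordinal
              FOR XML PATH('')), 1, 2, '') + ');'
FROM sys.indexes AS i
WHERE i.type = 2;  -- nonclustered only
```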

 

Now…if you haven’t subscribed to, and read, Jason Massie’s and Michelle Ufford’s blogs…go do it…the path to enlightenment is much shorter that way.

 

Copy SQL Server Client Component Configurations

I was recently given a new workstation.  Hooray! 

Since that happened, a number of the tools I use have to be re-installed and re-configured.  Booooo!

The most critical of them all, of course, is the SQL Server 2005/2008 Client Components.  If you work in an environment with multiple Development, UAT, and Production SQL Servers this can be a drag.  Fortunately, with SQL 2005 or SQL 2008, some of it is not too painful.  Being able to migrate settings quickly from one installation to the next makes it easier to share settings and tools with your fellow DBAs, or to copy them to machines you commonly work from, giving you a similar landscape everywhere.  There are several items that I like being able to bring forward from one machine to the next:

SQL Server Groups and Registrations

On the source machine, within SSMS, press Ctrl+Alt+G to view the Registered Servers. 

 Right click on “Database Engine” (Or any preferred sub-group) and select “Export…”

 

Select the “Database Engine” to export all registered Servers or select an individual Server Group.  Provide the location and file name the export should be saved to and click OK.

Warning:  If these are production server registrations, either leave the “Do not include user name and passwords…” option checked or make sure the export is kept in a secure location.

Copy the export file to your target machine(s) and “import” the Server Registrations in the same manner.

NOTE:  The Server Registration exports ARE NOT interchangeable between SQL 2005 and SQL 2008.

Template Explorer Customizations – I have used Template Explorer quite extensively to keep my most commonly used scripts at the ready.  I also maintain a collection of “troubleshooting” scripts that can be referenced in an emergency.  Knowing where to find the scripts you need, under pressure, helps you remain calm when others may be excited.

If you have created Templates, and saved them, their default location will be:

C:\Documents and Settings\YourUserName\Application Data\Microsoft\Microsoft SQL Server\90\Tools\Shell\Templates\Sql

Simply copy the folders and templates from that location on your source machine to your target.

Recommendation:  Create one or two folders and then create sub-folders within those to keep your personal scripts organized and separate from those provided.  It makes migrating or sharing them that much easier and reduces the likelihood that any will be missed.  It’s cleaner too if you’re moving from SQL 2005 to SQL 2008.  I also have a “junk” folder (like a drawer) for any scripts I’m working on that I’m not worried about keeping yet.

Solution Explorer - Ditto with Projects you may have created:

If you have created Projects, and saved them, their default location will be:

C:\Documents and Settings\YourUsername\My Documents\SQL Server Management Studio\Projects

Simply copy the folders and Solutions from that location on your source machine to your target.

SQL Server Configuration Aliases - Just like the SQL Server Groups and Registrations, having to recreate the 15 different aliases that are defined within SQL Server Configuration Manager can be tedious.  The good news here is that they can be quickly exported and imported.  A bonus is that configuration aliases ARE interchangeable between SQL 2005 and SQL 2008.

Warning:  The steps below involve making changes to your registry.  Proceed with caution…at your own risk :-)

Launch “Regedt32” and browse to, and highlight, the following location: 

“HKey_Local_Machine\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo”

From the menu select “File->Export”.  Provide the path and file name on your source machine.  Copy that file to your target machine and double-click on it.  It will ask you if you want to import it…say yes!

If you have other suggestions or features you migrate please let me know.

Using Select to Return Dynamic SQL Statements

I’ve always thought this was pretty slick…

We’ve all gone into Query Analyzer or SSMS and run select, insert, or update statements.  You write the statement, hit execute, and it returns the data or performs the action. 

What we’re doing here is no different except that we are actually going to use the SELECT statement to build the statements that we eventually want to capture and run.

The SELECT statement below simply returns the string you see between the single quotes:

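The screenshot that originally lived here is gone, so here is a minimal reconstruction of that first statement.  The table name is my assumption…any string literal works:

```sql
-- Returns nothing but the literal string between the single quotes
SELECT 'SELECT * FROM HumanResources.Employee WHERE EmployeeID = ';
```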

Let’s take it a bit further.  The next example shows that we have modified the string a bit by adding a WHERE clause to it.  We’ve also appended a variable (EmployeeID), a FROM clause, and a WHERE clause to the end of the statement.  What we end up with are three individual select statements, with the explicit EmployeeID dynamically tacked onto the end.

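Again, the original screenshot is lost; a reconstruction along these lines (AdventureWorks table and column names assumed) produces one runnable select statement per row:

```sql
-- Appends each EmployeeID to the string...one runnable SELECT per row
SELECT 'SELECT * FROM HumanResources.Employee WHERE EmployeeID = '
       + CAST(EmployeeID AS VARCHAR(10))
FROM HumanResources.Employee
WHERE EmployeeID <= 3;  -- limit the output to three statements
```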

You can then copy those statements into the SSMS window and execute them.

Admittedly, those examples are probably only useful as examples, but they illustrate a point.  We can use the SELECT statement to build, and more importantly SEE, the statements that we’re going to run.

Let’s make it an UPDATE statement: 

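With the screenshot missing, here is a hedged reconstruction…the Title value and column names are my assumptions about the AdventureWorks schema:

```sql
-- Generates one UPDATE per DBA, each adding ten hours of vacation
SELECT 'UPDATE HumanResources.Employee'
       + ' SET VacationHours = VacationHours + 10'
       + ' WHERE EmployeeID = ' + CAST(EmployeeID AS VARCHAR(10)) + ';'
FROM HumanResources.Employee
WHERE Title = 'Database Administrator';
```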

Now, looking at the two resulting update statements, we can clearly see that we are about to give our two DBAs ten hours more vacation than anyone else in The Firm.  They clearly deserve it.  Simply run those two statements and we’re off to Hawaii :-)

Sometimes being a DBA means covering your tracks.  Actually, that’s one of the prime directives.  Covering your tracks can mean taking backups, verifying proper permissions are in place, making sure maintenance jobs are running, etc.  In this case, covering your tracks can mean “Knowing, and being able to document, the statements and parameters that are about to be executed”.  If I go to the trouble to build the statement…I will typically save it, and the results, for reference.  It may come in handy in the future too.

This technique can also be useful when troubleshooting failed insert or update statements.  If you’re not dealing with too many rows you can quickly scan them, or run them in batches, to narrow in on the culprit.

The examples above use the “Adventureworks” database that Microsoft provides for practicing and tutorials.  It’s not installed by default (like Northwind and Pubs used to be for SQL 2000), so if you have SQL 2005 installed and want the AdventureWorks database to practice in, you can download it at Codeplex.

Contributing Through Others…

We all know that with technology, and SQL Server in particular, it is a never-ending educational journey.  Along the way I have encountered, solved, and forgotten how I solved, more issues than I care to admit.  That forgotten part may only apply to me :-) but it is part of why I started this blog. 

How many times have you wished you would have documented that?

One of the greatest assets a DBA can have is their toolkit.  What tools are in it may vary widely with experience, knowledge, and preference, but it is up to each of us to keep them sharp, organized, and to continue to look for new ones.  Going forward, and reaching as far back as I can, that’s what I’m doing here.

Build YOUR toolkit!

The title of this post is “Contributing Through Others…”.  That phrase comes from a talent development model that we use at our firm.  It deals with “Leading and Developing Others”, “Mentoring” and being an “idea leader”.  While that’s one of the loooooooooong term goals of this blog…it’s not so much what this post is about. 

Today, this is more of a quick tribute to those that have mentored and contributed to and through me…

Re-Associate An Orphaned SQL User With SQL Login

If you ever have the need to move a SQL Server database from one server or instance to the next, you may come across orphaned SQL Users.  e.g.  The User is present in the database, but it is no longer associated with the SQL Login on the Server.

When that happens you have a couple of options.  You can delete the database user, re-add them, re-grant the appropriate role or rights, and be on your way.  It gets a bit more complicated, though, when the user owns schema(s) in the database.  If you try to delete and re-add them you’ll receive an error similar to the following:

“Drop failed for User ‘sqlusername’.  An exception occurred while executing a Transact-SQL statement or batch.  The database principal owns a schema in the database, and cannot be dropped.”

In order to get around that error, you would have to temporarily re-assign the owner of the schema(s) to another user in the DB before being able to delete the database User. 

In addition to the schemas the user may own, they may have been added to roles within the database, they may even have been granted explicit permissions that you would then have to manually re-produce (If you know what they were).  If that’s the case, recreating those rights can be a meticulous task. 

A much better alternative is to execute a statement very similar to the one shown below:

EXEC sp_change_users_login 'UPDATE_ONE', 'sqlusername', 'sqlloginname';

It will re-associate the SQL User with the SQL Login, and the Schema(s), Roles, and Permissions are retained.
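To find the orphans in the first place, the same procedure can be run in report mode.  And, for what it’s worth, from SQL 2005 SP2 onward, ALTER USER…WITH LOGIN is the supported way to do the same re-mapping:

```sql
-- List orphaned users in the current database
EXEC sp_change_users_login 'Report';

-- The ALTER USER equivalent of the UPDATE_ONE call above
ALTER USER sqlusername WITH LOGIN = sqlloginname;
```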

Upgrade Your Applications While Downgrading Your Stress

I sat in on the Pain-of-the-Week webcast with Brent Ozar and Michael Lato today.  Thanks to both for their time and expertise!

This week’s feature was titled Upgrade Your Applications While Downgrading Your Stress and covered:

  • Rolling back failed upgrades
  • Migrating data when the schemas are incompatible
  • Managing the upgrade cycle when the applications/databases need to move in lock-step

Michael led off sharing some of his insights and approaches using native SQL Server tools.  I really appreciated the different categories that he broke the types of upgrades into and the steps he takes to mitigate risks.  I’ve taken part in many upgrades, and getting a high-level view from someone else will only improve my own approach to the next.

Brent took over from there and quickly showed a few of the cool features that SQL Litespeed has built into it, like recovering individual objects from any recent backup or transaction log.  He also showed how you can view all changes, schema or data, see who made them and when, and then generate the code that would reverse those changes.  It seemed to be pretty intuitive too.  If you haven’t seen the tool and are looking for something like it, it’s worth the watch!