Free Space Monitoring Tool
By Bill Wunder
One of the known problems with SQL
Server 7 and SQL Server 2000 up to and including SP3 is that you can't
always depend on auto grow to allocate more space to data
devices when necessary. I described one such horror
story where this happened to me in the previous article:
The Database that
Wouldn’t Grow.
I don’t know of anything in the
SQL Server client tool set as provided that will tell me how much free
space a database device has other than the Enterprise Manager (EM) task pad for a
database. I wanted to find a better way to keep track of when a
device was getting low on free space and I wanted to automate
the process of being informed of the need to expand a database.
In the back of my mind I even wanted to be able to expand a
device before I hit the point of critical failure once I was
sure the monitoring capabilities were solid. I built a
Distributed Management Objects (DMO) based
utility that will capture the values that EM uses, store them
in a table, and send me an email when the available free space
for any file drops below a threshold that I have defined. Once
I'm comfortable that the utility is always accurate with its
emails I may look into adding the ability to auto-grow using DMO
if that seems worthwhile. I'm actually hopeful that for the most
part my databases maintain stable sizes and I won't need
to extend enough files to justify the automation of growth
within my utility.
I decided to use a DTS ActiveX task
to loop through all the data devices in all databases to extract
the DMO info I needed and, using ADO and a stored procedure,
insert that info into a table. Once I had all the data in a table
I could easily query it, again using that same ADO
connection and a stored procedure that will send me an email if
action is required, to determine if I had a device that needed
to be expanded. I would also be building a history of the growth
trends to aid in making more accurate decisions in the future.
Later I can enhance the ActiveX script to use the DMO connection
to actually expand a device when necessary.
Examine the lines from my ActiveX
script and notice that the data collected is based on the DMO
DBFile object. You can find out everything you might want to
know about the DBFile object by looking in SQL Server Books
Online in the DMO topic of the “Building SQL Server Applications”
chapter (or just by typing “DBFile” into the BOL keyword
search).
' loop through all files in all databases in the DMO connected SQL Server
' and submit the file info to the ADO connected Log to SQL Server
For Each oDB In oSQLServer.Databases
    For Each oFileGroup In oDB.FileGroups
        For Each oDBFile In oFileGroup.DBFiles
            Database.Value = oDB.Name
            FileGroup.Value = oFileGroup.Name
            FileName.Value = oDBFile.Name
            File.Value = oDBFile.Id
            Path.Value = oDBFile.PhysicalName
            IsPrimaryFile.Value = oDBFile.PrimaryFile
            Size.Value = oDBFile.Size
            SizeInKB.Value = oDBFile.SizeInKB
            SpaceAvailableInMB.Value = oDBFile.SpaceAvailableInMB
            ' collapse now and recreate in the proc
            If oDBFile.FileGrowthType = 0 Then
                FileGrowth.Value = oDBFile.FileGrowthInKB
            Else
                FileGrowth.Value = oDBFile.FileGrowth
            End If
            FileGrowthType.Value = oDBFile.FileGrowthType
            MaximumSize.Value = oDBFile.MaximumSize
            ' add a row to the history table for this file
            adoCommand.Execute
        Next 'File
    Next 'FileGroup
Next 'Database
Notice also that I’m collecting all
the size properties of the DBFile object, so I’ll want a table
with columns to store everything I'm collecting and enable some
historical trend analysis of file growth. I'll name this
table SpaceUsedHistory.
create table dbo.SpaceUsedHistory
([Server] varchar(128)
, [Database] varchar(128)
, [FileGroup] varchar(128)
, [FileName] varchar(128)
, [File] int
, [Path] varchar(128)
, [IsPrimaryFile] bit
, [Size] int
, [SizeInKB] float
, [SpaceAvailableInMB] int
, [FileGrowth] int
, [FileGrowthInKB] float
, [FileGrowthType] int
, [MaximumSize] int
, [RecCreatedDt] datetime
constraint dft_SpaceUsedHistory__RecCreatedDt default getdate()
, [RecCreatedUser] varchar(128)
constraint dft_SpaceUsedHistory__RecCreatedUser default suser_sname()
, constraint pk_SpaceUsedHistory__RecCreatedDt__Server__Database__FileGroup__FileName
primary key ([RecCreatedDt], [Server], [Database], [FileGroup], [FileName]))
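As a quick illustration of the kind of trend analysis this table supports, a query along these lines shows how one file's size and free space change over time. The database and file names here are placeholders; substitute your own.
-- growth trend for a single file over time
-- 'MyDatabase' and 'MyDatabase_Data' are placeholder names
select [RecCreatedDt]
, [SizeInKB] / 1024.0 as [SizeInMB]
, [SpaceAvailableInMB]
from dbo.SpaceUsedHistory
where [Database] = 'MyDatabase'
and [FileName] = 'MyDatabase_Data'
order by [RecCreatedDt]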
The complete code for the tool can
be found in the two scripts referenced below. One script contains the T-SQL components of
the tool: the two stored procedures it uses and a script to create the
SQL Agent job. The other contains the ActiveX VBScript used in
the DTS package.
I’ll use the
stored procedure
SpaceUsedByFileGroupInsert to add the collected rows
to my table. So that every implementation of the tool is quick
and easy I'll always check for the existence of the table in
this stored procedure. If the table isn't found the procedure
will create it before it does the insert. And I'll use the
stored procedure
SpaceUsedByFileGroupLowFree to determine a critical
space condition and notify me when it detects that a device needs to
grow.
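To give a sense of the shape of that check, here is a minimal sketch of the kind of logic SpaceUsedByFileGroupLowFree performs. The column names come from SpaceUsedHistory above, but the threshold default, the one-hour lookback, the recipient address, and the use of xp_sendmail are illustrative assumptions rather than the exact code in the referenced script.
-- minimal sketch of a low-free-space check and notification
-- (threshold, lookback window, recipient, and xp_sendmail are assumptions)
create procedure dbo.SpaceUsedByFileGroupLowFree
  @ThresholdMB int = 500                          -- assumed free space threshold
, @Recipients varchar(128) = 'dba@mycompany.com'  -- hypothetical address
as
begin
  set nocount on
  declare @Message varchar(8000)
  -- build one message line per file that fell below the threshold
  -- in the most recent collection (assumed to have run within the hour)
  select @Message = isnull(@Message + char(10), '')
       + [Server] + '.' + [Database] + '.' + [FileName]
       + ' has ' + cast([SpaceAvailableInMB] as varchar(10)) + ' MB free'
  from dbo.SpaceUsedHistory
  where [RecCreatedDt] > dateadd(hour, -1, getdate())
    and [SpaceAvailableInMB] < @ThresholdMB
  -- only send mail when something actually needs attention
  if @Message is not null
    exec master.dbo.xp_sendmail
      @recipients = @Recipients
    , @subject = 'Low free space warning'
    , @message = @Message
end
go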
I built the DTS package to be
completely portable and saved it as a Structured Storage File on
a network share that all my SQL Servers can see. This way if I
need to modify or enhance the package I need to make the
change in only one place and it will work everywhere. This also
helps me make sure my DTS packages are somewhere they get
backed up regularly. I left logging enabled to the “(local)” SQL
Server, so wherever the package is executed I can look in
msdb.dbo.sysdtspackagelog and msdb.dbo.sysdtssteplog on that server to find the errors in
the event of a failure.
Side note: Try a query like this to look at the failure messages that
logging generates for your DTS packages. It will show them for the
last several days, most recent first. I use this query (with all the
white space removed and the complete statement on a single line) as
one of the Custom Shortcuts in my Query Analyzer.
select char(10) + '###'
, s.starttime
, p.name + ' - ' + s.stepname
, s.errordescription
from msdb.dbo.sysdtssteplog s
join msdb.dbo.sysdtspackagelog p
on s.lineagefull = p.lineagefull
where s.starttime > getdate() - 8
and s.stepexecresult <> 0
order by s.starttime desc
To create the package right click on
the "Data Transformation Services" folder found by expanding the
level just under any SQL Server. Select "New Package" from the
context menu. Add an ActiveX Script Task by selecting the
"ActiveX Script Task" icon in the Tasks section of the toolbar.
Paste the
admin dts - Free Space Monitoring Tool ActiveX Script Task
script into the code box of the task and enter "Check free space
in all files" in the "description" input box at the top of the
ActiveX Script Task dialog. Click OK. The script is now part of
the package, but we still need to add a couple of Global
Variables to the package and enable logging before we actually
save the package. Right click in the package background and
select "Package Properties" from the context menu. Navigate to
the Global Variables tab and enter the global variables "SQL
Server" and "Log to SQL Server" as shown in figure 1.
figure 1
Now navigate to the Logging tab and
set logging to always go to the "(local)" SQL Server as shown in
figure 2.
figure 2
Save the package to a share that all
your SQL Servers can read, using the package name "Monitor Free
Space" and the file name "Monitor Free Space.dts".
Feel free to change those names as
you desire, but keep in mind that the job script I have provided
expects to find the package with this name in the file with this
name. Even though I wrote the package to be able to log to a
different server than the one it collects the DMO info from, I
implemented it to always log to the local table by creating a
SQL Agent job that uses Agent tokens rather than hard coded
server names. That way I can load the exact same job on any server
and it will do the correct actions, provided I make sure the two
stored procedures and the table are in place on that server.
Here's what the job step command needs to look like when using
tokens (note that everything needs to be on one line in your job;
the SQL Agent CmdExec shell doesn't like carriage returns):
dtsrun /F"\\<share name here>\Monitor Free Space.dts" /N"Monitor Free Space" /A"SQL Server:8=[SRVR]" /A"Log to SQL Server:8=[SRVR]"
This is a good time to point out
that the script also expects you to add the correct path to the
location where you have saved this file to the job step before
you create the job.
You'll find the SQL Agent job script at the
bottom of the T-SQL script. Note that
I use a second job step for failure notification rather than
depending on the SQL Agent notification feature and SQL Mail. I
hope to describe this in more detail in a future article.
Whether to write the
SpaceUsedHistory data to a central server or to the local server
or to save it at all is a subjective preference. Some people
like to have a central spot, along the lines of the Multi Server
Administration capabilities of Enterprise Manager. I prefer to
keep historical tables and I like to keep the historical data
with the server it was collected from rather than logging to a
“master” server and then having to weed through a larger pool of
data to get my answers. Another advantage I see to using the
local server is that whenever the server I'm collecting the
data from is available, the server where I'm storing my history
is available to receive the data too.
No issues of network connectivity or the master server being
down to contend with. You decide what's right for you. If you
want a central store, all you need to do is edit the job so that
the "Log to SQL Server" Global Variable is hard coded to the location
of your central store rather than tokenized to always use the
local server.
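For example, with a hypothetical central server named CENTRALSQL, the last argument of the job step command shown earlier would change from the [SRVR] token to:
/A"Log to SQL Server:8=CENTRALSQL"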
With that you have a tool that will
monitor and track the space used and free space available for
every file in every data file group in every database on any SQL
Server. We also have the capability through the virtual table
master.dbo.sysperfinfo to monitor, track, and manage the space
used and available in log files. In a future article I’ll try to
say more about that aspect of managing space.
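In the meantime, as a small preview, a query along these lines against the Databases performance counters surfaced through sysperfinfo is one way to see log file size and usage per database; take it as a starting sketch rather than the finished tool.
-- log size and usage per database from the performance counters
-- exposed through the virtual table master.dbo.sysperfinfo
select rtrim(instance_name) as [Database]
, rtrim(counter_name) as [Counter]
, cntr_value as [Value]
from master.dbo.sysperfinfo
where object_name like '%:Databases%'
and counter_name in ('Log File(s) Size (KB)', 'Log File(s) Used Size (KB)', 'Percent Log Used')
and instance_name <> '_Total'
order by instance_name, counter_name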
Bill